Unnamed: 0
int64
0
832k
id
float64
2.49B
32.1B
type
stringclasses
1 value
created_at
stringlengths
19
19
repo
stringlengths
5
112
repo_url
stringlengths
34
141
action
stringclasses
3 values
title
stringlengths
1
757
labels
stringlengths
4
664
body
stringlengths
3
261k
index
stringclasses
10 values
text_combine
stringlengths
96
261k
label
stringclasses
2 values
text
stringlengths
96
232k
binary_label
int64
0
1
758,022
26,540,641,917
IssuesEvent
2023-01-19 18:58:14
linkerd/linkerd2
https://api.github.com/repos/linkerd/linkerd2
closed
Dropped request with HTTP 400 and "GET request with illegal URI" in logs
area/proxy priority/P1 bug needs/repro
### What is the issue? Application is configured with linkerd-proxy. And currently connects to other microservice which runs without linkerd-proxy (yet). So mTLS encryption is not involved. When the app starts it makes HTTP connections. The last three are: 1. HTTP GET http://2.2.2.2:9200/ - with response HTTP 200 in app log 2. HTTP POST http://2.2.2.2:9200/products/_open?xxxx - with response HTTP 200 in app log 3. HTTP GET http://2.2.2.2:9200/products - with response HTTP 400 Bad request in app log All in one TCP stream. Taking tcpdump shows, that requests 1,2 are leaving the pod but request 3 is not. So no HTTP GET http://1.2.3.4:9200/products in tcpdump capture. This traffic works without any problems without linkerd-proxy injected. From the logs (below) it looks like this `linkerd_proxy_http::h1: GET request with illegal URI: products` is a possible problem. But why? ### How can it be reproduced? Inject linkerd-proxy into the microservice. ### Logs, error output, etc Relevant debug logs from linkerd-proxy with privacy changes; ``` [ 323.241735s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_service_profiles::client: Resolved profile [ 323.241754s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_app_outbound::switch_logical: Profile describes a logical service [ 323.241759s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_app_outbound::http::detect: Attempting HTTP protocol detection [ 323.275412s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_detect: DetectResult protocol=Some(Http1) elapsed=33.62501ms [ 323.275442s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Creating HTTP service version=Http1 [ 323.275462s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_cache: Caching new value key=Logical { protocol: Http1, profile: .., logical_addr: LogicalAddr(svcname.dev.svc.cluster.local:9200) } [ 323.275508s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Handling as HTTP version=Http1 [ 323.275575s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_service_profiles::http::service: Updating HTTP routes routes=0 [ 323.275602s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_service_profiles::split: Updating targets=[Target { addr: svcname.dev.svc.cluster.local:9200, weight: 1 }] [ 323.275618s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_api_resolve::resolve: Resolving dst=svcname.dev.svc.cluster.local:9200 context={"ns":"dev", "nodeName":"aks-microservice-123456-vmss000002"} [ 323.275653s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_stack::failfast: HTTP Logical service has become unavailable [ 323.275659s] DEBUG ThreadId(01) evict{key=Logical { protocol: Http1, profile: .., logical_addr: LogicalAddr(svcname.dev.svc.cluster.local:9200) }}: linkerd_cache: Awaiting idleness [ 323.278074s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_stack::failfast: HTTP Balancer service has become unavailable [ 323.278108s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_api_resolve::resolve: Add endpoints=2 [ 323.278122s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_discover::from_resolve: Changed change=Insert(2.2.2.3:9200, Endpoint { addr: Remote(ServerAddr(2.2.2.3:9200)), tls: None(NotProvidedByServiceDiscovery), metadata: Metadata { labels: {"namespace": "dev", "pod": "podname-1", "service": "podname", "serviceaccount": "default", "statefulset": "podname"}, protocol_hint: Unknown, opaque_transport_port: None, identity: None, authority_override: None }, logical_addr: Some(LogicalAddr(svcname.dev.svc.cluster.local:9200)), protocol: Http1, opaque_protocol: false }) [ 323.278154s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_discover::from_resolve: Changed change=Insert(2.2.2.4:9200, Endpoint { addr: Remote(ServerAddr(2.2.2.4:9200)), tls: None(NotProvidedByServiceDiscovery), metadata: Metadata { labels: {"namespace": "dev", "pod": "podname-0", "service": "podname", "serviceaccount": "default", "statefulset": "podname"}, protocol_hint: Unknown, opaque_transport_port: None, identity: None, authority_override: None }, logical_addr: Some(LogicalAddr(svcname.dev.svc.cluster.local:9200)), protocol: Http1, opaque_protocol: false }) [ 323.278182s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_reconnect: Disconnected backoff=false [ 323.278195s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_reconnect: Creating service backoff=false [ 323.278201s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_proxy_http::client: Building HTTP client settings=Http1 [ 323.278208s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_reconnect: Connected [ 323.278218s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_reconnect: Disconnected backoff=false [ 323.278226s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_reconnect: Creating service backoff=false [ 323.278229s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_proxy_http::client: Building HTTP client settings=Http1 [ 323.278233s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_reconnect: Connected [ 323.278262s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_http::client: method=GET uri=http://svcname:9200/ version=HTTP/1.1 [ 323.278275s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_http::client: headers={"qwerty", "content-length": "0", "host": "podname:9200", "user-agent": "asdf", "l5d-dst-canonical": "svcname.dev.svc.cluster.local:9200"} [ 323.278289s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_http::h1: Caching new client use_absolute_form=false [ 323.278327s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_tls::client: Peer does not support TLS reason=not_provided_by_service_discovery [ 323.278340s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_transport::connect: Connecting server.addr=2.2.2.3:9200 [ 323.279156s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_transport::connect: Connected local.addr=1.1.1.1:37482 keepalive=Some(10s) [ 323.279173s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_transport_metrics::client: client connection open [ 323.320453s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_http::client: method=POST uri=http://svcname:9200/products/_open?xxx version=HTTP/1.1 [ 323.320481s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_http::client: headers={"qwerty", "content-length": "0", "host": "podname:9200", "user-agent": "asdf", "l5d-dst-canonical": "svcname.dev.svc.cluster.local:9200"} [ 323.320493s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_http::h1: Caching new client use_absolute_form=false [ 323.320544s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_tls::client: Peer does not support TLS reason=not_provided_by_service_discovery [ 323.320553s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_transport::connect: Connecting server.addr=2.2.2.4:9200 [ 323.322329s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_transport::connect: Connected local.addr=1.1.1.1:60988 keepalive=Some(10s) [ 323.322345s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_transport_metrics::client: client connection open [ 323.330389s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: The client is shutting down the connection res=Ok(()) [ 323.330436s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}: linkerd_app_core::serve: Connection closed [ 323.330467s] DEBUG ThreadId(01) evict{key=Accept { orig_dst: OrigDstAddr(2.2.2.2:9200), protocol: () }}: linkerd_cache: Awaiting idleness [ 323.337492s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}: linkerd_detect: DetectResult protocol=Some(Http1) elapsed=2.483888ms [ 323.337527s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Creating HTTP service version=Http1 [ 323.337556s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Handling as HTTP version=Http1 [ 323.337582s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::h1: GET request with illegal URI: products [ 323.343875s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: The client is shutting down the connection res=Ok(()) [ 323.343915s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}: linkerd_app_core::serve: Connection closed ``` ### output of `linkerd check -o short` ``` Status check results are √ ``` ### Environment - Kubernetes 1.23.8 - AKS - Ubuntu 18.04.6 LTS 5.4.0-1089-azure containerd://1.5.11+azure-2 - Linkerd 2.12.1 ### Possible solution As you can see in the logs: - first two HTTP request are in one tcp stream (source port 40514) - third problematic HTTP request is in new tcp stream (source port 40520) - `linkerd_proxy_http::h1: GET request with illegal URI: products` as possible problem? ### Additional context _No response_ ### Would you like to work on fixing this bug? _No response_
1.0
Dropped request with HTTP 400 and "GET request with illegal URI" in logs - ### What is the issue? Application is configured with linkerd-proxy. And currently connects to other microservice which runs without linkerd-proxy (yet). So mTLS encryption is not involved. When the app starts it makes HTTP connections. The last three are: 1. HTTP GET http://2.2.2.2:9200/ - with response HTTP 200 in app log 2. HTTP POST http://2.2.2.2:9200/products/_open?xxxx - with response HTTP 200 in app log 3. HTTP GET http://2.2.2.2:9200/products - with response HTTP 400 Bad request in app log All in one TCP stream. Taking tcpdump shows, that requests 1,2 are leaving the pod but request 3 is not. So no HTTP GET http://1.2.3.4:9200/products in tcpdump capture. This traffic works without any problems without linkerd-proxy injected. From the logs (below) it looks like this `linkerd_proxy_http::h1: GET request with illegal URI: products` is a possible problem. But why? ### How can it be reproduced? Inject linkerd-proxy into the microservice. ### Logs, error output, etc Relevant debug logs from linkerd-proxy with privacy changes; ``` [ 323.241735s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_service_profiles::client: Resolved profile [ 323.241754s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_app_outbound::switch_logical: Profile describes a logical service [ 323.241759s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_app_outbound::http::detect: Attempting HTTP protocol detection [ 323.275412s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}: linkerd_detect: DetectResult protocol=Some(Http1) elapsed=33.62501ms [ 323.275442s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Creating HTTP service version=Http1 [ 323.275462s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_cache: Caching new value key=Logical { protocol: Http1, profile: .., logical_addr: LogicalAddr(svcname.dev.svc.cluster.local:9200) } [ 323.275508s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Handling as HTTP version=Http1 [ 323.275575s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_service_profiles::http::service: Updating HTTP routes routes=0 [ 323.275602s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_service_profiles::split: Updating targets=[Target { addr: svcname.dev.svc.cluster.local:9200, weight: 1 }] [ 323.275618s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_api_resolve::resolve: Resolving dst=svcname.dev.svc.cluster.local:9200 context={"ns":"dev", "nodeName":"aks-microservice-123456-vmss000002"} [ 323.275653s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}: linkerd_stack::failfast: HTTP Logical service has become unavailable [ 323.275659s] DEBUG ThreadId(01) evict{key=Logical { protocol: Http1, profile: .., logical_addr: LogicalAddr(svcname.dev.svc.cluster.local:9200) }}: linkerd_cache: Awaiting idleness [ 323.278074s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_stack::failfast: HTTP Balancer service has become unavailable [ 323.278108s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_api_resolve::resolve: Add endpoints=2 [ 323.278122s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_discover::from_resolve: Changed change=Insert(2.2.2.3:9200, Endpoint { addr: Remote(ServerAddr(2.2.2.3:9200)), tls: None(NotProvidedByServiceDiscovery), metadata: Metadata { labels: {"namespace": "dev", "pod": "podname-1", "service": "podname", "serviceaccount": "default", "statefulset": "podname"}, protocol_hint: Unknown, opaque_transport_port: None, identity: None, authority_override: None }, logical_addr: Some(LogicalAddr(svcname.dev.svc.cluster.local:9200)), protocol: Http1, opaque_protocol: false }) [ 323.278154s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}: linkerd_proxy_discover::from_resolve: Changed change=Insert(2.2.2.4:9200, Endpoint { addr: Remote(ServerAddr(2.2.2.4:9200)), tls: None(NotProvidedByServiceDiscovery), metadata: Metadata { labels: {"namespace": "dev", "pod": "podname-0", "service": "podname", "serviceaccount": "default", "statefulset": "podname"}, protocol_hint: Unknown, opaque_transport_port: None, identity: None, authority_override: None }, logical_addr: Some(LogicalAddr(svcname.dev.svc.cluster.local:9200)), protocol: Http1, opaque_protocol: false }) [ 323.278182s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_reconnect: Disconnected backoff=false [ 323.278195s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_reconnect: Creating service backoff=false [ 323.278201s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_proxy_http::client: Building HTTP client settings=Http1 [ 323.278208s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}: linkerd_reconnect: Connected [ 323.278218s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_reconnect: Disconnected backoff=false [ 323.278226s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_reconnect: Creating service backoff=false [ 323.278229s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_proxy_http::client: Building HTTP client settings=Http1 [ 323.278233s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}: linkerd_reconnect: Connected [ 323.278262s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_http::client: method=GET uri=http://svcname:9200/ version=HTTP/1.1 [ 323.278275s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_http::client: headers={"qwerty", "content-length": "0", "host": "podname:9200", "user-agent": "asdf", "l5d-dst-canonical": "svcname.dev.svc.cluster.local:9200"} [ 323.278289s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_http::h1: Caching new client use_absolute_form=false [ 323.278327s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_tls::client: Peer does not support TLS reason=not_provided_by_service_discovery [ 323.278340s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_transport::connect: Connecting server.addr=2.2.2.3:9200 [ 323.279156s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_proxy_transport::connect: Connected local.addr=1.1.1.1:37482 keepalive=Some(10s) [ 323.279173s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.3:9200}:http1: linkerd_transport_metrics::client: client connection open [ 323.320453s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_http::client: method=POST uri=http://svcname:9200/products/_open?xxx version=HTTP/1.1 [ 323.320481s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_http::client: headers={"qwerty", "content-length": "0", "host": "podname:9200", "user-agent": "asdf", "l5d-dst-canonical": "svcname.dev.svc.cluster.local:9200"} [ 323.320493s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_http::h1: Caching new client use_absolute_form=false [ 323.320544s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_tls::client: Peer does not support TLS reason=not_provided_by_service_discovery [ 323.320553s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_transport::connect: Connecting server.addr=2.2.2.4:9200 [ 323.322329s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_proxy_transport::connect: Connected local.addr=1.1.1.1:60988 keepalive=Some(10s) [ 323.322345s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}:logical{dst=svcname.dev.svc.cluster.local:9200}:concrete{addr=svcname.dev.svc.cluster.local:9200}:endpoint{server.addr=2.2.2.4:9200}:http1: linkerd_transport_metrics::client: client connection open [ 323.330389s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: The client is shutting down the connection res=Ok(()) [ 323.330436s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40514}: linkerd_app_core::serve: Connection closed [ 323.330467s] DEBUG ThreadId(01) evict{key=Accept { orig_dst: OrigDstAddr(2.2.2.2:9200), protocol: () }}: linkerd_cache: Awaiting idleness [ 323.337492s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}: linkerd_detect: DetectResult protocol=Some(Http1) elapsed=2.483888ms [ 323.337527s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Creating HTTP service version=Http1 [ 323.337556s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: Handling as HTTP version=Http1 [ 323.337582s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::h1: GET request with illegal URI: products [ 323.343875s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}:proxy{addr=2.2.2.2:9200}:http{v=1.x}: linkerd_proxy_http::server: The client is shutting down the connection res=Ok(()) [ 323.343915s] DEBUG ThreadId(01) outbound:accept{client.addr=1.1.1.1:40520}: linkerd_app_core::serve: Connection closed ``` ### output of `linkerd check -o short` ``` Status check results are √ ``` ### Environment - Kubernetes 1.23.8 - AKS - Ubuntu 18.04.6 LTS 5.4.0-1089-azure containerd://1.5.11+azure-2 - Linkerd 2.12.1 ### Possible solution As you can see in the logs: - first two HTTP request are in one tcp stream (source port 40514) - third problematic HTTP request is in new tcp stream (source port 40520) - `linkerd_proxy_http::h1: GET request with illegal URI: products` as possible problem? ### Additional context _No response_ ### Would you like to work on fixing this bug? _No response_
non_defect
dropped request with http and get request with illegal uri in logs what is the issue application is configured with linkerd proxy and currently connects to other microservice which runs without linkerd proxy yet so mtls encryption is not involved when the app starts it makes http connections the last three are http get with response http in app log http post with response http in app log http get with response http bad request in app log all in one tcp stream taking tcpdump shows that requests are leaving the pod but request is not so no http get in tcpdump capture this traffic works without any problems without linkerd proxy injected from the logs below it looks like this linkerd proxy http get request with illegal uri products is a possible problem but why how can it be reproduced inject linkerd proxy into the microservice logs error output etc relevant debug logs from linkerd proxy with privacy changes debug threadid outbound accept client addr proxy addr linkerd service profiles client resolved profile debug threadid outbound accept client addr proxy addr linkerd app outbound switch logical profile describes a logical service debug threadid outbound accept client addr proxy addr linkerd app outbound http detect attempting http protocol detection debug threadid outbound accept client addr proxy addr linkerd detect detectresult protocol some elapsed debug threadid outbound accept client addr proxy addr http v x linkerd proxy http server creating http service version debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local linkerd cache caching new value key logical protocol profile logical addr logicaladdr svcname dev svc cluster local debug threadid outbound accept client addr proxy addr http v x linkerd proxy http server handling as http version debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local linkerd service profiles http service updating http routes routes debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local linkerd service profiles split updating targets debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local linkerd proxy api resolve resolve resolving dst svcname dev svc cluster local context ns dev nodename aks microservice debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local linkerd stack failfast http logical service has become unavailable debug threadid evict key logical protocol profile logical addr logicaladdr svcname dev svc cluster local linkerd cache awaiting idleness debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local linkerd stack failfast http balancer service has become unavailable debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local linkerd proxy api resolve resolve add endpoints debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local linkerd proxy discover from resolve changed change insert endpoint addr remote serveraddr tls none notprovidedbyservicediscovery metadata metadata labels namespace dev pod podname service podname serviceaccount default statefulset podname protocol hint unknown opaque transport port none identity none authority override none logical addr some logicaladdr svcname dev svc cluster local protocol opaque protocol false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local linkerd proxy discover from resolve changed change insert endpoint addr remote serveraddr tls none notprovidedbyservicediscovery metadata metadata labels namespace dev pod podname service podname serviceaccount default statefulset podname protocol hint unknown opaque transport port none identity none authority override none logical addr some logicaladdr svcname dev svc cluster local protocol opaque protocol false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd reconnect disconnected backoff false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd reconnect creating service backoff false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http client building http client settings debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd reconnect connected debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd reconnect disconnected backoff false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd reconnect creating service backoff false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http client building http client settings debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd reconnect connected debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http client method get uri version http debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http client headers qwerty content length host podname user agent asdf dst canonical svcname dev svc cluster local debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http caching new client use absolute form false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd tls client peer does not support tls reason not provided by service discovery debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy transport connect connecting server addr debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy transport connect connected local addr keepalive some debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd transport metrics client client connection open debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http client method post uri version http debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http client headers qwerty content length host podname user agent asdf dst canonical svcname dev svc cluster local debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy http caching new client use absolute form false debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd tls client peer does not support tls reason not provided by service discovery debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy transport connect connecting server addr debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd proxy transport connect connected local addr keepalive some debug threadid outbound accept client addr proxy addr http v x logical dst svcname dev svc cluster local concrete addr svcname dev svc cluster local endpoint server addr linkerd transport metrics client client connection open debug threadid outbound accept client addr proxy addr http v x linkerd proxy http server the client is shutting down the connection res ok debug threadid outbound accept client addr linkerd app core serve connection closed debug threadid evict key accept orig dst origdstaddr protocol linkerd cache awaiting idleness debug threadid outbound accept client addr proxy addr linkerd detect detectresult protocol some elapsed debug threadid outbound accept client addr proxy addr http v x linkerd proxy http server creating http service version debug threadid outbound accept client addr proxy addr http v x linkerd proxy http server handling as http version debug threadid outbound accept client addr proxy addr http v x linkerd proxy http get request with illegal uri products debug threadid outbound accept client addr proxy addr http v x linkerd proxy http server the client is shutting down the connection res ok debug threadid outbound accept client addr linkerd app core serve connection closed output of linkerd check o short status check results are √ environment kubernetes aks ubuntu lts azure containerd azure linkerd possible solution as you can see in the logs first two http request are in one tcp stream source port third problematic http request is in new tcp stream source port linkerd proxy http get request with illegal uri products as possible problem additional context no response would you like to work on fixing this bug no response
0
61,738
15,057,869,140
IssuesEvent
2021-02-03 22:23:51
grafana/grafana
https://api.github.com/repos/grafana/grafana
closed
Grafana MSI Installer does not upgrade cleanly
type/build-packaging
**What happened**: After installing and UPDATING grafana (OSS) on windows several times using the MSI installer, I noticed it does not cleanly upgrade the old version but rather installs each version seperately in windows although its overriding the installation directory. Also the `INSTALLDIR` (*Speicherort*) information is missing from the MSI installation (column is empty). ![grafik](https://user-images.githubusercontent.com/33002073/87818523-fcb48900-c86a-11ea-95b8-3b9d43232400.png) > > > **⚠ Uninstalling an "older" version via windows software panel will UNINSTALL THE LATEST VERSION** > **What you expected to happen**: The MSI installer should have a unique product GUID (?) and be able to detect its running an upgrade. **How to reproduce it (as minimally and precisely as possible)**: - Install `grafana-7.0.6.windows-amd64.msi` all default settings - Install `grafana-7.1.0.windows-amd64.msi` all default settings Now you got two entries in apps list, easily reproducible in e.g. Windows Sandbox: ![grafik](https://user-images.githubusercontent.com/33002073/87819102-012d7180-c86c-11ea-82a4-8e00e76d2210.png) **Environment**: - Grafana version: 6.6.2 till 7.1.0 ongoing - OS Grafana is installed on: Windows 10 x64 2004
1.0
Grafana MSI Installer does not upgrade cleanly - **What happened**: After installing and UPDATING grafana (OSS) on windows several times using the MSI installer, I noticed it does not cleanly upgrade the old version but rather installs each version seperately in windows although its overriding the installation directory. Also the `INSTALLDIR` (*Speicherort*) information is missing from the MSI installation (column is empty). ![grafik](https://user-images.githubusercontent.com/33002073/87818523-fcb48900-c86a-11ea-95b8-3b9d43232400.png) > > > **⚠ Uninstalling an "older" version via windows software panel will UNINSTALL THE LATEST VERSION** > **What you expected to happen**: The MSI installer should have a unique product GUID (?) and be able to detect its running an upgrade. **How to reproduce it (as minimally and precisely as possible)**: - Install `grafana-7.0.6.windows-amd64.msi` all default settings - Install `grafana-7.1.0.windows-amd64.msi` all default settings Now you got two entries in apps list, easily reproducible in e.g. Windows Sandbox: ![grafik](https://user-images.githubusercontent.com/33002073/87819102-012d7180-c86c-11ea-82a4-8e00e76d2210.png) **Environment**: - Grafana version: 6.6.2 till 7.1.0 ongoing - OS Grafana is installed on: Windows 10 x64 2004
non_defect
grafana msi installer does not upgrade cleanly what happened after installing and updating grafana oss on windows several times using the msi installer i noticed it does not cleanly upgrade the old version but rather installs each version seperately in windows although its overriding the installation directory also the installdir speicherort information is missing from the msi installation column is empty ⚠ uninstalling an older version via windows software panel will uninstall the latest version what you expected to happen the msi installer should have a unique product guid and be able to detect its running an upgrade how to reproduce it as minimally and precisely as possible install grafana windows msi all default settings install grafana windows msi all default settings now you got two entries in apps list easily reproducible in e g windows sandbox environment grafana version till ongoing os grafana is installed on windows
0
19,717
3,248,233,683
IssuesEvent
2015-10-17 04:25:00
jimradford/superputty
https://api.github.com/repos/jimradford/superputty
closed
Shourtcuts
auto-migrated Priority-Medium Type-Defect
``` Hello, shourttucts are sometimes acting very wierd: A -> A [OK] L Ctrl -> control+ControlKey L Ctrl + A - > Ctrl+A [OK] Alt -> Alt+Menu R Alt -> Ctrl+Alt+Menu R Ctrl -> control+ControlKey L Ctrl+Alt -> Ctrl+Alt+Menu L Ctrl+Alt+A -> Ctrl+Alt+Menu I support stat A will give me A CTRL+A -> CTRL+A ALT+A -> ALT+A CTRL+ALT+A -> CTRL+ALT+A I cant map some combination like this. (1.4.0.4) ``` Original issue reported on code.google.com by `xsoft...@gmail.com` on 20 Jun 2014 at 2:13
1.0
Shourtcuts - ``` Hello, shourttucts are sometimes acting very wierd: A -> A [OK] L Ctrl -> control+ControlKey L Ctrl + A - > Ctrl+A [OK] Alt -> Alt+Menu R Alt -> Ctrl+Alt+Menu R Ctrl -> control+ControlKey L Ctrl+Alt -> Ctrl+Alt+Menu L Ctrl+Alt+A -> Ctrl+Alt+Menu I support stat A will give me A CTRL+A -> CTRL+A ALT+A -> ALT+A CTRL+ALT+A -> CTRL+ALT+A I cant map some combination like this. (1.4.0.4) ``` Original issue reported on code.google.com by `xsoft...@gmail.com` on 20 Jun 2014 at 2:13
defect
shourtcuts hello shourttucts are sometimes acting very wierd a a l ctrl control controlkey l ctrl a ctrl a alt alt menu r alt ctrl alt menu r ctrl control controlkey l ctrl alt ctrl alt menu l ctrl alt a ctrl alt menu i support stat a will give me a ctrl a ctrl a alt a alt a ctrl alt a ctrl alt a i cant map some combination like this original issue reported on code google com by xsoft gmail com on jun at
1
199,908
15,786,230,989
IssuesEvent
2021-04-01 17:26:58
FirebaseExtended/flutterfire
https://api.github.com/repos/FirebaseExtended/flutterfire
closed
[firebase_crashlytics] Example doesn't match the docs
plugin: crashlytics type: bug type: documentation
The docs [here](https://github.com/FirebaseExtended/flutterfire/tree/master/packages/firebase_crashlytics#use-the-plugin) say we just set `FlutterError.onError` and then call `runApp()`, later the docs say: >If you want to catch errors that occur in runZoned, you can supply Crashlytics.instance.recordError to the onError parameter But the example [here ](https://github.com/FirebaseExtended/flutterfire/blob/b49837250c619e7363463db732261737e093557d/packages/firebase_crashlytics/example/lib/main.dart#L20) does: ``` runZoned<Future<void>>(() async { runApp(MyApp()); }, onError: Crashlytics.instance.recordError); } ``` Which is confusing, as there's no explanation on the benefits of wrapping `RunApp` in a `runZoned`. Also the examples doesn't show `recordError`, just `log` which doesn't fully match the Android (`logException`) or the iOS (`recordError` with other parameters) docs, so this should be better explained as well.
1.0
[firebase_crashlytics] Example doesn't match the docs - The docs [here](https://github.com/FirebaseExtended/flutterfire/tree/master/packages/firebase_crashlytics#use-the-plugin) say we just set `FlutterError.onError` and then call `runApp()`, later the docs say: >If you want to catch errors that occur in runZoned, you can supply Crashlytics.instance.recordError to the onError parameter But the example [here ](https://github.com/FirebaseExtended/flutterfire/blob/b49837250c619e7363463db732261737e093557d/packages/firebase_crashlytics/example/lib/main.dart#L20) does: ``` runZoned<Future<void>>(() async { runApp(MyApp()); }, onError: Crashlytics.instance.recordError); } ``` Which is confusing, as there's no explanation on the benefits of wrapping `RunApp` in a `runZoned`. Also the examples doesn't show `recordError`, just `log` which doesn't fully match the Android (`logException`) or the iOS (`recordError` with other parameters) docs, so this should be better explained as well.
non_defect
example doesn t match the docs the docs say we just set fluttererror onerror and then call runapp later the docs say if you want to catch errors that occur in runzoned you can supply crashlytics instance recorderror to the onerror parameter but the example does runzoned async runapp myapp onerror crashlytics instance recorderror which is confusing as there s no explanation on the benefits of wrapping runapp in a runzoned also the examples doesn t show recorderror just log which doesn t fully match the android logexception or the ios recorderror with other parameters docs so this should be better explained as well
0
9,686
2,615,165,850
IssuesEvent
2015-03-01 06:46:11
chrsmith/reaver-wps
https://api.github.com/repos/chrsmith/reaver-wps
opened
Waiting for Beacon from BSSID: No obvious cause?
auto-migrated Priority-Triage Type-Defect
``` Answer the following questions for every issue submitted: 0. What version of Reaver are you using? (Only defects against the latest version will be considered.) v1.4 1. What operating system are you using (Linux is the only supported OS)? Ubuntu 12.04 2. Is your wireless card in monitor mode (yes/no)? Yes 3. What is the signal strength of the Access Point you are trying to crack? -12 4. What is the manufacturer and model # of the device you are trying to crack? 2WIRE249 2701 HG-D 5. What is the entire command line string you are supplying to reaver? reaver -i mon0 -b BSSID -c 9 -e 2WIRE249 6. Please describe what you think the issue is. Process sticks on "Waiting for beacon from BSSID" after the above string is entered. As noted, the string is entered while in monitor mode & the signal strength is well above -50, so these should not be the cause. I ran through all the support pages & comments to no avail. Found no evidence this router isn't supported. 7. Paste the output from Reaver below. [+] Switching mon0 to channel 9 [+] Waiting for beacon from BSSID That's all I got. If more info is needed, lemme know. ``` Original issue reported on code.google.com by `bettysue...@gmail.com` on 12 Apr 2013 at 3:16
1.0
Waiting for Beacon from BSSID: No obvious cause? - ``` Answer the following questions for every issue submitted: 0. What version of Reaver are you using? (Only defects against the latest version will be considered.) v1.4 1. What operating system are you using (Linux is the only supported OS)? Ubuntu 12.04 2. Is your wireless card in monitor mode (yes/no)? Yes 3. What is the signal strength of the Access Point you are trying to crack? -12 4. What is the manufacturer and model # of the device you are trying to crack? 2WIRE249 2701 HG-D 5. What is the entire command line string you are supplying to reaver? reaver -i mon0 -b BSSID -c 9 -e 2WIRE249 6. Please describe what you think the issue is. Process sticks on "Waiting for beacon from BSSID" after the above string is entered. As noted, the string is entered while in monitor mode & the signal strength is well above -50, so these should not be the cause. I ran through all the support pages & comments to no avail. Found no evidence this router isn't supported. 7. Paste the output from Reaver below. [+] Switching mon0 to channel 9 [+] Waiting for beacon from BSSID That's all I got. If more info is needed, lemme know. ``` Original issue reported on code.google.com by `bettysue...@gmail.com` on 12 Apr 2013 at 3:16
defect
waiting for beacon from bssid no obvious cause answer the following questions for every issue submitted what version of reaver are you using only defects against the latest version will be considered what operating system are you using linux is the only supported os ubuntu is your wireless card in monitor mode yes no yes what is the signal strength of the access point you are trying to crack what is the manufacturer and model of the device you are trying to crack hg d what is the entire command line string you are supplying to reaver reaver i b bssid c e please describe what you think the issue is process sticks on waiting for beacon from bssid after the above string is entered as noted the string is entered while in monitor mode the signal strength is well above so these should not be the cause i ran through all the support pages comments to no avail found no evidence this router isn t supported paste the output from reaver below switching to channel waiting for beacon from bssid that s all i got if more info is needed lemme know original issue reported on code google com by bettysue gmail com on apr at
1
36,118
7,866,943,517
IssuesEvent
2018-06-23 01:14:45
StrikeNP/trac_test
https://api.github.com/repos/StrikeNP/trac_test
closed
Troubleshooting the differences between DYCOMS II RF02 with K&K microphysics vs Morrison microphysics (Trac #450)
Migrated from Trac clubb_src defect dschanen@uwm.edu
**Problem Description** While exploring issues with simulating the ARM 97 case in https://github.com/larson-group/climate_process_team/issues/53 we found that DYCOMS II RF02 differs sharply in the rain fields when using Morrison and Khairoutdinov Kogan microphysics. This seems problematic, since with local formulation (l_local_kk = .true.) for a marine Sc cloud the two should produce very similar fields. I've tried a few experiments thus far and come to a few conclusions: * The simplest case is to simply compare Morrison and Khairoutdinov Kogan with Nc fixed and using the local formulation. * Looking at past plots on my protected page I would conclude this problem has existed with CLUBB-Morrison for a year or more (perhaps always?). See for e.g. [http://www.larson-group.com/dschanen/protected/micro_compare_plot/ this plot]. * In the simplest simulation the rain water mixing ratio and number concentration fields differ by as much as 10^-7^ on the first timestep and the Morrison simulation seems shifted down in altitude. Figuring out why that is seems like it should be easy to isolate. * When I manually print out values for rrainm_auto and PRC (the Morrison variable for rain water mixing ratio autoconversion) they appear to be in agreement up to round off and on the correct levels. I have no budgets for the Morrison code, but rrainm_cond and rrainm_accr are zero in the K&K simulation. The difference seems to be something in the rrainm_mc budget other than autoconversion. Attachments: [plot_explicit_ta_configs.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_explicit_ta_configs.maff) [plot_new_pdf_config_1_plot_2.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_config_1_plot_2.maff) [plot_combo_pdf_run_3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_combo_pdf_run_3.maff) [plot_input_fields_rtp3_thlp3_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_input_fields_rtp3_thlp3_1.maff) [plot_new_pdf_20180522_test_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_20180522_test_1.maff) [plot_attempts_8_10.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempts_8_10.maff) [plot_attempt_8_only.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempt_8_only.maff) [plot_beta_1p3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3.maff) [plot_beta_1p3_all.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3_all.maff) Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/450 ```json { "status": "closed", "changetime": "2011-08-25T18:17:17", "description": "'''Problem Description'''\nWhile exploring issues with simulating the ARM 97 case in climate_process_team:ticket:53 we found that DYCOMS II RF02 differs sharply in the rain fields when using Morrison and Khairoutdinov Kogan microphysics. This seems problematic, since with local formulation (l_local_kk = .true.) for a marine Sc cloud the two should produce very similar fields. I've tried a few experiments thus far and come to a few conclusions:\n\n * The simplest case is to simply compare Morrison and Khairoutdinov Kogan with Nc fixed and using the local formulation. \n * Looking at past plots on my protected page I would conclude this problem has existed with CLUBB-Morrison for a year or more (perhaps always?). See for e.g. [http://www.larson-group.com/dschanen/protected/micro_compare_plot/ this plot].\n * In the simplest simulation the rain water mixing ratio and number concentration fields differ by as much as 10^-7^ on the first timestep and the Morrison simulation seems shifted down in altitude. Figuring out why that is seems like it should be easy to isolate.\n * When I manually print out values for rrainm_auto and PRC (the Morrison variable for rain water mixing ratio autoconversion) they appear to be in agreement up to round off and on the correct levels. I have no budgets for the Morrison code, but rrainm_cond and rrainm_accr are zero in the K&K simulation. The difference seems to be something in the rrainm_mc budget other than autoconversion.\n\n", "reporter": "dschanen@uwm.edu", "cc": "vlarson@uwm.edu, roehl@uwm.edu", "resolution": "wontfix", "_ts": "1314296237000000", "component": "clubb_src", "summary": "Troubleshooting the differences between DYCOMS II RF02 with K&K microphysics vs Morrison microphysics", "priority": "major", "keywords": "", "time": "2011-08-15T19:21:46", "milestone": "", "owner": "dschanen@uwm.edu", "type": "defect" } ```
1.0
Troubleshooting the differences between DYCOMS II RF02 with K&K microphysics vs Morrison microphysics (Trac #450) - **Problem Description** While exploring issues with simulating the ARM 97 case in https://github.com/larson-group/climate_process_team/issues/53 we found that DYCOMS II RF02 differs sharply in the rain fields when using Morrison and Khairoutdinov Kogan microphysics. This seems problematic, since with local formulation (l_local_kk = .true.) for a marine Sc cloud the two should produce very similar fields. I've tried a few experiments thus far and come to a few conclusions: * The simplest case is to simply compare Morrison and Khairoutdinov Kogan with Nc fixed and using the local formulation. * Looking at past plots on my protected page I would conclude this problem has existed with CLUBB-Morrison for a year or more (perhaps always?). See for e.g. [http://www.larson-group.com/dschanen/protected/micro_compare_plot/ this plot]. * In the simplest simulation the rain water mixing ratio and number concentration fields differ by as much as 10^-7^ on the first timestep and the Morrison simulation seems shifted down in altitude. Figuring out why that is seems like it should be easy to isolate. * When I manually print out values for rrainm_auto and PRC (the Morrison variable for rain water mixing ratio autoconversion) they appear to be in agreement up to round off and on the correct levels. I have no budgets for the Morrison code, but rrainm_cond and rrainm_accr are zero in the K&K simulation. The difference seems to be something in the rrainm_mc budget other than autoconversion. Attachments: [plot_explicit_ta_configs.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_explicit_ta_configs.maff) [plot_new_pdf_config_1_plot_2.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_config_1_plot_2.maff) [plot_combo_pdf_run_3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_combo_pdf_run_3.maff) [plot_input_fields_rtp3_thlp3_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_input_fields_rtp3_thlp3_1.maff) [plot_new_pdf_20180522_test_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_20180522_test_1.maff) [plot_attempts_8_10.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempts_8_10.maff) [plot_attempt_8_only.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempt_8_only.maff) [plot_beta_1p3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3.maff) [plot_beta_1p3_all.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3_all.maff) Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/450 ```json { "status": "closed", "changetime": "2011-08-25T18:17:17", "description": "'''Problem Description'''\nWhile exploring issues with simulating the ARM 97 case in climate_process_team:ticket:53 we found that DYCOMS II RF02 differs sharply in the rain fields when using Morrison and Khairoutdinov Kogan microphysics. This seems problematic, since with local formulation (l_local_kk = .true.) for a marine Sc cloud the two should produce very similar fields. I've tried a few experiments thus far and come to a few conclusions:\n\n * The simplest case is to simply compare Morrison and Khairoutdinov Kogan with Nc fixed and using the local formulation. \n * Looking at past plots on my protected page I would conclude this problem has existed with CLUBB-Morrison for a year or more (perhaps always?). See for e.g. [http://www.larson-group.com/dschanen/protected/micro_compare_plot/ this plot].\n * In the simplest simulation the rain water mixing ratio and number concentration fields differ by as much as 10^-7^ on the first timestep and the Morrison simulation seems shifted down in altitude. Figuring out why that is seems like it should be easy to isolate.\n * When I manually print out values for rrainm_auto and PRC (the Morrison variable for rain water mixing ratio autoconversion) they appear to be in agreement up to round off and on the correct levels. I have no budgets for the Morrison code, but rrainm_cond and rrainm_accr are zero in the K&K simulation. The difference seems to be something in the rrainm_mc budget other than autoconversion.\n\n", "reporter": "dschanen@uwm.edu", "cc": "vlarson@uwm.edu, roehl@uwm.edu", "resolution": "wontfix", "_ts": "1314296237000000", "component": "clubb_src", "summary": "Troubleshooting the differences between DYCOMS II RF02 with K&K microphysics vs Morrison microphysics", "priority": "major", "keywords": "", "time": "2011-08-15T19:21:46", "milestone": "", "owner": "dschanen@uwm.edu", "type": "defect" } ```
defect
troubleshooting the differences between dycoms ii with k k microphysics vs morrison microphysics trac problem description while exploring issues with simulating the arm case in we found that dycoms ii differs sharply in the rain fields when using morrison and khairoutdinov kogan microphysics this seems problematic since with local formulation l local kk true for a marine sc cloud the two should produce very similar fields i ve tried a few experiments thus far and come to a few conclusions the simplest case is to simply compare morrison and khairoutdinov kogan with nc fixed and using the local formulation looking at past plots on my protected page i would conclude this problem has existed with clubb morrison for a year or more perhaps always see for e g in the simplest simulation the rain water mixing ratio and number concentration fields differ by as much as on the first timestep and the morrison simulation seems shifted down in altitude figuring out why that is seems like it should be easy to isolate when i manually print out values for rrainm auto and prc the morrison variable for rain water mixing ratio autoconversion they appear to be in agreement up to round off and on the correct levels i have no budgets for the morrison code but rrainm cond and rrainm accr are zero in the k k simulation the difference seems to be something in the rrainm mc budget other than autoconversion attachments migrated from json status closed changetime description problem description nwhile exploring issues with simulating the arm case in climate process team ticket we found that dycoms ii differs sharply in the rain fields when using morrison and khairoutdinov kogan microphysics this seems problematic since with local formulation l local kk true for a marine sc cloud the two should produce very similar fields i ve tried a few experiments thus far and come to a few conclusions n n the simplest case is to simply compare morrison and khairoutdinov kogan with nc fixed and using the local formulation n looking at past plots on my protected page i would conclude this problem has existed with clubb morrison for a year or more perhaps always see for e g n in the simplest simulation the rain water mixing ratio and number concentration fields differ by as much as on the first timestep and the morrison simulation seems shifted down in altitude figuring out why that is seems like it should be easy to isolate n when i manually print out values for rrainm auto and prc the morrison variable for rain water mixing ratio autoconversion they appear to be in agreement up to round off and on the correct levels i have no budgets for the morrison code but rrainm cond and rrainm accr are zero in the k k simulation the difference seems to be something in the rrainm mc budget other than autoconversion n n reporter dschanen uwm edu cc vlarson uwm edu roehl uwm edu resolution wontfix ts component clubb src summary troubleshooting the differences between dycoms ii with k k microphysics vs morrison microphysics priority major keywords time milestone owner dschanen uwm edu type defect
1
37,008
8,201,463,770
IssuesEvent
2018-09-01 18:00:28
jccastillo0007/eFacturaT
https://api.github.com/repos/jccastillo0007/eFacturaT
opened
Navision - Cuando la moneda de pago es distinta a la moneda del CFDI relacionado, debe enviar el tipo de cambio
bug defect
Te envié un correo. Aquí te lo ponto CUANDO LA MONEDA DEL DOCUMENTO RELACIONADO, ES DISTINTA A LA MONEDA DEL PAGO, SE DEBE ENVIAR EL TIPO DE CAMBIO A NIVEL DETALLE: EN ESTE EJEMPLO, LA FACTURA SE EMITIÓ EN USD, Y EL PAGO ES EN MXN ENTONCES A NIVEL DETALLE (RESALTADO EN AMARILLO), FALTÓ REPORTAR EL TIPO DE CAMBIO, TANTO EN XML COMO EN PDF. <pagos:Pagos xmlns:pagos="http://www.sat.gob.mx/Pagos" Version="1.0"> <pagos:Pago CtaBeneficiario="03504856627" CtaOrdenante="072306003233612088" FechaPago="2018-08-30T12:00:00" FormaDePagoP="03" MonedaP="MXN" Monto="7674.52" NumOperacion="1" RfcEmisorCtaBen="SIN9412025I4" RfcEmisorCtaOrd="BMN930209927"> <pagos:DoctoRelacionado Folio="188" IdDocumento="08b44fdd-c1d5-4226-8032-f9b745ce8806" ImpPagado="7674.52" ImpSaldoAnt="405.30" ImpSaldoInsoluto="0.00" MetodoDePagoDR="PUE" MonedaDR="USD" Serie="18FVMX"/> </pagos:Pago> </pagos:Pagos>
1.0
Navision - Cuando la moneda de pago es distinta a la moneda del CFDI relacionado, debe enviar el tipo de cambio - Te envié un correo. Aquí te lo ponto CUANDO LA MONEDA DEL DOCUMENTO RELACIONADO, ES DISTINTA A LA MONEDA DEL PAGO, SE DEBE ENVIAR EL TIPO DE CAMBIO A NIVEL DETALLE: EN ESTE EJEMPLO, LA FACTURA SE EMITIÓ EN USD, Y EL PAGO ES EN MXN ENTONCES A NIVEL DETALLE (RESALTADO EN AMARILLO), FALTÓ REPORTAR EL TIPO DE CAMBIO, TANTO EN XML COMO EN PDF. <pagos:Pagos xmlns:pagos="http://www.sat.gob.mx/Pagos" Version="1.0"> <pagos:Pago CtaBeneficiario="03504856627" CtaOrdenante="072306003233612088" FechaPago="2018-08-30T12:00:00" FormaDePagoP="03" MonedaP="MXN" Monto="7674.52" NumOperacion="1" RfcEmisorCtaBen="SIN9412025I4" RfcEmisorCtaOrd="BMN930209927"> <pagos:DoctoRelacionado Folio="188" IdDocumento="08b44fdd-c1d5-4226-8032-f9b745ce8806" ImpPagado="7674.52" ImpSaldoAnt="405.30" ImpSaldoInsoluto="0.00" MetodoDePagoDR="PUE" MonedaDR="USD" Serie="18FVMX"/> </pagos:Pago> </pagos:Pagos>
defect
navision cuando la moneda de pago es distinta a la moneda del cfdi relacionado debe enviar el tipo de cambio te envié un correo aquí te lo ponto cuando la moneda del documento relacionado es distinta a la moneda del pago se debe enviar el tipo de cambio a nivel detalle en este ejemplo la factura se emitió en usd y el pago es en mxn entonces a nivel detalle resaltado en amarillo faltó reportar el tipo de cambio tanto en xml como en pdf
1
759,746
26,609,044,515
IssuesEvent
2023-01-23 22:00:35
coral-xyz/backpack
https://api.github.com/repos/coral-xyz/backpack
closed
recovery flow should support recover of multiple wallets if all derived from the same mnemonic
priority 1
Potentially spanning multiple blockchains.
1.0
recovery flow should support recover of multiple wallets if all derived from the same mnemonic - Potentially spanning multiple blockchains.
non_defect
recovery flow should support recover of multiple wallets if all derived from the same mnemonic potentially spanning multiple blockchains
0
18,206
3,035,168,298
IssuesEvent
2015-08-06 00:26:33
mkpanchal/phppi
https://api.github.com/repos/mkpanchal/phppi
closed
Wrong GD support check
auto-migrated Priority-Medium Type-Defect
``` Hi! Now i'm having WD MyBookLive. It's Debian 5.0.4 as OS. I've install phppi to this. But install script say that "Your GD install is missing JPEG support" and show notice in top: "Notice: Undefined index: JPEG Support in /DataVolume/shares/www/photo_pub/phppi/admin/includes/classes/phppi.php on line 89". But phpinfo() says: gd GD Support enabled GD Version 2.0 or higher FreeType Support enabled FreeType Linkage with freetype FreeType Version 2.3.7 T1Lib Support enabled GIF Read Support enabled GIF Create Support enabled JPG Support enabled PNG Support enabled WBMP Support enabled ``` Original issue reported on code.google.com by `Uksus...@gmail.com` on 11 Apr 2013 at 11:47
1.0
Wrong GD support check - ``` Hi! Now i'm having WD MyBookLive. It's Debian 5.0.4 as OS. I've install phppi to this. But install script say that "Your GD install is missing JPEG support" and show notice in top: "Notice: Undefined index: JPEG Support in /DataVolume/shares/www/photo_pub/phppi/admin/includes/classes/phppi.php on line 89". But phpinfo() says: gd GD Support enabled GD Version 2.0 or higher FreeType Support enabled FreeType Linkage with freetype FreeType Version 2.3.7 T1Lib Support enabled GIF Read Support enabled GIF Create Support enabled JPG Support enabled PNG Support enabled WBMP Support enabled ``` Original issue reported on code.google.com by `Uksus...@gmail.com` on 11 Apr 2013 at 11:47
defect
wrong gd support check hi now i m having wd mybooklive it s debian as os i ve install phppi to this but install script say that your gd install is missing jpeg support and show notice in top notice undefined index jpeg support in datavolume shares www photo pub phppi admin includes classes phppi php on line but phpinfo says gd gd support enabled gd version or higher freetype support enabled freetype linkage with freetype freetype version support enabled gif read support enabled gif create support enabled jpg support enabled png support enabled wbmp support enabled original issue reported on code google com by uksus gmail com on apr at
1
22,134
3,602,634,721
IssuesEvent
2016-02-03 16:17:12
department-of-veterans-affairs/gi-bill-comparison-tool
https://api.github.com/repos/department-of-veterans-affairs/gi-bill-comparison-tool
closed
Remove private repository dependency
cleanup Defect
Doing the initial setup, but couldn't get past `bundle install` because of dependency on a private repo: ![image](https://cloud.githubusercontent.com/assets/950486/10834435/b63bf61a-7e6f-11e5-930e-60be091a7ac5.png) Tried a work-around by not including `development` and `test` groups, but didn't work for some reason: ![image](https://cloud.githubusercontent.com/assets/950486/10834439/ba53023e-7e6f-11e5-8aa1-832808de9149.png) For now, just locally commenting out the group this dependency is enclosed in to get past. ![image](https://cloud.githubusercontent.com/assets/950486/10834473/f1370e12-7e6f-11e5-8b5b-7060ad8a4289.png) I'm a newbie to Capistrano. Is this required? (cc @mphprogrammer )
1.0
Remove private repository dependency - Doing the initial setup, but couldn't get past `bundle install` because of dependency on a private repo: ![image](https://cloud.githubusercontent.com/assets/950486/10834435/b63bf61a-7e6f-11e5-930e-60be091a7ac5.png) Tried a work-around by not including `development` and `test` groups, but didn't work for some reason: ![image](https://cloud.githubusercontent.com/assets/950486/10834439/ba53023e-7e6f-11e5-8aa1-832808de9149.png) For now, just locally commenting out the group this dependency is enclosed in to get past. ![image](https://cloud.githubusercontent.com/assets/950486/10834473/f1370e12-7e6f-11e5-8b5b-7060ad8a4289.png) I'm a newbie to Capistrano. Is this required? (cc @mphprogrammer )
defect
remove private repository dependency doing the initial setup but couldn t get past bundle install because of dependency on a private repo tried a work around by not including development and test groups but didn t work for some reason for now just locally commenting out the group this dependency is enclosed in to get past i m a newbie to capistrano is this required cc mphprogrammer
1
11,398
2,649,926,671
IssuesEvent
2015-03-15 12:52:10
karulis/pybluez
https://api.github.com/repos/karulis/pybluez
closed
No way to select device id for device discovery
auto-migrated Priority-Medium Type-Defect
``` My Linux device contains 3 dongles (class 1,2,3). I want to select one of them for device discovery (user may select which to use via a GUI). Is there a way to select hci0,1,2 through code? Al my tests with socket.bind did not work. Any tips? Thanks in advance! Mart ``` Original issue reported on code.google.com by `mart.ste...@gmail.com` on 3 Feb 2015 at 6:56
1.0
No way to select device id for device discovery - ``` My Linux device contains 3 dongles (class 1,2,3). I want to select one of them for device discovery (user may select which to use via a GUI). Is there a way to select hci0,1,2 through code? Al my tests with socket.bind did not work. Any tips? Thanks in advance! Mart ``` Original issue reported on code.google.com by `mart.ste...@gmail.com` on 3 Feb 2015 at 6:56
defect
no way to select device id for device discovery my linux device contains dongles class i want to select one of them for device discovery user may select which to use via a gui is there a way to select through code al my tests with socket bind did not work any tips thanks in advance mart original issue reported on code google com by mart ste gmail com on feb at
1
20,970
3,441,841,924
IssuesEvent
2015-12-14 20:06:50
wdg/blacktree-secrets
https://api.github.com/repos/wdg/blacktree-secrets
closed
Secrets Website is down.
auto-migrated Priority-Medium Type-Defect
``` The website is down. Unable to Update Secrets. ``` Original issue reported on code.google.com by `themacin...@gmail.com` on 11 May 2008 at 5:53
1.0
Secrets Website is down. - ``` The website is down. Unable to Update Secrets. ``` Original issue reported on code.google.com by `themacin...@gmail.com` on 11 May 2008 at 5:53
defect
secrets website is down the website is down unable to update secrets original issue reported on code google com by themacin gmail com on may at
1
157,676
13,710,747,993
IssuesEvent
2020-10-02 02:06:34
CMPUT301F20T41/boromi
https://api.github.com/repos/CMPUT301F20T41/boromi
closed
Compile List of Team Members
Priority: High Status: In Progress Type: Documentation
## Description Each team member is required to make a commit to the doc/teams.txt file. Each commit should just be a line with their CCID and GitHub username. ### Example ```bchelle blchelle```
1.0
Compile List of Team Members - ## Description Each team member is required to make a commit to the doc/teams.txt file. Each commit should just be a line with their CCID and GitHub username. ### Example ```bchelle blchelle```
non_defect
compile list of team members description each team member is required to make a commit to the doc teams txt file each commit should just be a line with their ccid and github username example bchelle blchelle
0
40,078
2,862,977,628
IssuesEvent
2015-06-04 09:14:26
OCHA-DAP/hdx-ckan
https://api.github.com/repos/OCHA-DAP/hdx-ckan
reopened
Custom org page: Admin - topline numbers
Custom org page Priority-Medium
- [ ] Dataset ID - do we need it? In case we don't it should be removed. For custom location we don't have it - ping @cjhendrix - [ ] Create a different section for topline numbers (similar with custom location page) In the end we should have the following sections: 1. General information 2. Custom styles - logos - Background logo same... - Custom Styling 3. Topline numbers - Resource ID 4. Visualization Configuration
1.0
Custom org page: Admin - topline numbers - - [ ] Dataset ID - do we need it? In case we don't it should be removed. For custom location we don't have it - ping @cjhendrix - [ ] Create a different section for topline numbers (similar with custom location page) In the end we should have the following sections: 1. General information 2. Custom styles - logos - Background logo same... - Custom Styling 3. Topline numbers - Resource ID 4. Visualization Configuration
non_defect
custom org page admin topline numbers dataset id do we need it in case we don t it should be removed for custom location we don t have it ping cjhendrix create a different section for topline numbers similar with custom location page in the end we should have the following sections general information custom styles logos background logo same custom styling topline numbers resource id visualization configuration
0
51,662
13,210,117,750
IssuesEvent
2020-08-15 15:13:51
networkx/networkx
https://api.github.com/repos/networkx/networkx
closed
steiner_tree should accept MultiGraph
Defect
I'm using `steiner_tree` on a road network which may have multiple edges between nodes. It looks like `steiner_tree` will fail if passed a `MultiGraph`: - as a next-to-last step, edges are generated as `(u, v)` tuples pairwise [here](https://github.com/networkx/networkx/blob/master/networkx/algorithms/approximation/steinertree.py#L87) - before being passed to `G.edge_subgraph` which raises a `ValueError` from `nx.filter.show_multiedges` This should reproduce the issue: ```python import networkx as nx import networkx.algorithms.approximation as nxa def test_simple_steiner_tree(): G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 1), (2, 3, 1), (3, 4, 1), (3, 5, 1) ]) terminal_nodes = [2, 4, 5] expected_edges = [ (2, 3), (3, 4), (3, 5) ] T = nxa.steiner_tree(G, terminal_nodes) assert list(T.edges) == expected_edges def test_multi_steiner_tree(): G = nx.MultiGraph() G.add_weighted_edges_from([ (1, 2, 1), (2, 3, 1), (2, 3, 999), (3, 4, 1), (3, 5, 1) ]) terminal_nodes = [2, 4, 5] expected_edges = [ (2, 3, 0), (3, 4, 0), # first edge has weight one (3, 5, 0) ] T = nxa.steiner_tree(G, terminal_nodes) test_simple_steiner_tree() # passes test_multi_steiner_tree() # throws ValueError ``` The quick fix might be to add `@not_implemented_for('multigraph')`. For my current purposes, the following does the trick to handle the `MultiGraph` case: ```python # get unique links pairwise (u, v) links = set(chain.from_iterable( pairwise(d['path']) for u, v, d in mst_edges )) # for each link in the chain multi_edges = [] for u, v in links: # consider each edge between the pair of nodes, # keeping track of the one with the minimum weight # (there may be a better way - convenience functions/accessors?) num_edges = G.number_of_edges(u, v) min_k = 0 min_weight = None for k in range(num_edges): curr_weight = G.edges[u, v, k][weight] if min_weight is None: min_weight = curr_weight elif curr_weight < min_weight: min_weight = curr_weight min_k = k multi_edges.append((u, v, min_k)) # create subgraph from multi edges - list of (u, v, k) T = G.edge_subgraph(multi_edges) ```
1.0
steiner_tree should accept MultiGraph - I'm using `steiner_tree` on a road network which may have multiple edges between nodes. It looks like `steiner_tree` will fail if passed a `MultiGraph`: - as a next-to-last step, edges are generated as `(u, v)` tuples pairwise [here](https://github.com/networkx/networkx/blob/master/networkx/algorithms/approximation/steinertree.py#L87) - before being passed to `G.edge_subgraph` which raises a `ValueError` from `nx.filter.show_multiedges` This should reproduce the issue: ```python import networkx as nx import networkx.algorithms.approximation as nxa def test_simple_steiner_tree(): G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 1), (2, 3, 1), (3, 4, 1), (3, 5, 1) ]) terminal_nodes = [2, 4, 5] expected_edges = [ (2, 3), (3, 4), (3, 5) ] T = nxa.steiner_tree(G, terminal_nodes) assert list(T.edges) == expected_edges def test_multi_steiner_tree(): G = nx.MultiGraph() G.add_weighted_edges_from([ (1, 2, 1), (2, 3, 1), (2, 3, 999), (3, 4, 1), (3, 5, 1) ]) terminal_nodes = [2, 4, 5] expected_edges = [ (2, 3, 0), (3, 4, 0), # first edge has weight one (3, 5, 0) ] T = nxa.steiner_tree(G, terminal_nodes) test_simple_steiner_tree() # passes test_multi_steiner_tree() # throws ValueError ``` The quick fix might be to add `@not_implemented_for('multigraph')`. For my current purposes, the following does the trick to handle the `MultiGraph` case: ```python # get unique links pairwise (u, v) links = set(chain.from_iterable( pairwise(d['path']) for u, v, d in mst_edges )) # for each link in the chain multi_edges = [] for u, v in links: # consider each edge between the pair of nodes, # keeping track of the one with the minimum weight # (there may be a better way - convenience functions/accessors?) num_edges = G.number_of_edges(u, v) min_k = 0 min_weight = None for k in range(num_edges): curr_weight = G.edges[u, v, k][weight] if min_weight is None: min_weight = curr_weight elif curr_weight < min_weight: min_weight = curr_weight min_k = k multi_edges.append((u, v, min_k)) # create subgraph from multi edges - list of (u, v, k) T = G.edge_subgraph(multi_edges) ```
defect
steiner tree should accept multigraph i m using steiner tree on a road network which may have multiple edges between nodes it looks like steiner tree will fail if passed a multigraph as a next to last step edges are generated as u v tuples pairwise before being passed to g edge subgraph which raises a valueerror from nx filter show multiedges this should reproduce the issue python import networkx as nx import networkx algorithms approximation as nxa def test simple steiner tree g nx graph g add weighted edges from terminal nodes expected edges t nxa steiner tree g terminal nodes assert list t edges expected edges def test multi steiner tree g nx multigraph g add weighted edges from terminal nodes expected edges first edge has weight one t nxa steiner tree g terminal nodes test simple steiner tree passes test multi steiner tree throws valueerror the quick fix might be to add not implemented for multigraph for my current purposes the following does the trick to handle the multigraph case python get unique links pairwise u v links set chain from iterable pairwise d for u v d in mst edges for each link in the chain multi edges for u v in links consider each edge between the pair of nodes keeping track of the one with the minimum weight there may be a better way convenience functions accessors num edges g number of edges u v min k min weight none for k in range num edges curr weight g edges if min weight is none min weight curr weight elif curr weight min weight min weight curr weight min k k multi edges append u v min k create subgraph from multi edges list of u v k t g edge subgraph multi edges
1
66,164
20,022,670,444
IssuesEvent
2022-02-01 17:50:42
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
opened
The timeline used for maximised widgets has no jump-to-bottom button.
T-Defect
### Steps to reproduce 1. go to a room with a widget 2. maximise it 3. scroll up the rightpanel chat view 4. observe no green jump-to-bottom button appears. ### Outcome #### What did you expect? all timelines should have a jump-to-bottom FAB on the bottom right so if you scroll up you can quickly get back to the current day. #### What happened instead? no button. ### Operating system macOS ### Application version 1.9.9 ### How did you install the app? chat.fosdem.org ### Homeserver attendees.fosdem.org ### Will you send logs? No
1.0
The timeline used for maximised widgets has no jump-to-bottom button. - ### Steps to reproduce 1. go to a room with a widget 2. maximise it 3. scroll up the rightpanel chat view 4. observe no green jump-to-bottom button appears. ### Outcome #### What did you expect? all timelines should have a jump-to-bottom FAB on the bottom right so if you scroll up you can quickly get back to the current day. #### What happened instead? no button. ### Operating system macOS ### Application version 1.9.9 ### How did you install the app? chat.fosdem.org ### Homeserver attendees.fosdem.org ### Will you send logs? No
defect
the timeline used for maximised widgets has no jump to bottom button steps to reproduce go to a room with a widget maximise it scroll up the rightpanel chat view observe no green jump to bottom button appears outcome what did you expect all timelines should have a jump to bottom fab on the bottom right so if you scroll up you can quickly get back to the current day what happened instead no button operating system macos application version how did you install the app chat fosdem org homeserver attendees fosdem org will you send logs no
1
28,226
5,221,389,717
IssuesEvent
2017-01-27 01:18:57
elTiempoVuela/https-finder
https://api.github.com/repos/elTiempoVuela/https-finder
closed
Integrate with 'Force HTTPS' feature of NoScript addon
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. Browse to a HTTP site that has a HTTPS version 2. Allow HTTPS Finder to redirect you to the HTTPS page 3. HTTPS Finder will offer to save a rule for HTTPS Everywhere What is the expected output? What do you see instead? I would like to be able to automatically add the site to the NoScript 'Force HTTPS' settings as well as HTTPS Everywhere. What version of the product are you using? On what operating system? HTTPS Finder 0.85 on Ubuntu Linux Please provide any additional information below. NoScript Force HTTPS settings are contained in the about:config 'noscript.httpsForced' setting (which is of type string), delimited with line breaks. Syntax is described at http://noscript.net/faq#qa6_3 ``` Original issue reported on code.google.com by `carl.ant...@gmail.com` on 27 Sep 2012 at 2:24
1.0
Integrate with 'Force HTTPS' feature of NoScript addon - ``` What steps will reproduce the problem? 1. Browse to a HTTP site that has a HTTPS version 2. Allow HTTPS Finder to redirect you to the HTTPS page 3. HTTPS Finder will offer to save a rule for HTTPS Everywhere What is the expected output? What do you see instead? I would like to be able to automatically add the site to the NoScript 'Force HTTPS' settings as well as HTTPS Everywhere. What version of the product are you using? On what operating system? HTTPS Finder 0.85 on Ubuntu Linux Please provide any additional information below. NoScript Force HTTPS settings are contained in the about:config 'noscript.httpsForced' setting (which is of type string), delimited with line breaks. Syntax is described at http://noscript.net/faq#qa6_3 ``` Original issue reported on code.google.com by `carl.ant...@gmail.com` on 27 Sep 2012 at 2:24
defect
integrate with force https feature of noscript addon what steps will reproduce the problem browse to a http site that has a https version allow https finder to redirect you to the https page https finder will offer to save a rule for https everywhere what is the expected output what do you see instead i would like to be able to automatically add the site to the noscript force https settings as well as https everywhere what version of the product are you using on what operating system https finder on ubuntu linux please provide any additional information below noscript force https settings are contained in the about config noscript httpsforced setting which is of type string delimited with line breaks syntax is described at original issue reported on code google com by carl ant gmail com on sep at
1
39,064
9,195,566,149
IssuesEvent
2019-03-07 02:58:35
STEllAR-GROUP/phylanx
https://api.github.com/repos/STEllAR-GROUP/phylanx
closed
PhySL: lambda returning nil
category: PhySL submodule: backend type: defect
``` define( test, lambda(nil) ) test() ``` throws: ``` physl: exception caught: test.physl(3, 5): lambda:: the expression representing the function target has not been initialized: HPX(invalid_status) ```
1.0
PhySL: lambda returning nil - ``` define( test, lambda(nil) ) test() ``` throws: ``` physl: exception caught: test.physl(3, 5): lambda:: the expression representing the function target has not been initialized: HPX(invalid_status) ```
defect
physl lambda returning nil define test lambda nil test throws physl exception caught test physl lambda the expression representing the function target has not been initialized hpx invalid status
1
20,005
3,288,453,022
IssuesEvent
2015-10-29 15:13:56
g4gaurang/bcbsmaissuestracker
https://api.github.com/repos/g4gaurang/bcbsmaissuestracker
opened
Capture File Fill Diabetes and Maternity Values error - % retail and mail service supply
Environment-Production Priority-ShowStopper Status- New Type-Defect
Hello, One of our BEs attempted to generate an SBC, but received the Capture File Fill Diabetes and Maternity Values error. It was determined that the calculator is unable to generate due to the % retail and mail service supply values demonstrated. Note that these values are correct, and the calculator should account for them. The custom plan generated was 3456851_Entegris, Inc_Blue_Care_Elect_Saver_90_1500_154854BS Please fix globally ASAP ![screen shot 2015-10-29 at 10 24 56 am](https://cloud.githubusercontent.com/assets/13453008/10822530/096e212e-7e2e-11e5-82e5-0403edf5c66c.png) ![screen shot 2015-10-29 at 10 24 41 am](https://cloud.githubusercontent.com/assets/13453008/10822485/dc186036-7e2d-11e5-8f79-2a7fdcf8c4c8.png)
1.0
Capture File Fill Diabetes and Maternity Values error - % retail and mail service supply - Hello, One of our BEs attempted to generate an SBC, but received the Capture File Fill Diabetes and Maternity Values error. It was determined that the calculator is unable to generate due to the % retail and mail service supply values demonstrated. Note that these values are correct, and the calculator should account for them. The custom plan generated was 3456851_Entegris, Inc_Blue_Care_Elect_Saver_90_1500_154854BS Please fix globally ASAP ![screen shot 2015-10-29 at 10 24 56 am](https://cloud.githubusercontent.com/assets/13453008/10822530/096e212e-7e2e-11e5-82e5-0403edf5c66c.png) ![screen shot 2015-10-29 at 10 24 41 am](https://cloud.githubusercontent.com/assets/13453008/10822485/dc186036-7e2d-11e5-8f79-2a7fdcf8c4c8.png)
defect
capture file fill diabetes and maternity values error retail and mail service supply hello one of our bes attempted to generate an sbc but received the capture file fill diabetes and maternity values error it was determined that the calculator is unable to generate due to the retail and mail service supply values demonstrated note that these values are correct and the calculator should account for them the custom plan generated was entegris inc blue care elect saver please fix globally asap
1
662,672
22,148,843,729
IssuesEvent
2022-06-03 14:43:20
WebSoftDevs/MusicBot
https://api.github.com/repos/WebSoftDevs/MusicBot
closed
Add healtcheck command
priority-low Back-End idea-accepted difficulty-easy
We need to know if the application has a connection to the database, my idea is to create a hidden healtcheck command which will query the database and display a message if the connection exists. Mentioned command can help us diagnose any kind of issues.
1.0
Add healtcheck command - We need to know if the application has a connection to the database, my idea is to create a hidden healtcheck command which will query the database and display a message if the connection exists. Mentioned command can help us diagnose any kind of issues.
non_defect
add healtcheck command we need to know if the application has a connection to the database my idea is to create a hidden healtcheck command which will query the database and display a message if the connection exists mentioned command can help us diagnose any kind of issues
0
21,429
3,507,901,847
IssuesEvent
2016-01-08 15:33:23
TimVelo/giflib
https://api.github.com/repos/TimVelo/giflib
closed
What steps will reproduce the problem? 1.
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. 2. 3. jhvytguycvfnhbg What is the expected output? What do you see instead? ;kjmo What version of the product are you using? On what operating system? Please provide any additional information below. ``` Original issue reported on code.google.com by `hem6kach...@gmail.com` on 2 Jul 2015 at 7:11
1.0
What steps will reproduce the problem? 1. - ``` What steps will reproduce the problem? 1. 2. 3. jhvytguycvfnhbg What is the expected output? What do you see instead? ;kjmo What version of the product are you using? On what operating system? Please provide any additional information below. ``` Original issue reported on code.google.com by `hem6kach...@gmail.com` on 2 Jul 2015 at 7:11
defect
what steps will reproduce the problem what steps will reproduce the problem jhvytguycvfnhbg what is the expected output what do you see instead kjmo what version of the product are you using on what operating system please provide any additional information below original issue reported on code google com by gmail com on jul at
1
77,021
26,719,837,937
IssuesEvent
2023-01-29 01:20:38
amyjko/bookish
https://api.github.com/repos/amyjko/bookish
opened
Validate theme CSS values
defect
We currently do no validation, so there's no feedback for why something isn't working.
1.0
Validate theme CSS values - We currently do no validation, so there's no feedback for why something isn't working.
defect
validate theme css values we currently do no validation so there s no feedback for why something isn t working
1
109,336
4,385,944,840
IssuesEvent
2016-08-08 10:52:13
roschaefer/story.board
https://api.github.com/repos/roschaefer/story.board
reopened
Cow name in text component
point: 1 Priority: high
As a reporter I want to have the markup "{exemplar}" replaced by the setting cow name (defined in #82) to write about a individual with a name.
1.0
Cow name in text component - As a reporter I want to have the markup "{exemplar}" replaced by the setting cow name (defined in #82) to write about a individual with a name.
non_defect
cow name in text component as a reporter i want to have the markup exemplar replaced by the setting cow name defined in to write about a individual with a name
0
73,250
24,524,450,229
IssuesEvent
2022-10-11 12:07:24
vector-im/element-android
https://api.github.com/repos/vector-im/element-android
opened
Messages don't always send
T-Defect A-Local echo A-Timeline S-Major O-Occasional
### Steps to reproduce 1. Type out a message 2. Click send, then the system back button ### Outcome #### What did you expect? Local echo appears, message is sent #### What happened instead? Sometimes the message doesn't send and vanishes altogether Often the local echo won't update for 10+ seconds ### Your phone model _No response_ ### Operating system version _No response_ ### Application version and app store Element 1.5.2, SDK 1.5.2, olm 3.2.12 ### Homeserver matrix.org ### Will you send logs? Yes ### Are you willing to provide a PR? No
1.0
Messages don't always send - ### Steps to reproduce 1. Type out a message 2. Click send, then the system back button ### Outcome #### What did you expect? Local echo appears, message is sent #### What happened instead? Sometimes the message doesn't send and vanishes altogether Often the local echo won't update for 10+ seconds ### Your phone model _No response_ ### Operating system version _No response_ ### Application version and app store Element 1.5.2, SDK 1.5.2, olm 3.2.12 ### Homeserver matrix.org ### Will you send logs? Yes ### Are you willing to provide a PR? No
defect
messages don t always send steps to reproduce type out a message click send then the system back button outcome what did you expect local echo appears message is sent what happened instead sometimes the message doesn t send and vanishes altogether often the local echo won t update for seconds your phone model no response operating system version no response application version and app store element sdk olm homeserver matrix org will you send logs yes are you willing to provide a pr no
1
11,174
2,641,249,509
IssuesEvent
2015-03-11 16:46:34
chrsmith/html5rocks
https://api.github.com/repos/chrsmith/html5rocks
closed
Unwanted slide movement after trackpad swipe
Milestone-X Priority-Low Slides Type-Defect
Original [issue 37](https://code.google.com/p/html5rocks/issues/detail?id=37) created by chrsmith on 2010-06-24T18:13:51.000Z: Via Charles Jolley: I was on Chromium nightly (49408) on a Mac using a Magic Mouse. It also happens with my trackpad. All I have to do is side swipe slightly with two fingers and it scrolls forward and back a random distance. (At least I don't feel like it is controllable). The problem is that it is really easy to accidentally touch two fingers to the trackpad and then it sends me wooshing off to a new spot in the preso.
1.0
Unwanted slide movement after trackpad swipe - Original [issue 37](https://code.google.com/p/html5rocks/issues/detail?id=37) created by chrsmith on 2010-06-24T18:13:51.000Z: Via Charles Jolley: I was on Chromium nightly (49408) on a Mac using a Magic Mouse. It also happens with my trackpad. All I have to do is side swipe slightly with two fingers and it scrolls forward and back a random distance. (At least I don't feel like it is controllable). The problem is that it is really easy to accidentally touch two fingers to the trackpad and then it sends me wooshing off to a new spot in the preso.
defect
unwanted slide movement after trackpad swipe original created by chrsmith on via charles jolley i was on chromium nightly on a mac using a magic mouse it also happens with my trackpad all i have to do is side swipe slightly with two fingers and it scrolls forward and back a random distance at least i don t feel like it is controllable the problem is that it is really easy to accidentally touch two fingers to the trackpad and then it sends me wooshing off to a new spot in the preso
1
47,463
13,056,196,730
IssuesEvent
2020-07-30 03:57:34
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
closed
I3Module handling of default parameters that are wrapped c++ classes (Trac #594)
IceTray Migrated from Trac defect
if a module takes a python object parameter and supplies a default value that is a c++ class wrapped outside of the module's main library, icetray-inspect won't know to load the wrapper and will fail on a 'no to-python converter' error. I3Module should have more robust handling of default values to handle this. Migrated from https://code.icecube.wisc.edu/ticket/594 ```json { "status": "closed", "changetime": "2012-10-31T19:03:39", "description": "if a module takes a python object parameter and supplies a default value that is a c++ class wrapped outside of the module's main library, icetray-inspect won't know to load the wrapper and will fail on a 'no to-python converter' error. I3Module should have more robust handling of default values to handle this.\n\n", "reporter": "troy", "cc": "", "resolution": "fixed", "_ts": "1351710219000000", "component": "IceTray", "summary": "I3Module handling of default parameters that are wrapped c++ classes", "priority": "normal", "keywords": "", "time": "2010-02-05T18:47:58", "milestone": "", "owner": "troy", "type": "defect" } ```
1.0
I3Module handling of default parameters that are wrapped c++ classes (Trac #594) - if a module takes a python object parameter and supplies a default value that is a c++ class wrapped outside of the module's main library, icetray-inspect won't know to load the wrapper and will fail on a 'no to-python converter' error. I3Module should have more robust handling of default values to handle this. Migrated from https://code.icecube.wisc.edu/ticket/594 ```json { "status": "closed", "changetime": "2012-10-31T19:03:39", "description": "if a module takes a python object parameter and supplies a default value that is a c++ class wrapped outside of the module's main library, icetray-inspect won't know to load the wrapper and will fail on a 'no to-python converter' error. I3Module should have more robust handling of default values to handle this.\n\n", "reporter": "troy", "cc": "", "resolution": "fixed", "_ts": "1351710219000000", "component": "IceTray", "summary": "I3Module handling of default parameters that are wrapped c++ classes", "priority": "normal", "keywords": "", "time": "2010-02-05T18:47:58", "milestone": "", "owner": "troy", "type": "defect" } ```
defect
handling of default parameters that are wrapped c classes trac if a module takes a python object parameter and supplies a default value that is a c class wrapped outside of the module s main library icetray inspect won t know to load the wrapper and will fail on a no to python converter error should have more robust handling of default values to handle this migrated from json status closed changetime description if a module takes a python object parameter and supplies a default value that is a c class wrapped outside of the module s main library icetray inspect won t know to load the wrapper and will fail on a no to python converter error should have more robust handling of default values to handle this n n reporter troy cc resolution fixed ts component icetray summary handling of default parameters that are wrapped c classes priority normal keywords time milestone owner troy type defect
1
53,616
7,846,280,606
IssuesEvent
2018-06-19 15:05:16
pivotal-cf/docs-spring-cloud-services
https://api.github.com/repos/pivotal-cf/docs-spring-cloud-services
closed
Revise Config Server documentation to incorporate new OSS
documentation in progress
In Finchley, Spring Cloud OSS has: * new composite configuration * enhanced proxy support * `skipSslValidation` and `timeout` properties Revise SCS Config Server docs to be DRY, incorporating OSS and not repeating info.
1.0
Revise Config Server documentation to incorporate new OSS - In Finchley, Spring Cloud OSS has: * new composite configuration * enhanced proxy support * `skipSslValidation` and `timeout` properties Revise SCS Config Server docs to be DRY, incorporating OSS and not repeating info.
non_defect
revise config server documentation to incorporate new oss in finchley spring cloud oss has new composite configuration enhanced proxy support skipsslvalidation and timeout properties revise scs config server docs to be dry incorporating oss and not repeating info
0
2,162
4,295,562,343
IssuesEvent
2016-07-19 07:41:53
NAFITH/IraqWeb
https://api.github.com/repos/NAFITH/IraqWeb
opened
A Crash occurred after saving black list Entry (container)
Crash Major New Server(http://174.140.130.230) Open Services-Black List
§ Role • Nafith Operator § Prerequisite: • Container has been added to black list (north port ) (TCLU1841577) § Scenario • Login to the system • From the main menu, go to Black List • Click add • Fill the data of the container from the prerequisite to the South port § Bug description a crash occurred ![image](https://cloud.githubusercontent.com/assets/15855657/16941841/4c4eefc2-4d9b-11e6-88a4-0c5641e1e991.png)
1.0
A Crash occurred after saving black list Entry (container) - § Role • Nafith Operator § Prerequisite: • Container has been added to black list (north port ) (TCLU1841577) § Scenario • Login to the system • From the main menu, go to Black List • Click add • Fill the data of the container from the prerequisite to the South port § Bug description a crash occurred ![image](https://cloud.githubusercontent.com/assets/15855657/16941841/4c4eefc2-4d9b-11e6-88a4-0c5641e1e991.png)
non_defect
a crash occurred after saving black list entry container § role • nafith operator § prerequisite • container has been added to black list north port § scenario • login to the system • from the main menu go to black list • click add • fill the data of the container from the prerequisite to the south port § bug description a crash occurred
0
183,880
14,961,279,060
IssuesEvent
2021-01-27 07:29:11
matplotlib/matplotlib
https://api.github.com/repos/matplotlib/matplotlib
closed
Is it a better solution to acess one of the spines by class atrribute?
API: changes Documentation
hiding spines is very common in creating a new plot by matplotlib, and the flowing lines of code are widely used. ```python ax.spines['top'].set_visible(False) ax.spines['right'].set_visible(False) ``` They are a little comprehensive for new users to remember, and is easy to get wrong. Will it possible to access the spine by class attribute and make matplotlib more object-oriented, with this mirror change. ```python ax.spines.top.set_visible(False) ax.spines.right.set_visible(False) ``` or even, ```python ax.spines.top = None ax.spines.right = None ```
1.0
Is it a better solution to acess one of the spines by class atrribute? - hiding spines is very common in creating a new plot by matplotlib, and the flowing lines of code are widely used. ```python ax.spines['top'].set_visible(False) ax.spines['right'].set_visible(False) ``` They are a little comprehensive for new users to remember, and is easy to get wrong. Will it possible to access the spine by class attribute and make matplotlib more object-oriented, with this mirror change. ```python ax.spines.top.set_visible(False) ax.spines.right.set_visible(False) ``` or even, ```python ax.spines.top = None ax.spines.right = None ```
non_defect
is it a better solution to acess one of the spines by class atrribute hiding spines is very common in creating a new plot by matplotlib and the flowing lines of code are widely used python ax spines set visible false ax spines set visible false they are a little comprehensive for new users to remember and is easy to get wrong will it possible to access the spine by class attribute and make matplotlib more object oriented with this mirror change python ax spines top set visible false ax spines right set visible false or even python ax spines top none ax spines right none
0
189,685
6,800,733,095
IssuesEvent
2017-11-02 14:50:39
CS2103AUG2017-F11-B4/main
https://api.github.com/repos/CS2103AUG2017-F11-B4/main
closed
Add profile picture for contacts
priority.high status.ongoing type.enhancement
- Retrieve's a contact's Facebook profile picture to be used. If there is no Facebook link for the user, a default profile picture would be used. - To be added in V1.4
1.0
Add profile picture for contacts - - Retrieve's a contact's Facebook profile picture to be used. If there is no Facebook link for the user, a default profile picture would be used. - To be added in V1.4
non_defect
add profile picture for contacts retrieve s a contact s facebook profile picture to be used if there is no facebook link for the user a default profile picture would be used to be added in
0
338,403
30,295,836,783
IssuesEvent
2023-07-09 21:01:05
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
reopened
Fix general_functions.test_tensorflow_fill
TensorFlow Frontend Sub Task Failing Test
| | | |---|---| |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-failure-red></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a>
1.0
Fix general_functions.test_tensorflow_fill - | | | |---|---| |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-failure-red></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5501575189/jobs/10025258120"><img src=https://img.shields.io/badge/-success-success></a>
non_defect
fix general functions test tensorflow fill numpy a href src tensorflow a href src jax a href src torch a href src paddle a href src
0
31,110
6,423,854,858
IssuesEvent
2017-08-09 12:10:49
wooowooo/phpsocks5
https://api.github.com/repos/wooowooo/phpsocks5
closed
配置好了之后不能用
auto-migrated Priority-Medium Type-Defect
``` 附上本地log文件,服务器里面貌似是因为安全配置的原因,没有 生成log文件 ``` Original issue reported on code.google.com by `dingya...@gmail.com` on 15 Jul 2011 at 5:03 Attachments: - [socks5err.log](https://storage.googleapis.com/google-code-attachments/phpsocks5/issue-29/comment-0/socks5err.log)
1.0
配置好了之后不能用 - ``` 附上本地log文件,服务器里面貌似是因为安全配置的原因,没有 生成log文件 ``` Original issue reported on code.google.com by `dingya...@gmail.com` on 15 Jul 2011 at 5:03 Attachments: - [socks5err.log](https://storage.googleapis.com/google-code-attachments/phpsocks5/issue-29/comment-0/socks5err.log)
defect
配置好了之后不能用 附上本地log文件 服务器里面貌似是因为安全配置的原因 没有 生成log文件 original issue reported on code google com by dingya gmail com on jul at attachments
1
46,682
13,055,959,103
IssuesEvent
2020-07-30 03:14:18
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
opened
[docs] reactivate nightly builds of documentation (Trac #1709)
Incomplete Migration Migrated from Trac defect other
Migrated from https://code.icecube.wisc.edu/ticket/1709 ```json { "status": "closed", "changetime": "2019-02-13T14:12:58", "description": "hi,\n\ni noticed that the links of the nightly builds of documentation on\n\nhttp://software.icecube.wisc.edu/\n\nare not working. some people use this site and the links there, and it creates confusion that the documentation there is not up-to-date, while the link name claims it is.\n\ni would suggest to remove these links \"Nightly Documentation build (from trunk)\", and have only one link on this page to a nightly doc build from combo/trunk.", "reporter": "hdembinski", "cc": "olivas", "resolution": "wontfix", "_ts": "1550067178841456", "component": "other", "summary": "[docs] reactivate nightly builds of documentation", "priority": "normal", "keywords": "", "time": "2016-05-17T17:22:33", "milestone": "", "owner": "nega", "type": "defect" } ```
1.0
[docs] reactivate nightly builds of documentation (Trac #1709) - Migrated from https://code.icecube.wisc.edu/ticket/1709 ```json { "status": "closed", "changetime": "2019-02-13T14:12:58", "description": "hi,\n\ni noticed that the links of the nightly builds of documentation on\n\nhttp://software.icecube.wisc.edu/\n\nare not working. some people use this site and the links there, and it creates confusion that the documentation there is not up-to-date, while the link name claims it is.\n\ni would suggest to remove these links \"Nightly Documentation build (from trunk)\", and have only one link on this page to a nightly doc build from combo/trunk.", "reporter": "hdembinski", "cc": "olivas", "resolution": "wontfix", "_ts": "1550067178841456", "component": "other", "summary": "[docs] reactivate nightly builds of documentation", "priority": "normal", "keywords": "", "time": "2016-05-17T17:22:33", "milestone": "", "owner": "nega", "type": "defect" } ```
defect
reactivate nightly builds of documentation trac migrated from json status closed changetime description hi n ni noticed that the links of the nightly builds of documentation on n n not working some people use this site and the links there and it creates confusion that the documentation there is not up to date while the link name claims it is n ni would suggest to remove these links nightly documentation build from trunk and have only one link on this page to a nightly doc build from combo trunk reporter hdembinski cc olivas resolution wontfix ts component other summary reactivate nightly builds of documentation priority normal keywords time milestone owner nega type defect
1
206,993
23,414,571,791
IssuesEvent
2022-08-12 22:01:44
socialtables/saml-protocol
https://api.github.com/repos/socialtables/saml-protocol
closed
CVE-2022-24773 (Medium) detected in node-forge-0.6.38.tgz - autoclosed
security vulnerability
## CVE-2022-24773 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.6.38.tgz</b></p></summary> <p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.6.38.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.6.38.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/xml-encryption/node_modules/node-forge/package.json</p> <p> Dependency Hierarchy: - xml-encryption-0.9.0.tgz (Root Library) - :x: **node-forge-0.6.38.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/socialtables/saml-protocol/commit/73b34e5ab4dd4df4269ecbd89db52f145f10d284">73b34e5ab4dd4df4269ecbd89db52f145f10d284</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Forge (also called `node-forge`) is a native implementation of Transport Layer Security in JavaScript. Prior to version 1.3.0, RSA PKCS#1 v1.5 signature verification code does not properly check `DigestInfo` for a proper ASN.1 structure. This can lead to successful verification with signatures that contain invalid structures but a valid digest. The issue has been addressed in `node-forge` version 1.3.0. There are currently no known workarounds. <p>Publish Date: 2022-03-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24773>CVE-2022-24773</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24773">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24773</a></p> <p>Release Date: 2022-03-18</p> <p>Fix Resolution (node-forge): 1.3.0</p> <p>Direct dependency fix Resolution (xml-encryption): 2.0.0</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"xml-encryption","packageVersion":"0.9.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"xml-encryption:0.9.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-24773","vulnerabilityDetails":"Forge (also called `node-forge`) is a native implementation of Transport Layer Security in JavaScript. Prior to version 1.3.0, RSA PKCS#1 v1.5 signature verification code does not properly check `DigestInfo` for a proper ASN.1 structure. This can lead to successful verification with signatures that contain invalid structures but a valid digest. The issue has been addressed in `node-forge` version 1.3.0. There are currently no known workarounds.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24773","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2022-24773 (Medium) detected in node-forge-0.6.38.tgz - autoclosed - ## CVE-2022-24773 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.6.38.tgz</b></p></summary> <p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.6.38.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.6.38.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/xml-encryption/node_modules/node-forge/package.json</p> <p> Dependency Hierarchy: - xml-encryption-0.9.0.tgz (Root Library) - :x: **node-forge-0.6.38.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/socialtables/saml-protocol/commit/73b34e5ab4dd4df4269ecbd89db52f145f10d284">73b34e5ab4dd4df4269ecbd89db52f145f10d284</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Forge (also called `node-forge`) is a native implementation of Transport Layer Security in JavaScript. Prior to version 1.3.0, RSA PKCS#1 v1.5 signature verification code does not properly check `DigestInfo` for a proper ASN.1 structure. This can lead to successful verification with signatures that contain invalid structures but a valid digest. The issue has been addressed in `node-forge` version 1.3.0. There are currently no known workarounds. <p>Publish Date: 2022-03-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24773>CVE-2022-24773</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24773">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24773</a></p> <p>Release Date: 2022-03-18</p> <p>Fix Resolution (node-forge): 1.3.0</p> <p>Direct dependency fix Resolution (xml-encryption): 2.0.0</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"xml-encryption","packageVersion":"0.9.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"xml-encryption:0.9.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-24773","vulnerabilityDetails":"Forge (also called `node-forge`) is a native implementation of Transport Layer Security in JavaScript. Prior to version 1.3.0, RSA PKCS#1 v1.5 signature verification code does not properly check `DigestInfo` for a proper ASN.1 structure. This can lead to successful verification with signatures that contain invalid structures but a valid digest. The issue has been addressed in `node-forge` version 1.3.0. There are currently no known workarounds.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24773","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_defect
cve medium detected in node forge tgz autoclosed cve medium severity vulnerability vulnerable library node forge tgz javascript implementations of network transports cryptography ciphers pki message digests and various utilities library home page a href path to dependency file package json path to vulnerable library node modules xml encryption node modules node forge package json dependency hierarchy xml encryption tgz root library x node forge tgz vulnerable library found in head commit a href found in base branch master vulnerability details forge also called node forge is a native implementation of transport layer security in javascript prior to version rsa pkcs signature verification code does not properly check digestinfo for a proper asn structure this can lead to successful verification with signatures that contain invalid structures but a valid digest the issue has been addressed in node forge version there are currently no known workarounds publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution node forge direct dependency fix resolution xml encryption rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree xml encryption isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails forge also called node forge is a native implementation of transport layer security in javascript prior to version rsa pkcs signature verification code does not properly check digestinfo for a proper asn structure this can lead to successful verification with signatures that contain invalid structures but a valid digest the issue has been addressed in node forge version there are currently no known workarounds vulnerabilityurl
0
157,392
12,371,876,480
IssuesEvent
2020-05-18 19:21:15
edwino-stein/elrond-common
https://api.github.com/repos/edwino-stein/elrond-common
opened
Refactor the input driver API
libelrond libelrond-test
Like describe in issue #55 and #57 , I think that its can update to be better.
1.0
Refactor the input driver API - Like describe in issue #55 and #57 , I think that its can update to be better.
non_defect
refactor the input driver api like describe in issue and i think that its can update to be better
0
48,417
13,068,519,098
IssuesEvent
2020-07-31 03:50:11
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
closed
[MuonGun] Hidden overloaded virtual function in pybingins (Trac #2362)
Migrated from Trac combo simulation defect
Clang gives me this warning: ```text [297/1701] Building CXX object MuonGun/CMakeFiles/MuonGun-pybindings.dir/private/pybindings/EnergyDistribution.cxx.o /Users/kmeagher/icecube/combo/src/MuonGun/private/pybindings/EnergyDistribution.cxx:22:17: warning: 'I3MuonGun::PyEnergyDistribution::GetLog' hides overloaded virtual function [-Woverloaded-virtual] virtual double GetLog(double depth, double cos_theta, ^ /Users/kmeagher/icecube/combo/src/MuonGun/public/MuonGun/EnergyDistribution.h:50:17: note: hidden overloaded virtual function 'I3MuonGun::EnergyDistribution::GetLog' declared here: type mismatch at 5th parameter ('I3MuonGun::EnergyDistribution::log_value' vs 'double') virtual double GetLog(double depth, double cos_theta, ^ 1 warning generated. ``` The problem is the 5th parameter which is of type `log_value` in the header but type `double` in the pybindings file Migrated from https://code.icecube.wisc.edu/ticket/2362 ```json { "status": "closed", "changetime": "2020-06-24T12:31:42", "description": "Clang gives me this warning:\n{{{\n[297/1701] Building CXX object MuonGun/CMakeFiles/MuonGun-pybindings.dir/private/pybindings/EnergyDistribution.cxx.o\n/Users/kmeagher/icecube/combo/src/MuonGun/private/pybindings/EnergyDistribution.cxx:22:17: warning: 'I3MuonGun::PyEnergyDistribution::GetLog' hides overloaded virtual function [-Woverloaded-virtual]\n virtual double GetLog(double depth, double cos_theta,\n ^\n/Users/kmeagher/icecube/combo/src/MuonGun/public/MuonGun/EnergyDistribution.h:50:17: note: hidden overloaded virtual function 'I3MuonGun::EnergyDistribution::GetLog' declared here: type mismatch at 5th parameter ('I3MuonGun::EnergyDistribution::log_value' vs 'double')\n virtual double GetLog(double depth, double cos_theta,\n ^\n1 warning generated.\n\n}}}\nThe problem is the 5th parameter which is of type `log_value` in the header but type `double` in the pybindings file ", "reporter": "kjmeagher", "cc": "", "resolution": "fixed", "_ts": "1593001902142004", "component": "combo simulation", "summary": "[MuonGun] Hidden overloaded virtual function in pybingins", "priority": "normal", "keywords": "", "time": "2019-10-04T16:16:57", "milestone": "Autumnal Equinox 2020", "owner": "jvansanten", "type": "defect" } ```
1.0
[MuonGun] Hidden overloaded virtual function in pybingins (Trac #2362) - Clang gives me this warning: ```text [297/1701] Building CXX object MuonGun/CMakeFiles/MuonGun-pybindings.dir/private/pybindings/EnergyDistribution.cxx.o /Users/kmeagher/icecube/combo/src/MuonGun/private/pybindings/EnergyDistribution.cxx:22:17: warning: 'I3MuonGun::PyEnergyDistribution::GetLog' hides overloaded virtual function [-Woverloaded-virtual] virtual double GetLog(double depth, double cos_theta, ^ /Users/kmeagher/icecube/combo/src/MuonGun/public/MuonGun/EnergyDistribution.h:50:17: note: hidden overloaded virtual function 'I3MuonGun::EnergyDistribution::GetLog' declared here: type mismatch at 5th parameter ('I3MuonGun::EnergyDistribution::log_value' vs 'double') virtual double GetLog(double depth, double cos_theta, ^ 1 warning generated. ``` The problem is the 5th parameter which is of type `log_value` in the header but type `double` in the pybindings file Migrated from https://code.icecube.wisc.edu/ticket/2362 ```json { "status": "closed", "changetime": "2020-06-24T12:31:42", "description": "Clang gives me this warning:\n{{{\n[297/1701] Building CXX object MuonGun/CMakeFiles/MuonGun-pybindings.dir/private/pybindings/EnergyDistribution.cxx.o\n/Users/kmeagher/icecube/combo/src/MuonGun/private/pybindings/EnergyDistribution.cxx:22:17: warning: 'I3MuonGun::PyEnergyDistribution::GetLog' hides overloaded virtual function [-Woverloaded-virtual]\n virtual double GetLog(double depth, double cos_theta,\n ^\n/Users/kmeagher/icecube/combo/src/MuonGun/public/MuonGun/EnergyDistribution.h:50:17: note: hidden overloaded virtual function 'I3MuonGun::EnergyDistribution::GetLog' declared here: type mismatch at 5th parameter ('I3MuonGun::EnergyDistribution::log_value' vs 'double')\n virtual double GetLog(double depth, double cos_theta,\n ^\n1 warning generated.\n\n}}}\nThe problem is the 5th parameter which is of type `log_value` in the header but type `double` in the pybindings file ", "reporter": "kjmeagher", "cc": "", "resolution": "fixed", "_ts": "1593001902142004", "component": "combo simulation", "summary": "[MuonGun] Hidden overloaded virtual function in pybingins", "priority": "normal", "keywords": "", "time": "2019-10-04T16:16:57", "milestone": "Autumnal Equinox 2020", "owner": "jvansanten", "type": "defect" } ```
defect
hidden overloaded virtual function in pybingins trac clang gives me this warning text building cxx object muongun cmakefiles muongun pybindings dir private pybindings energydistribution cxx o users kmeagher icecube combo src muongun private pybindings energydistribution cxx warning pyenergydistribution getlog hides overloaded virtual function virtual double getlog double depth double cos theta users kmeagher icecube combo src muongun public muongun energydistribution h note hidden overloaded virtual function energydistribution getlog declared here type mismatch at parameter energydistribution log value vs double virtual double getlog double depth double cos theta warning generated the problem is the parameter which is of type log value in the header but type double in the pybindings file migrated from json status closed changetime description clang gives me this warning n n building cxx object muongun cmakefiles muongun pybindings dir private pybindings energydistribution cxx o n users kmeagher icecube combo src muongun private pybindings energydistribution cxx warning pyenergydistribution getlog hides overloaded virtual function n virtual double getlog double depth double cos theta n n users kmeagher icecube combo src muongun public muongun energydistribution h note hidden overloaded virtual function energydistribution getlog declared here type mismatch at parameter energydistribution log value vs double n virtual double getlog double depth double cos theta n warning generated n n nthe problem is the parameter which is of type log value in the header but type double in the pybindings file reporter kjmeagher cc resolution fixed ts component combo simulation summary hidden overloaded virtual function in pybingins priority normal keywords time milestone autumnal equinox owner jvansanten type defect
1
33,086
7,028,669,250
IssuesEvent
2017-12-25 12:32:56
primefaces/primeng
https://api.github.com/repos/primefaces/primeng
closed
Cannot use resizableColumns together with dynamic visibility
defect
### There is no guarantee in receiving a response in GitHub Issue Tracker, If you'd like to secure our response, you may consider *PrimeNG PRO Support* where support is provided within 4 business hours **I'm submitting a ...** (check one with "x") ``` [x] bug report => Search github for a similar issue or PR before submitting [ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap [ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35 ``` **Current behavior** Affected component dataTable Setup `p-dataTable` with the attribute `resizableColumns="true"` containing n `p-column`s with one way data-binding on the attribute `hidden`. Example: ``` <p-dataTable [value]="cars" resizableColumns="true"> <p-column field="vin" header="Vin"></p-column> <p-column field="year" header="Year" [hidden]="toggleableVariable"></p-column> <p-column field="brand" header="Brand"></p-column> <p-column field="color" header="Color"></p-column> </p-dataTable> ``` If hidden is changed after the initalisation of p-dataTable, resizing starts to behave ~weird~. The resize preview does not fit the actual modification on the column width. Fix within the method `onColumnResizeEnd`: ``` let nextColumn = this.resizeColumn.nextElementSibling; //This must be the next visible Column though. while(nextColumn.classList.contains("ui-helper-hidden")){ nextColumn = nextColumn.nextElementSibling; } ``` **Expected behavior** Column should still be resized correctly. **Minimal reproduction of the problem with instructions** See above **What is the motivation / use case for changing the behavior?** Allowing to join column visibility with the resizeability feature. **Please tell us about your environment:** * **Angular version: 5 * **PrimeNG version: recent * **Browser:** [Chrome | Firefox | Edge ] <!-- All browsers where this could be reproduced --> * **Language:** [all | TypeScript X.X | ES6/7 | ES5]
1.0
Cannot use resizableColumns together with dynamic visibility - ### There is no guarantee in receiving a response in GitHub Issue Tracker, If you'd like to secure our response, you may consider *PrimeNG PRO Support* where support is provided within 4 business hours **I'm submitting a ...** (check one with "x") ``` [x] bug report => Search github for a similar issue or PR before submitting [ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap [ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35 ``` **Current behavior** Affected component dataTable Setup `p-dataTable` with the attribute `resizableColumns="true"` containing n `p-column`s with one way data-binding on the attribute `hidden`. Example: ``` <p-dataTable [value]="cars" resizableColumns="true"> <p-column field="vin" header="Vin"></p-column> <p-column field="year" header="Year" [hidden]="toggleableVariable"></p-column> <p-column field="brand" header="Brand"></p-column> <p-column field="color" header="Color"></p-column> </p-dataTable> ``` If hidden is changed after the initalisation of p-dataTable, resizing starts to behave ~weird~. The resize preview does not fit the actual modification on the column width. Fix within the method `onColumnResizeEnd`: ``` let nextColumn = this.resizeColumn.nextElementSibling; //This must be the next visible Column though. while(nextColumn.classList.contains("ui-helper-hidden")){ nextColumn = nextColumn.nextElementSibling; } ``` **Expected behavior** Column should still be resized correctly. **Minimal reproduction of the problem with instructions** See above **What is the motivation / use case for changing the behavior?** Allowing to join column visibility with the resizeability feature. **Please tell us about your environment:** * **Angular version: 5 * **PrimeNG version: recent * **Browser:** [Chrome | Firefox | Edge ] <!-- All browsers where this could be reproduced --> * **Language:** [all | TypeScript X.X | ES6/7 | ES5]
defect
cannot use resizablecolumns together with dynamic visibility there is no guarantee in receiving a response in github issue tracker if you d like to secure our response you may consider primeng pro support where support is provided within business hours i m submitting a check one with x bug report search github for a similar issue or pr before submitting feature request please check if request is not on the roadmap already support request please do not submit support request here instead see current behavior affected component datatable setup p datatable with the attribute resizablecolumns true containing n p column s with one way data binding on the attribute hidden example if hidden is changed after the initalisation of p datatable resizing starts to behave weird the resize preview does not fit the actual modification on the column width fix within the method oncolumnresizeend let nextcolumn this resizecolumn nextelementsibling this must be the next visible column though while nextcolumn classlist contains ui helper hidden nextcolumn nextcolumn nextelementsibling expected behavior column should still be resized correctly minimal reproduction of the problem with instructions see above what is the motivation use case for changing the behavior allowing to join column visibility with the resizeability feature please tell us about your environment angular version primeng version recent browser language
1
185,055
21,785,058,342
IssuesEvent
2022-05-14 02:19:39
Yash-Handa/GitHub-Org-Geographics
https://api.github.com/repos/Yash-Handa/GitHub-Org-Geographics
closed
CVE-2021-23343 (High) detected in path-parse-1.0.6.tgz - autoclosed
security vulnerability
## CVE-2021-23343 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>path-parse-1.0.6.tgz</b></p></summary> <p>Node.js path.parse() ponyfill</p> <p>Library home page: <a href="https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz">https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz</a></p> <p>Path to dependency file: /functions/package.json</p> <p>Path to vulnerable library: /functions/node_modules/path-parse/package.json,/node_modules/path-parse/package.json</p> <p> Dependency Hierarchy: - tslint-5.11.0.tgz (Root Library) - resolve-1.10.0.tgz - :x: **path-parse-1.0.6.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Yash-Handa/GitHub-Org-Geographics/commit/8541f31e4773acce1892733eceadb747a827e40f">8541f31e4773acce1892733eceadb747a827e40f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> All versions of package path-parse are vulnerable to Regular Expression Denial of Service (ReDoS) via splitDeviceRe, splitTailRe, and splitPathRe regular expressions. ReDoS exhibits polynomial worst-case time complexity. <p>Publish Date: 2021-05-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23343>CVE-2021-23343</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/jbgutierrez/path-parse/issues/8">https://github.com/jbgutierrez/path-parse/issues/8</a></p> <p>Release Date: 2021-05-04</p> <p>Fix Resolution (path-parse): 1.0.7</p> <p>Direct dependency fix Resolution (tslint): 5.12.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-23343 (High) detected in path-parse-1.0.6.tgz - autoclosed - ## CVE-2021-23343 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>path-parse-1.0.6.tgz</b></p></summary> <p>Node.js path.parse() ponyfill</p> <p>Library home page: <a href="https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz">https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz</a></p> <p>Path to dependency file: /functions/package.json</p> <p>Path to vulnerable library: /functions/node_modules/path-parse/package.json,/node_modules/path-parse/package.json</p> <p> Dependency Hierarchy: - tslint-5.11.0.tgz (Root Library) - resolve-1.10.0.tgz - :x: **path-parse-1.0.6.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Yash-Handa/GitHub-Org-Geographics/commit/8541f31e4773acce1892733eceadb747a827e40f">8541f31e4773acce1892733eceadb747a827e40f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> All versions of package path-parse are vulnerable to Regular Expression Denial of Service (ReDoS) via splitDeviceRe, splitTailRe, and splitPathRe regular expressions. ReDoS exhibits polynomial worst-case time complexity. <p>Publish Date: 2021-05-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23343>CVE-2021-23343</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/jbgutierrez/path-parse/issues/8">https://github.com/jbgutierrez/path-parse/issues/8</a></p> <p>Release Date: 2021-05-04</p> <p>Fix Resolution (path-parse): 1.0.7</p> <p>Direct dependency fix Resolution (tslint): 5.12.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_defect
cve high detected in path parse tgz autoclosed cve high severity vulnerability vulnerable library path parse tgz node js path parse ponyfill library home page a href path to dependency file functions package json path to vulnerable library functions node modules path parse package json node modules path parse package json dependency hierarchy tslint tgz root library resolve tgz x path parse tgz vulnerable library found in head commit a href vulnerability details all versions of package path parse are vulnerable to regular expression denial of service redos via splitdevicere splittailre and splitpathre regular expressions redos exhibits polynomial worst case time complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution path parse direct dependency fix resolution tslint step up your open source security game with whitesource
0
10,709
2,622,181,538
IssuesEvent
2015-03-04 00:19:02
byzhang/leveldb
https://api.github.com/repos/byzhang/leveldb
closed
Size() would be handy to get number of records.
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. 2. 3. What is the expected output? What do you see instead? What version of the product are you using? On what operating system? 1.5 Please provide any additional information below. ``` Original issue reported on code.google.com by `miroslav...@gmail.com` on 10 Jul 2013 at 3:12
1.0
Size() would be handy to get number of records. - ``` What steps will reproduce the problem? 1. 2. 3. What is the expected output? What do you see instead? What version of the product are you using? On what operating system? 1.5 Please provide any additional information below. ``` Original issue reported on code.google.com by `miroslav...@gmail.com` on 10 Jul 2013 at 3:12
defect
size would be handy to get number of records what steps will reproduce the problem what is the expected output what do you see instead what version of the product are you using on what operating system please provide any additional information below original issue reported on code google com by miroslav gmail com on jul at
1
146,191
19,393,937,662
IssuesEvent
2021-12-18 01:37:06
mgh3326/google-calendar-slackbot
https://api.github.com/repos/mgh3326/google-calendar-slackbot
opened
CVE-2021-42550 (Medium) detected in logback-classic-1.2.3.jar
security vulnerability
## CVE-2021-42550 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>logback-classic-1.2.3.jar</b></p></summary> <p>logback-classic module</p> <p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: google-calendar-slackbot/build.gradle</p> <p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.2.3/7c4f3c474fb2c041d8028740440937705ebb473a/logback-classic-1.2.3.jar</p> <p> Dependency Hierarchy: - spring-boot-devtools-2.3.1.RELEASE (Root Library) - spring-boot-dependencies-2.3.1.RELEASE - :x: **logback-classic-1.2.3.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers. <p>Publish Date: 2021-12-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550>CVE-2021-42550</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://logback.qos.ch/news.html">http://logback.qos.ch/news.html</a></p> <p>Release Date: 2021-12-16</p> <p>Fix Resolution: ch.qos.logback:logback-classic:1.2.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-42550 (Medium) detected in logback-classic-1.2.3.jar - ## CVE-2021-42550 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>logback-classic-1.2.3.jar</b></p></summary> <p>logback-classic module</p> <p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: google-calendar-slackbot/build.gradle</p> <p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.2.3/7c4f3c474fb2c041d8028740440937705ebb473a/logback-classic-1.2.3.jar</p> <p> Dependency Hierarchy: - spring-boot-devtools-2.3.1.RELEASE (Root Library) - spring-boot-dependencies-2.3.1.RELEASE - :x: **logback-classic-1.2.3.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers. <p>Publish Date: 2021-12-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550>CVE-2021-42550</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://logback.qos.ch/news.html">http://logback.qos.ch/news.html</a></p> <p>Release Date: 2021-12-16</p> <p>Fix Resolution: ch.qos.logback:logback-classic:1.2.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_defect
cve medium detected in logback classic jar cve medium severity vulnerability vulnerable library logback classic jar logback classic module library home page a href path to dependency file google calendar slackbot build gradle path to vulnerable library root gradle caches modules files ch qos logback logback classic logback classic jar dependency hierarchy spring boot devtools release root library spring boot dependencies release x logback classic jar vulnerable library vulnerability details in logback version and prior versions an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from ldap servers publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ch qos logback logback classic step up your open source security game with whitesource
0
52,569
13,224,839,997
IssuesEvent
2020-08-17 19:57:25
icecube-trac/tix4
https://api.github.com/repos/icecube-trac/tix4
opened
[icetray] logging issue in icetray module destructors (Trac #2437)
Incomplete Migration Migrated from Trac combo core defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2437">https://code.icecube.wisc.edu/projects/icecube/ticket/2437</a>, reported by lfischerand owned by olivas</em></summary> <p> ```json { "status": "accepted", "changetime": "2020-08-11T14:56:33", "_ts": "1597157793457072", "description": "I found an issue with the logging messages being called in the destructor of icetray modules. I've found the issue in I3Gulliver, but it also happens for I3Module. For some reason the log settings (like level/logtofile) i set do not apply for the logs being generated in the destructor of modules. I assume this is because the destructor is called outside of the scope where everything else was run, so the settings are the default settings, but i could be wrong. I attached a python script as well as a gzd and a i3 file to run it with. I ran it with the oscnext-meta project (http://code.icecube.wisc.edu/svn/sandbox/stuttard/oscNext_meta/trunk) at revision 180302, but i don't think this would be any different in combo. Changing the log settings in line 25-32 will not affect the info log message being called in the I3Gulliver destructor.", "reporter": "lfischer", "cc": "", "resolution": "", "time": "2020-08-11T09:21:58", "component": "combo core", "summary": "[icetray] logging issue in icetray module destructors", "priority": "normal", "keywords": "logging, destructor", "milestone": "Autumnal Equinox 2020", "owner": "olivas", "type": "defect" } ``` </p> </details>
1.0
[icetray] logging issue in icetray module destructors (Trac #2437) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2437">https://code.icecube.wisc.edu/projects/icecube/ticket/2437</a>, reported by lfischerand owned by olivas</em></summary> <p> ```json { "status": "accepted", "changetime": "2020-08-11T14:56:33", "_ts": "1597157793457072", "description": "I found an issue with the logging messages being called in the destructor of icetray modules. I've found the issue in I3Gulliver, but it also happens for I3Module. For some reason the log settings (like level/logtofile) i set do not apply for the logs being generated in the destructor of modules. I assume this is because the destructor is called outside of the scope where everything else was run, so the settings are the default settings, but i could be wrong. I attached a python script as well as a gzd and a i3 file to run it with. I ran it with the oscnext-meta project (http://code.icecube.wisc.edu/svn/sandbox/stuttard/oscNext_meta/trunk) at revision 180302, but i don't think this would be any different in combo. Changing the log settings in line 25-32 will not affect the info log message being called in the I3Gulliver destructor.", "reporter": "lfischer", "cc": "", "resolution": "", "time": "2020-08-11T09:21:58", "component": "combo core", "summary": "[icetray] logging issue in icetray module destructors", "priority": "normal", "keywords": "logging, destructor", "milestone": "Autumnal Equinox 2020", "owner": "olivas", "type": "defect" } ``` </p> </details>
defect
logging issue in icetray module destructors trac migrated from json status accepted changetime ts description i found an issue with the logging messages being called in the destructor of icetray modules i ve found the issue in but it also happens for for some reason the log settings like level logtofile i set do not apply for the logs being generated in the destructor of modules i assume this is because the destructor is called outside of the scope where everything else was run so the settings are the default settings but i could be wrong i attached a python script as well as a gzd and a file to run it with i ran it with the oscnext meta project at revision but i don t think this would be any different in combo changing the log settings in line will not affect the info log message being called in the destructor reporter lfischer cc resolution time component combo core summary logging issue in icetray module destructors priority normal keywords logging destructor milestone autumnal equinox owner olivas type defect
1
18,853
3,089,697,576
IssuesEvent
2015-08-25 23:05:30
google/googletest
https://api.github.com/repos/google/googletest
opened
complie 1.7 with vs 2010 fail
auto-migrated Priority-Medium Type-Defect
_From @GoogleCodeExporter on August 24, 2015 22:40_ ``` when i complie gmock-1.7.0-rc1.zip with visual studio 2010,i get the error info: 1>------ Build started: Project: gmock, Configuration: Debug Win32 ------ 1> gmock.vcxproj -> E:\gmock-1.7.0\msvc\2010\Debug\gmock.lib 2>------ Build started: Project: gmock_main, Configuration: Debug Win32 ------ 2> gmock_main.vcxproj -> E:\gmock-1.7.0\msvc\2010\Debug\gmock_main.lib 3>------ Build started: Project: gmock_test, Configuration: Debug Win32 ------ 3> gmock_all_test.cc 3>E:\gmock-1.7.0\test/gmock-generated-function-mockers_test.cc(134): warning C4373: 'testing::gmock_generated_function_mockers_test::MockFoo::TakesConst': virtual function overrides 'testing::gmock_generated_function_mockers_test::FooInterface::TakesConst', previous versions of the compiler did not override when parameters only differed by const/volatile qualifiers 3> E:\gmock-1.7.0\test/gmock-generated-function-mockers_test.cc(91) : see declaration of 'testing::gmock_generated_function_mockers_test::FooInterface::TakesConst' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2825): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2830): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2831): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2840): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): error C2589: '(' : illegal token on right side of '::' 3> E:\gmock-1.7.0\test/gmock-matchers_test.cc(2794) : while compiling class template member function 'void testing::gmock_matchers_test::FloatingPointNearTest<RawType>::TestNearMatches(te sting::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType))' 3> with 3> [ 3> RawType=float, 3> FloatType=float 3> ] 3> E:\gmock-1.7.0\test/gmock-matchers_test.cc(2932) : see reference to class template instantiation 'testing::gmock_matchers_test::FloatingPointNearTest<RawType>' being compiled 3> with 3> [ 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): error C2198: 'testing::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType)' : too few arguments for call 3> with 3> [ 3> FloatType=float, 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2825): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2825): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2830): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2830): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2831): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2198: 'testing::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType)' : too few arguments for call 3> with 3> [ 3> FloatType=float, 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2840): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2840): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2198: 'testing::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType)' : too few arguments for call 3> with 3> [ 3> FloatType=float, 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2228: left of '.c_str' must have class/struct/union ========== Build: 2 succeeded, 1 failed, 0 up-to-date, 0 skipped ========== anyone can tell me how to fix it? sorry for my english is not my mother language. ``` Original issue reported on code.google.com by `prop...@gmail.com` on 18 Sep 2013 at 8:25 _Copied from original issue: google/googlemock#164_
1.0
complie 1.7 with vs 2010 fail - _From @GoogleCodeExporter on August 24, 2015 22:40_ ``` when i complie gmock-1.7.0-rc1.zip with visual studio 2010,i get the error info: 1>------ Build started: Project: gmock, Configuration: Debug Win32 ------ 1> gmock.vcxproj -> E:\gmock-1.7.0\msvc\2010\Debug\gmock.lib 2>------ Build started: Project: gmock_main, Configuration: Debug Win32 ------ 2> gmock_main.vcxproj -> E:\gmock-1.7.0\msvc\2010\Debug\gmock_main.lib 3>------ Build started: Project: gmock_test, Configuration: Debug Win32 ------ 3> gmock_all_test.cc 3>E:\gmock-1.7.0\test/gmock-generated-function-mockers_test.cc(134): warning C4373: 'testing::gmock_generated_function_mockers_test::MockFoo::TakesConst': virtual function overrides 'testing::gmock_generated_function_mockers_test::FooInterface::TakesConst', previous versions of the compiler did not override when parameters only differed by const/volatile qualifiers 3> E:\gmock-1.7.0\test/gmock-generated-function-mockers_test.cc(91) : see declaration of 'testing::gmock_generated_function_mockers_test::FooInterface::TakesConst' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2825): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2830): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2831): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2840): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): warning C4003: not enough actual parameters for macro 'max' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): error C2589: '(' : illegal token on right side of '::' 3> E:\gmock-1.7.0\test/gmock-matchers_test.cc(2794) : while compiling class template member function 'void testing::gmock_matchers_test::FloatingPointNearTest<RawType>::TestNearMatches(te sting::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType))' 3> with 3> [ 3> RawType=float, 3> FloatType=float 3> ] 3> E:\gmock-1.7.0\test/gmock-matchers_test.cc(2932) : see reference to class template instantiation 'testing::gmock_matchers_test::FloatingPointNearTest<RawType>' being compiled 3> with 3> [ 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2824): error C2198: 'testing::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType)' : too few arguments for call 3> with 3> [ 3> FloatType=float, 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2825): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2825): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2826): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2827): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2830): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2830): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2831): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2832): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2833): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2198: 'testing::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType)' : too few arguments for call 3> with 3> [ 3> FloatType=float, 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2835): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2836): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2837): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2840): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2840): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2660: 'testing::internal::MatcherBase<T>::Matches' : function does not take 0 arguments 3> with 3> [ 3> T=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2841): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2842): error C2228: left of '.c_str' must have class/struct/union 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2143: syntax error : missing ')' before '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2198: 'testing::internal::FloatingEqMatcher<FloatType> (__cdecl *)(RawType,RawType)' : too few arguments for call 3> with 3> [ 3> FloatType=float, 3> RawType=float 3> ] 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2847): error C2059: syntax error : ')' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2589: '(' : illegal token on right side of '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2059: syntax error : '::' 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2181: illegal else without matching if 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2065: 'gtest_ar_' : undeclared identifier 3>E:\gmock-1.7.0\test/gmock-matchers_test.cc(2848): error C2228: left of '.c_str' must have class/struct/union ========== Build: 2 succeeded, 1 failed, 0 up-to-date, 0 skipped ========== anyone can tell me how to fix it? sorry for my english is not my mother language. ``` Original issue reported on code.google.com by `prop...@gmail.com` on 18 Sep 2013 at 8:25 _Copied from original issue: google/googlemock#164_
defect
complie with vs fail from googlecodeexporter on august when i complie gmock zip with visual studio i get the error info build started project gmock configuration debug gmock vcxproj e gmock msvc debug gmock lib build started project gmock main configuration debug gmock main vcxproj e gmock msvc debug gmock main lib build started project gmock test configuration debug gmock all test cc e gmock test gmock generated function mockers test cc warning testing gmock generated function mockers test mockfoo takesconst virtual function overrides testing gmock generated function mockers test foointerface takesconst previous versions of the compiler did not override when parameters only differed by const volatile qualifiers e gmock test gmock generated function mockers test cc see declaration of testing gmock generated function mockers test foointerface takesconst e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc warning not enough actual parameters for macro max e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc while compiling class template member function void testing gmock matchers test floatingpointneartest testnearmatches te sting internal floatingeqmatcher cdecl rawtype rawtype with rawtype float floattype float e gmock test gmock matchers test cc see reference to class template instantiation testing gmock matchers test floatingpointneartest being compiled with rawtype float e gmock test gmock matchers test cc error syntax error missing before e gmock test gmock matchers test cc error testing internal floatingeqmatcher cdecl rawtype rawtype too few arguments for call with floattype float rawtype float e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error missing before e gmock test gmock matchers test cc error testing internal matcherbase matches function does not take arguments with t float e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error missing before e gmock test gmock matchers test cc error testing internal matcherbase matches function does not take arguments with t float e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error missing before e gmock test gmock matchers test cc error testing internal floatingeqmatcher cdecl rawtype rawtype too few arguments for call with floattype float rawtype float e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error missing before e gmock test gmock matchers test cc error testing internal matcherbase matches function does not take arguments with t float e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error missing before e gmock test gmock matchers test cc error testing internal matcherbase matches function does not take arguments with t float e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error missing before e gmock test gmock matchers test cc error testing internal floatingeqmatcher cdecl rawtype rawtype too few arguments for call with floattype float rawtype float e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal token on right side of e gmock test gmock matchers test cc error syntax error e gmock test gmock matchers test cc error illegal else without matching if e gmock test gmock matchers test cc error gtest ar undeclared identifier e gmock test gmock matchers test cc error left of c str must have class struct union build succeeded failed up to date skipped anyone can tell me how to fix it sorry for my english is not my mother language original issue reported on code google com by prop gmail com on sep at copied from original issue google googlemock
1
176,918
21,464,429,935
IssuesEvent
2022-04-26 01:08:26
faizulho/sgmelayu-sanity-gatsby-blog
https://api.github.com/repos/faizulho/sgmelayu-sanity-gatsby-blog
opened
CVE-2022-1365 (Medium) detected in cross-fetch-2.2.2.tgz
security vulnerability
## CVE-2022-1365 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cross-fetch-2.2.2.tgz</b></p></summary> <p>Universal WHATWG Fetch API for Node, Browsers and React Native</p> <p>Library home page: <a href="https://registry.npmjs.org/cross-fetch/-/cross-fetch-2.2.2.tgz">https://registry.npmjs.org/cross-fetch/-/cross-fetch-2.2.2.tgz</a></p> <p>Path to dependency file: /web/package.json</p> <p>Path to vulnerable library: /web/node_modules/cross-fetch/package.json</p> <p> Dependency Hierarchy: - gatsby-2.23.9.tgz (Root Library) - eslint-plugin-graphql-3.1.1.tgz - graphql-config-2.2.2.tgz - graphql-request-1.8.2.tgz - :x: **cross-fetch-2.2.2.tgz** (Vulnerable Library) <p>Found in base branch: <b>production</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Exposure of Private Personal Information to an Unauthorized Actor in GitHub repository lquixada/cross-fetch prior to 3.1.5. <p>Publish Date: 2022-04-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-1365>CVE-2022-1365</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-1365">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-1365</a></p> <p>Release Date: 2022-04-15</p> <p>Fix Resolution (cross-fetch): 2.2.6</p> <p>Direct dependency fix Resolution (gatsby): 2.24.16-ink3.22</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-1365 (Medium) detected in cross-fetch-2.2.2.tgz - ## CVE-2022-1365 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cross-fetch-2.2.2.tgz</b></p></summary> <p>Universal WHATWG Fetch API for Node, Browsers and React Native</p> <p>Library home page: <a href="https://registry.npmjs.org/cross-fetch/-/cross-fetch-2.2.2.tgz">https://registry.npmjs.org/cross-fetch/-/cross-fetch-2.2.2.tgz</a></p> <p>Path to dependency file: /web/package.json</p> <p>Path to vulnerable library: /web/node_modules/cross-fetch/package.json</p> <p> Dependency Hierarchy: - gatsby-2.23.9.tgz (Root Library) - eslint-plugin-graphql-3.1.1.tgz - graphql-config-2.2.2.tgz - graphql-request-1.8.2.tgz - :x: **cross-fetch-2.2.2.tgz** (Vulnerable Library) <p>Found in base branch: <b>production</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Exposure of Private Personal Information to an Unauthorized Actor in GitHub repository lquixada/cross-fetch prior to 3.1.5. <p>Publish Date: 2022-04-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-1365>CVE-2022-1365</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-1365">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-1365</a></p> <p>Release Date: 2022-04-15</p> <p>Fix Resolution (cross-fetch): 2.2.6</p> <p>Direct dependency fix Resolution (gatsby): 2.24.16-ink3.22</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_defect
cve medium detected in cross fetch tgz cve medium severity vulnerability vulnerable library cross fetch tgz universal whatwg fetch api for node browsers and react native library home page a href path to dependency file web package json path to vulnerable library web node modules cross fetch package json dependency hierarchy gatsby tgz root library eslint plugin graphql tgz graphql config tgz graphql request tgz x cross fetch tgz vulnerable library found in base branch production vulnerability details exposure of private personal information to an unauthorized actor in github repository lquixada cross fetch prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution cross fetch direct dependency fix resolution gatsby step up your open source security game with whitesource
0
157,711
24,710,685,718
IssuesEvent
2022-10-20 00:14:11
MicrosoftDocs/cpp-docs
https://api.github.com/repos/MicrosoftDocs/cpp-docs
closed
Visual C++ Redistributable release notes
product-question support-request Closed - by design
Hi. I'm having difficulty locating any release notes for the multiple versions of Visual C++ redistributable. Specifically, I'm looking for release dates and if the patch is security related or not. Can you please assist?
1.0
Visual C++ Redistributable release notes - Hi. I'm having difficulty locating any release notes for the multiple versions of Visual C++ redistributable. Specifically, I'm looking for release dates and if the patch is security related or not. Can you please assist?
non_defect
visual c redistributable release notes hi i m having difficulty locating any release notes for the multiple versions of visual c redistributable specifically i m looking for release dates and if the patch is security related or not can you please assist
0
438,418
12,627,499,455
IssuesEvent
2020-06-14 21:50:03
siodb/siodb
https://api.github.com/repos/siodb/siodb
closed
Build fails on the CentOS 7
component:iomgr priority:major
First issue: ``` /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In function 'void siodb::crypto::{anonymous}::checkRsaLength(const RSA*)': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:136:5: error: 'RSA_get0_key' was not declared in this scope RSA_get0_key(rsa, &rsaN, nullptr, nullptr); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:136:5: note: suggested alternative: 'RSA_check_key' RSA_get0_key(rsa, &rsaN, nullptr, nullptr); ^~~~~~~~~~~~ RSA_check_key /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In member function 'void siodb::crypto::DigitalSignatureKey::parseOpenSslKey(siodb::utils::StringScanner&)': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:263:26: error: 'EVP_PKEY_get0_RSA' was not declared in this scope const auto rsa = EVP_PKEY_get0_RSA(*key); ^~~~~~~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:263:26: note: suggested alternative: 'EVP_PKEY_get1_RSA' const auto rsa = EVP_PKEY_get0_RSA(*key); ^~~~~~~~~~~~~~~~~ EVP_PKEY_get1_RSA /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In member function 'void siodb::crypto::DigitalSignatureKey::parseOpenSshRsaPublicKey(const string& ': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:280:9: error: 'RSA_set0_key' was not declared in this scope if (RSA_set0_key(rsa, rsaN, rsaE, nullptr) != 1) throw OpenSslError("RSA_set0_key failed"); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:280:9: note: suggested alternative: 'RSA_check_key' if (RSA_set0_key(rsa, rsaN, rsaE, nullptr) != 1) throw OpenSslError("RSA_set0_key failed"); ^~~~~~~~~~~~ RSA_check_key /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In member function 'void siodb::crypto::DigitalSignatureKey::parseOpenSshDsaPublicKey(const string& ': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:304:9: error: 'DSA_set0_pqg' was not declared in this scope if (DSA_set0_pqg(dsa, dsaP, dsaQ, dsaG) == 0) throw OpenSslError("DSA_set0_pqg failed"); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:312:9: error: 'DSA_set0_key' was not declared in this scope if (DSA_set0_key(dsa, dsaPubKey, nullptr) == 0) throw OpenSslError("DSA_set0_key failed"); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:312:9: note: suggested alternative: 'RSA_check_key' if (DSA_set0_key(dsa, dsaPubKey, nullptr) == 0) throw OpenSslError("DSA_set0_key failed"); ^~~~~~~~~~~~ RSA_check_key compilation terminated due to -fmax-errors=5. ``` Second issue: ``` In file included from /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.cpp:5: /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:37:38: error: 'std::optional' has not been declared BinaryValue&& cipherKey, std::optional<std::string>&& description) ^~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:37:51: error: expected ',' or '...' before '<' token BinaryValue&& cipherKey, std::optional<std::string>&& description) ^ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:105:10: error: 'optional' in namespace 'std' does not name a template type std::optional<std::string> m_description; ^~~~~~~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:105:5: note: 'std::optional' is defined in header '<optional>'; did you forget to '#include <optional>'? /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:10:1: +#include <optional> /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:105:5: std::optional<std::string> m_description; ^~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h: In constructor 'siodb::iomgr::dbengine::DatabaseRecord::DatabaseRecord(uint32_t, const Uuid&, std::string&&, std::string&&, siodb::BinaryValue&&, int)': /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:43:11: error: class 'siodb::iomgr::dbengine::DatabaseRecord' does not have any field named 'm_description' , m_description(std::move(description)) ^~~~~~~~~~~~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:43:35: error: 'description' was not declared in this scope , m_description(std::move(description)) ^~~~~~~~~~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:43:35: note: suggested alternative: 'deserialize' , m_description(std::move(description)) ```
1.0
Build fails on the CentOS 7 - First issue: ``` /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In function 'void siodb::crypto::{anonymous}::checkRsaLength(const RSA*)': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:136:5: error: 'RSA_get0_key' was not declared in this scope RSA_get0_key(rsa, &rsaN, nullptr, nullptr); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:136:5: note: suggested alternative: 'RSA_check_key' RSA_get0_key(rsa, &rsaN, nullptr, nullptr); ^~~~~~~~~~~~ RSA_check_key /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In member function 'void siodb::crypto::DigitalSignatureKey::parseOpenSslKey(siodb::utils::StringScanner&)': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:263:26: error: 'EVP_PKEY_get0_RSA' was not declared in this scope const auto rsa = EVP_PKEY_get0_RSA(*key); ^~~~~~~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:263:26: note: suggested alternative: 'EVP_PKEY_get1_RSA' const auto rsa = EVP_PKEY_get0_RSA(*key); ^~~~~~~~~~~~~~~~~ EVP_PKEY_get1_RSA /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In member function 'void siodb::crypto::DigitalSignatureKey::parseOpenSshRsaPublicKey(const string& ': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:280:9: error: 'RSA_set0_key' was not declared in this scope if (RSA_set0_key(rsa, rsaN, rsaE, nullptr) != 1) throw OpenSslError("RSA_set0_key failed"); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:280:9: note: suggested alternative: 'RSA_check_key' if (RSA_set0_key(rsa, rsaN, rsaE, nullptr) != 1) throw OpenSslError("RSA_set0_key failed"); ^~~~~~~~~~~~ RSA_check_key /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp: In member function 'void siodb::crypto::DigitalSignatureKey::parseOpenSshDsaPublicKey(const string& ': /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:304:9: error: 'DSA_set0_pqg' was not declared in this scope if (DSA_set0_pqg(dsa, dsaP, dsaQ, dsaG) == 0) throw OpenSslError("DSA_set0_pqg failed"); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:312:9: error: 'DSA_set0_key' was not declared in this scope if (DSA_set0_key(dsa, dsaPubKey, nullptr) == 0) throw OpenSslError("DSA_set0_key failed"); ^~~~~~~~~~~~ /opt/siodb/common/lib/siodb/common/crypto/DigitalSignatureKey.cpp:312:9: note: suggested alternative: 'RSA_check_key' if (DSA_set0_key(dsa, dsaPubKey, nullptr) == 0) throw OpenSslError("DSA_set0_key failed"); ^~~~~~~~~~~~ RSA_check_key compilation terminated due to -fmax-errors=5. ``` Second issue: ``` In file included from /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.cpp:5: /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:37:38: error: 'std::optional' has not been declared BinaryValue&& cipherKey, std::optional<std::string>&& description) ^~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:37:51: error: expected ',' or '...' before '<' token BinaryValue&& cipherKey, std::optional<std::string>&& description) ^ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:105:10: error: 'optional' in namespace 'std' does not name a template type std::optional<std::string> m_description; ^~~~~~~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:105:5: note: 'std::optional' is defined in header '<optional>'; did you forget to '#include <optional>'? /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:10:1: +#include <optional> /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:105:5: std::optional<std::string> m_description; ^~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h: In constructor 'siodb::iomgr::dbengine::DatabaseRecord::DatabaseRecord(uint32_t, const Uuid&, std::string&&, std::string&&, siodb::BinaryValue&&, int)': /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:43:11: error: class 'siodb::iomgr::dbengine::DatabaseRecord' does not have any field named 'm_description' , m_description(std::move(description)) ^~~~~~~~~~~~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:43:35: error: 'description' was not declared in this scope , m_description(std::move(description)) ^~~~~~~~~~~ /opt/siodb/iomgr/lib/dbengine/reg/DatabaseRecord.h:43:35: note: suggested alternative: 'deserialize' , m_description(std::move(description)) ```
non_defect
build fails on the centos first issue opt siodb common lib siodb common crypto digitalsignaturekey cpp in function void siodb crypto anonymous checkrsalength const rsa opt siodb common lib siodb common crypto digitalsignaturekey cpp error rsa key was not declared in this scope rsa key rsa rsan nullptr nullptr opt siodb common lib siodb common crypto digitalsignaturekey cpp note suggested alternative rsa check key rsa key rsa rsan nullptr nullptr rsa check key opt siodb common lib siodb common crypto digitalsignaturekey cpp in member function void siodb crypto digitalsignaturekey parseopensslkey siodb utils stringscanner opt siodb common lib siodb common crypto digitalsignaturekey cpp error evp pkey rsa was not declared in this scope const auto rsa evp pkey rsa key opt siodb common lib siodb common crypto digitalsignaturekey cpp note suggested alternative evp pkey rsa const auto rsa evp pkey rsa key evp pkey rsa opt siodb common lib siodb common crypto digitalsignaturekey cpp in member function void siodb crypto digitalsignaturekey parseopensshrsapublickey const string opt siodb common lib siodb common crypto digitalsignaturekey cpp error rsa key was not declared in this scope if rsa key rsa rsan rsae nullptr throw opensslerror rsa key failed opt siodb common lib siodb common crypto digitalsignaturekey cpp note suggested alternative rsa check key if rsa key rsa rsan rsae nullptr throw opensslerror rsa key failed rsa check key opt siodb common lib siodb common crypto digitalsignaturekey cpp in member function void siodb crypto digitalsignaturekey parseopensshdsapublickey const string opt siodb common lib siodb common crypto digitalsignaturekey cpp error dsa pqg was not declared in this scope if dsa pqg dsa dsap dsaq dsag throw opensslerror dsa pqg failed opt siodb common lib siodb common crypto digitalsignaturekey cpp error dsa key was not declared in this scope if dsa key dsa dsapubkey nullptr throw opensslerror dsa key failed opt siodb common lib siodb common crypto digitalsignaturekey cpp note suggested alternative rsa check key if dsa key dsa dsapubkey nullptr throw opensslerror dsa key failed rsa check key compilation terminated due to fmax errors second issue in file included from opt siodb iomgr lib dbengine reg databaserecord cpp opt siodb iomgr lib dbengine reg databaserecord h error std optional has not been declared binaryvalue cipherkey std optional description opt siodb iomgr lib dbengine reg databaserecord h error expected or before token binaryvalue cipherkey std optional description opt siodb iomgr lib dbengine reg databaserecord h error optional in namespace std does not name a template type std optional m description opt siodb iomgr lib dbengine reg databaserecord h note std optional is defined in header did you forget to include opt siodb iomgr lib dbengine reg databaserecord h include opt siodb iomgr lib dbengine reg databaserecord h std optional m description opt siodb iomgr lib dbengine reg databaserecord h in constructor siodb iomgr dbengine databaserecord databaserecord t const uuid std string std string siodb binaryvalue int opt siodb iomgr lib dbengine reg databaserecord h error class siodb iomgr dbengine databaserecord does not have any field named m description m description std move description opt siodb iomgr lib dbengine reg databaserecord h error description was not declared in this scope m description std move description opt siodb iomgr lib dbengine reg databaserecord h note suggested alternative deserialize m description std move description
0
61,958
17,023,819,136
IssuesEvent
2021-07-03 04:01:14
tomhughes/trac-tickets
https://api.github.com/repos/tomhughes/trac-tickets
closed
click twice for one logout
Component: admin Priority: minor Resolution: invalid Type: defect
**[Submitted to the original trac issue database at 6.39pm, Thursday, 30th August 2012]** When logging out from help.osm.org I have to click [Logout], then I get the message ``` Clicking Logout will log you out from the forum, but will not sign you off from your OpenID provider. If you wish to sign off completely - please make sure to log out from your OpenID provider as well. ``` then I have to click [Logout Now]. Since I always login "directly" entering user and password this is unnecessary. Would love to see that smoothened.
1.0
click twice for one logout - **[Submitted to the original trac issue database at 6.39pm, Thursday, 30th August 2012]** When logging out from help.osm.org I have to click [Logout], then I get the message ``` Clicking Logout will log you out from the forum, but will not sign you off from your OpenID provider. If you wish to sign off completely - please make sure to log out from your OpenID provider as well. ``` then I have to click [Logout Now]. Since I always login "directly" entering user and password this is unnecessary. Would love to see that smoothened.
defect
click twice for one logout when logging out from help osm org i have to click then i get the message clicking logout will log you out from the forum but will not sign you off from your openid provider if you wish to sign off completely please make sure to log out from your openid provider as well then i have to click since i always login directly entering user and password this is unnecessary would love to see that smoothened
1
12,266
8,653,345,688
IssuesEvent
2018-11-27 10:35:29
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
opened
Move security audit logs to use ILM and write aliases
:Security/Audit >enhancement stalled
Once ILM is complete we should move the security audit log indexes over so they use write aliases to index and also have them use ILM to manage rolling over and deleting the indexes. For now this work is stalled waiting for the ILM feature to be completed /cc @bmcconaghy @bleskes
True
Move security audit logs to use ILM and write aliases - Once ILM is complete we should move the security audit log indexes over so they use write aliases to index and also have them use ILM to manage rolling over and deleting the indexes. For now this work is stalled waiting for the ILM feature to be completed /cc @bmcconaghy @bleskes
non_defect
move security audit logs to use ilm and write aliases once ilm is complete we should move the security audit log indexes over so they use write aliases to index and also have them use ilm to manage rolling over and deleting the indexes for now this work is stalled waiting for the ilm feature to be completed cc bmcconaghy bleskes
0
29,734
5,846,324,736
IssuesEvent
2017-05-10 15:56:27
idaholab/moose
https://api.github.com/repos/idaholab/moose
closed
Block restriction causes segfault in CNSFV
C: Modules P: normal T: defect
### Error report I am working with the CNSFV code in the Navier-Stokes module. I am trying to restrict a fluid flow to a subdomain of the problem. However, this appears to be causing a segmentation fault. Running the same input deck without block restrictions doesn't produce an error, just wrong answers. Both input files are included below. <details><summary>The gdb stack trace</summary> <pre> /opt/moose/gcc-6.2.0/include/c++/6.2.0/debug/vector:415: Error: attempt to subscript container with out-of-bounds index 0, but container only holds 0 elements. Objects involved in the operation: sequence "this" @ 0x0x7fffffffc130 { type = std::__debug::vector<unsigned int, std::allocator<unsigned int> >; } Program received signal SIGABRT, Aborted. 0x00007fffe7714428 in __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:54 54 ../sysdeps/unix/sysv/linux/raise.c: No such file or directory. (gdb) where #0 0x00007fffe7714428 in __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:54 #1 0x00007fffe771602a in __GI_abort () at abort.c:89 #2 0x00007fffe7e671e3 in __gnu_debug::_Error_formatter::_M_error (this=0x62a140 <__gnu_debug::_Error_formatter::_M_at(char const*, unsigned int)::__formatter>) at ../../../../../gcc-6.2.0/libstdc++-v3/src/c++11/debug.cc:1061 #3 0x00007ffff6eb60b2 in std::__debug::vector<unsigned int, std::allocator<unsigned int> >::operator[] (this=0x7fffffffc130, __n=0) at /opt/moose/gcc-6.2.0/include/c++/6.2.0/debug/vector:415 #4 0x00007ffff12534cb in MooseVariable::getElementalValue (this=0xb82330, elem=0xa25280, idx=0) at /home/moose-user/Projects/moose/framework/src/base/MooseVariable.C:1716 #5 0x00007ffff3525a65 in CNSFVLeastSquaresSlopeReconstruction::reconstructElementSlope (this=0xba0a60) at /home/moose-user/Projects/moose/modules/navier_stokes/src/userobjects/CNSFVLeastSquaresSlopeReconstruction.C:157 #6 0x00007ffff5285819 in SlopeReconstructionBase::computeElement (this=0xba0a60) at /home/moose-user/Projects/moose/modules/rdg/src/userobjects/SlopeReconstructionBase.C:217 #7 0x00007ffff5261a7e in ElementLoopUserObject::onElement (this=0xba0a60, elem=0xa22140) at /home/moose-user/Projects/moose/modules/rdg/src/userobjects/ElementLoopUserObject.C:147 #8 0x00007ffff5261664 in ElementLoopUserObject::execute (this=0xba0a60) at /home/moose-user/Projects/moose/modules/rdg/src/userobjects/ElementLoopUserObject.C:97 #9 0x00007ffff127b42d in FEProblemBase::computeUserObjects (this=0xaf5380, type=@0x7fffffffcd60: EXEC_LINEAR, group=@0x7fffffffcd5c: Moose::POST_AUX) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:2591 #10 0x00007ffff128225d in FEProblemBase::computeResidualType (this=0xaf5380, soln=..., residual=..., type=Moose::KT_ALL) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3618 #11 0x00007ffff1281e53 in FEProblemBase::computeResidual (this=0xaf5380, soln=..., residual=...) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3542 #12 0x00007ffff1281e13 in FEProblemBase::computeResidual (this=0xaf5380, soln=..., residual=...) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3534 #13 0x00007ffff15300c3 in NonlinearSystem::solve (this=0xb1e230) at /home/moose-user/Projects/moose/framework/src/base/NonlinearSystem.C:121 #14 0x00007ffff12810e9 in FEProblemBase::solve (this=0xaf5380) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3313 #15 0x00007ffff0b11faf in TimeStepper::step (this=0xaeed10) at /home/moose-user/Projects/moose/framework/src/timesteppers/TimeStepper.C:188 #16 0x00007ffff0cb5194 in Transient::solveStep (this=0xb713e0, input_dt=-1) at /home/moose-user/Projects/moose/framework/src/executioners/Transient.C:392 #17 0x00007ffff0cb4d68 in Transient::takeStep (this=0xb713e0, input_dt=-1) at /home/moose-user/Projects/moose/framework/src/executioners/Transient.C:327 #18 0x00007ffff0cb49aa in Transient::execute (this=0xb713e0) at /home/moose-user/Projects/moose/framework/src/executioners/Transient.C:251 #19 0x00007ffff14d8995 in MooseApp::executeExecutioner (this=0x6cf240) at /home/moose-user/Projects/moose/framework/src/base/MooseApp.C:468 #20 0x00007ffff14d9598 in MooseApp::run (this=0x6cf240) at /home/moose-user/Projects/moose/framework/src/base/MooseApp.C:619 #21 0x000000000041adaf in main (argc=3, argv=0x7fffffffd178) at /home/moose-user/Projects/phoenix/src/main.C:23 </pre> </details> It looks like the error involves an empty array in MooseVariable::getElementalValue(), but I'm having trouble tracing the problem further. Any help would be appreciated.
1.0
Block restriction causes segfault in CNSFV - ### Error report I am working with the CNSFV code in the Navier-Stokes module. I am trying to restrict a fluid flow to a subdomain of the problem. However, this appears to be causing a segmentation fault. Running the same input deck without block restrictions doesn't produce an error, just wrong answers. Both input files are included below. <details><summary>The gdb stack trace</summary> <pre> /opt/moose/gcc-6.2.0/include/c++/6.2.0/debug/vector:415: Error: attempt to subscript container with out-of-bounds index 0, but container only holds 0 elements. Objects involved in the operation: sequence "this" @ 0x0x7fffffffc130 { type = std::__debug::vector<unsigned int, std::allocator<unsigned int> >; } Program received signal SIGABRT, Aborted. 0x00007fffe7714428 in __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:54 54 ../sysdeps/unix/sysv/linux/raise.c: No such file or directory. (gdb) where #0 0x00007fffe7714428 in __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:54 #1 0x00007fffe771602a in __GI_abort () at abort.c:89 #2 0x00007fffe7e671e3 in __gnu_debug::_Error_formatter::_M_error (this=0x62a140 <__gnu_debug::_Error_formatter::_M_at(char const*, unsigned int)::__formatter>) at ../../../../../gcc-6.2.0/libstdc++-v3/src/c++11/debug.cc:1061 #3 0x00007ffff6eb60b2 in std::__debug::vector<unsigned int, std::allocator<unsigned int> >::operator[] (this=0x7fffffffc130, __n=0) at /opt/moose/gcc-6.2.0/include/c++/6.2.0/debug/vector:415 #4 0x00007ffff12534cb in MooseVariable::getElementalValue (this=0xb82330, elem=0xa25280, idx=0) at /home/moose-user/Projects/moose/framework/src/base/MooseVariable.C:1716 #5 0x00007ffff3525a65 in CNSFVLeastSquaresSlopeReconstruction::reconstructElementSlope (this=0xba0a60) at /home/moose-user/Projects/moose/modules/navier_stokes/src/userobjects/CNSFVLeastSquaresSlopeReconstruction.C:157 #6 0x00007ffff5285819 in SlopeReconstructionBase::computeElement (this=0xba0a60) at /home/moose-user/Projects/moose/modules/rdg/src/userobjects/SlopeReconstructionBase.C:217 #7 0x00007ffff5261a7e in ElementLoopUserObject::onElement (this=0xba0a60, elem=0xa22140) at /home/moose-user/Projects/moose/modules/rdg/src/userobjects/ElementLoopUserObject.C:147 #8 0x00007ffff5261664 in ElementLoopUserObject::execute (this=0xba0a60) at /home/moose-user/Projects/moose/modules/rdg/src/userobjects/ElementLoopUserObject.C:97 #9 0x00007ffff127b42d in FEProblemBase::computeUserObjects (this=0xaf5380, type=@0x7fffffffcd60: EXEC_LINEAR, group=@0x7fffffffcd5c: Moose::POST_AUX) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:2591 #10 0x00007ffff128225d in FEProblemBase::computeResidualType (this=0xaf5380, soln=..., residual=..., type=Moose::KT_ALL) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3618 #11 0x00007ffff1281e53 in FEProblemBase::computeResidual (this=0xaf5380, soln=..., residual=...) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3542 #12 0x00007ffff1281e13 in FEProblemBase::computeResidual (this=0xaf5380, soln=..., residual=...) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3534 #13 0x00007ffff15300c3 in NonlinearSystem::solve (this=0xb1e230) at /home/moose-user/Projects/moose/framework/src/base/NonlinearSystem.C:121 #14 0x00007ffff12810e9 in FEProblemBase::solve (this=0xaf5380) at /home/moose-user/Projects/moose/framework/src/base/FEProblemBase.C:3313 #15 0x00007ffff0b11faf in TimeStepper::step (this=0xaeed10) at /home/moose-user/Projects/moose/framework/src/timesteppers/TimeStepper.C:188 #16 0x00007ffff0cb5194 in Transient::solveStep (this=0xb713e0, input_dt=-1) at /home/moose-user/Projects/moose/framework/src/executioners/Transient.C:392 #17 0x00007ffff0cb4d68 in Transient::takeStep (this=0xb713e0, input_dt=-1) at /home/moose-user/Projects/moose/framework/src/executioners/Transient.C:327 #18 0x00007ffff0cb49aa in Transient::execute (this=0xb713e0) at /home/moose-user/Projects/moose/framework/src/executioners/Transient.C:251 #19 0x00007ffff14d8995 in MooseApp::executeExecutioner (this=0x6cf240) at /home/moose-user/Projects/moose/framework/src/base/MooseApp.C:468 #20 0x00007ffff14d9598 in MooseApp::run (this=0x6cf240) at /home/moose-user/Projects/moose/framework/src/base/MooseApp.C:619 #21 0x000000000041adaf in main (argc=3, argv=0x7fffffffd178) at /home/moose-user/Projects/phoenix/src/main.C:23 </pre> </details> It looks like the error involves an empty array in MooseVariable::getElementalValue(), but I'm having trouble tracing the problem further. Any help would be appreciated.
defect
block restriction causes segfault in cnsfv error report i am working with the cnsfv code in the navier stokes module i am trying to restrict a fluid flow to a subdomain of the problem however this appears to be causing a segmentation fault running the same input deck without block restrictions doesn t produce an error just wrong answers both input files are included below the gdb stack trace opt moose gcc include c debug vector error attempt to subscript container with out of bounds index but container only holds elements objects involved in the operation sequence this type std debug vector program received signal sigabrt aborted in gi raise sig sig entry at sysdeps unix sysv linux raise c sysdeps unix sysv linux raise c no such file or directory gdb where in gi raise sig sig entry at sysdeps unix sysv linux raise c in gi abort at abort c in gnu debug error formatter m error this at gcc libstdc src c debug cc in std debug vector operator this n at opt moose gcc include c debug vector in moosevariable getelementalvalue this elem idx at home moose user projects moose framework src base moosevariable c in cnsfvleastsquaresslopereconstruction reconstructelementslope this at home moose user projects moose modules navier stokes src userobjects cnsfvleastsquaresslopereconstruction c in slopereconstructionbase computeelement this at home moose user projects moose modules rdg src userobjects slopereconstructionbase c in elementloopuserobject onelement this elem at home moose user projects moose modules rdg src userobjects elementloopuserobject c in elementloopuserobject execute this at home moose user projects moose modules rdg src userobjects elementloopuserobject c in feproblembase computeuserobjects this type exec linear group moose post aux at home moose user projects moose framework src base feproblembase c in feproblembase computeresidualtype this soln residual type moose kt all at home moose user projects moose framework src base feproblembase c in feproblembase computeresidual this soln residual at home moose user projects moose framework src base feproblembase c in feproblembase computeresidual this soln residual at home moose user projects moose framework src base feproblembase c in nonlinearsystem solve this at home moose user projects moose framework src base nonlinearsystem c in feproblembase solve this at home moose user projects moose framework src base feproblembase c in timestepper step this at home moose user projects moose framework src timesteppers timestepper c in transient solvestep this input dt at home moose user projects moose framework src executioners transient c in transient takestep this input dt at home moose user projects moose framework src executioners transient c in transient execute this at home moose user projects moose framework src executioners transient c in mooseapp executeexecutioner this at home moose user projects moose framework src base mooseapp c in mooseapp run this at home moose user projects moose framework src base mooseapp c in main argc argv at home moose user projects phoenix src main c it looks like the error involves an empty array in moosevariable getelementalvalue but i m having trouble tracing the problem further any help would be appreciated
1
11,698
2,663,035,982
IssuesEvent
2015-03-20 00:16:42
epoblaguev/graphingcalculator
https://api.github.com/repos/epoblaguev/graphingcalculator
closed
Patch: Corrected handling of NaN values while plotting graphs
auto-migrated Priority-Medium Type-Defect
``` When plotting certain graphs (such as y=sqrt(x)), parts of the graph would plot incorrectly. This patch fixes this issue (but not for asymptotes of one value, like y=tan(x) or y=1/x). ``` Original issue reported on code.google.com by `luckytoi...@gmail.com` on 7 Mar 2010 at 8:06 Attachments: * [graphing.patch](https://storage.googleapis.com/google-code-attachments/graphingcalculator/issue-2/comment-0/graphing.patch)
1.0
Patch: Corrected handling of NaN values while plotting graphs - ``` When plotting certain graphs (such as y=sqrt(x)), parts of the graph would plot incorrectly. This patch fixes this issue (but not for asymptotes of one value, like y=tan(x) or y=1/x). ``` Original issue reported on code.google.com by `luckytoi...@gmail.com` on 7 Mar 2010 at 8:06 Attachments: * [graphing.patch](https://storage.googleapis.com/google-code-attachments/graphingcalculator/issue-2/comment-0/graphing.patch)
defect
patch corrected handling of nan values while plotting graphs when plotting certain graphs such as y sqrt x parts of the graph would plot incorrectly this patch fixes this issue but not for asymptotes of one value like y tan x or y x original issue reported on code google com by luckytoi gmail com on mar at attachments
1
303,381
9,306,504,274
IssuesEvent
2019-03-25 09:52:33
cs2103-ay1819s2-w09-3/main
https://api.github.com/repos/cs2103-ay1819s2-w09-3/main
closed
As a waiter I can delete orders from a table
priority.High type.Story
...so that I can update changes to the orders at each table.
1.0
As a waiter I can delete orders from a table - ...so that I can update changes to the orders at each table.
non_defect
as a waiter i can delete orders from a table so that i can update changes to the orders at each table
0
49,608
13,187,239,326
IssuesEvent
2020-08-13 02:47:19
icecube-trac/tix3
https://api.github.com/repos/icecube-trac/tix3
opened
filter scripts refer to extinct projects in many places (Trac #1742)
Incomplete Migration Migrated from Trac combo reconstruction defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1742">https://code.icecube.wisc.edu/ticket/1742</a>, reported by kjmeagher and owned by blaufuss</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:12:38", "description": "it is causing warnings in sphinx\n{{{\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.dst.rst:15: WARNING: autodoc: failed to import module u'icecube.dst.dstfilter'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/dst/dstfilter.py\", line 5, in <module>\n import filter_globals\nImportError: No module named filter_globals\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.examples_simulation.rst:15: WARNING: autodoc: failed to import module u'icecube.examples_simulation.configure_simulation'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/examples_simulation/configure_simulation.py\", line 14, in <module>\n from icecube.sim_services.sanity_checker import SimulationSanityChecker\nImportError: No module named sanity_checker\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.filterscripts.offlineL2.rst:47lders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/filterscripts/offlineL2/level2_Reconstruction_Cascade.py\", line 2, in <module>\n from icecube import linefit, improvedLinefit, dipolefit, clast, cscd_llh, fill_ratio, tensor_of_inertia,CascadeVariables, AtmCscdEnergyReco\nImportError: cannot import name AtmCscdEnergyReco\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.finiteReco.rst:15: WARNING: autodoc: failed to import module u'icecube.finiteReco.converters'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/finiteReco/converters.py\", line 1, in <module>\n from icecube.finiteReco import I3FiniteCuts\nImportError: cannot import name I3FiniteCuts\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.level3_filter_lowen.rst:15: WARNING: autodoc: failed to import module u'icecube.level3_filter_lowen.LowEnergyL3TraySegment'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/LowEnergyL3TraySegment.py\", line 17, in <module>\n from icecube.STTools.seededRT.configuration_services import I3ClassicSeededRTConfigurationService\nImportError: cannot import name I3ClassicSeededRTConfigurationService\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.level3_filter_lowen.rst:23: WARNING: autodoc: failed to import module u'icecube.level3_filter_lowen.level3_Master_2012'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/level3_Master_2012.py\", line 17, in <module>\n from icecube.level3_filter_lowen import LowEnergyL3TraySegment\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/LowEnergyL3TraySegment.py\", line 17, in <module>\n from icecube.STTools.seededRT.configuration_services import I3ClassicSeededRTConfigurationService\nImportError: cannot import name I3ClassicSeededRTConfigurationService\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.level3_filter_lowen.rst:31: WARNING: autodoc: failed to import module u'icecube.level3_filter_lowen.level3_segments'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/level3_segments.py\", line 17, in <module>\n from icecube.STTools.seededRT.configuration_services import I3ClassicSeededRTConfigurationService\nImportError: cannot import name I3ClassicSeededRTConfigurationService\n}}}\n", "reporter": "kjmeagher", "cc": "", "resolution": "invalid", "_ts": "1550067158057333", "component": "combo reconstruction", "summary": "filter scripts refer to extinct projects in many places", "priority": "minor", "keywords": "documentation", "time": "2016-06-10T11:50:34", "milestone": "", "owner": "blaufuss", "type": "defect" } ``` </p> </details>
1.0
filter scripts refer to extinct projects in many places (Trac #1742) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1742">https://code.icecube.wisc.edu/ticket/1742</a>, reported by kjmeagher and owned by blaufuss</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:12:38", "description": "it is causing warnings in sphinx\n{{{\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.dst.rst:15: WARNING: autodoc: failed to import module u'icecube.dst.dstfilter'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/dst/dstfilter.py\", line 5, in <module>\n import filter_globals\nImportError: No module named filter_globals\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.examples_simulation.rst:15: WARNING: autodoc: failed to import module u'icecube.examples_simulation.configure_simulation'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/examples_simulation/configure_simulation.py\", line 14, in <module>\n from icecube.sim_services.sanity_checker import SimulationSanityChecker\nImportError: No module named sanity_checker\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.filterscripts.offlineL2.rst:47lders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/filterscripts/offlineL2/level2_Reconstruction_Cascade.py\", line 2, in <module>\n from icecube import linefit, improvedLinefit, dipolefit, clast, cscd_llh, fill_ratio, tensor_of_inertia,CascadeVariables, AtmCscdEnergyReco\nImportError: cannot import name AtmCscdEnergyReco\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.finiteReco.rst:15: WARNING: autodoc: failed to import module u'icecube.finiteReco.converters'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/finiteReco/converters.py\", line 1, in <module>\n from icecube.finiteReco import I3FiniteCuts\nImportError: cannot import name I3FiniteCuts\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.level3_filter_lowen.rst:15: WARNING: autodoc: failed to import module u'icecube.level3_filter_lowen.LowEnergyL3TraySegment'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/LowEnergyL3TraySegment.py\", line 17, in <module>\n from icecube.STTools.seededRT.configuration_services import I3ClassicSeededRTConfigurationService\nImportError: cannot import name I3ClassicSeededRTConfigurationService\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.level3_filter_lowen.rst:23: WARNING: autodoc: failed to import module u'icecube.level3_filter_lowen.level3_Master_2012'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/level3_Master_2012.py\", line 17, in <module>\n from icecube.level3_filter_lowen import LowEnergyL3TraySegment\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/LowEnergyL3TraySegment.py\", line 17, in <module>\n from icecube.STTools.seededRT.configuration_services import I3ClassicSeededRTConfigurationService\nImportError: cannot import name I3ClassicSeededRTConfigurationService\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.level3_filter_lowen.rst:31: WARNING: autodoc: failed to import module u'icecube.level3_filter_lowen.level3_segments'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/level3_filter_lowen/level3_segments.py\", line 17, in <module>\n from icecube.STTools.seededRT.configuration_services import I3ClassicSeededRTConfigurationService\nImportError: cannot import name I3ClassicSeededRTConfigurationService\n}}}\n", "reporter": "kjmeagher", "cc": "", "resolution": "invalid", "_ts": "1550067158057333", "component": "combo reconstruction", "summary": "filter scripts refer to extinct projects in many places", "priority": "minor", "keywords": "documentation", "time": "2016-06-10T11:50:34", "milestone": "", "owner": "blaufuss", "type": "defect" } ``` </p> </details>
defect
filter scripts refer to extinct projects in many places trac migrated from json status closed changetime description it is causing warnings in sphinx n n users kmeagher icecube combo release sphinx build source python icecube dst rst warning autodoc failed to import module u icecube dst dstfilter the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube dst dstfilter py line in n import filter globals nimporterror no module named filter globals n users kmeagher icecube combo release sphinx build source python icecube examples simulation rst warning autodoc failed to import module u icecube examples simulation configure simulation the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube examples simulation configure simulation py line in n from icecube sim services sanity checker import simulationsanitychecker nimporterror no module named sanity checker n users kmeagher icecube combo release sphinx build source python icecube filterscripts rst rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube filterscripts reconstruction cascade py line in n from icecube import linefit improvedlinefit dipolefit clast cscd llh fill ratio tensor of inertia cascadevariables atmcscdenergyreco nimporterror cannot import name atmcscdenergyreco n users kmeagher icecube combo release sphinx build source python icecube finitereco rst warning autodoc failed to import module u icecube finitereco converters the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube finitereco converters py line in n from icecube finitereco import nimporterror cannot import name n users kmeagher icecube combo release sphinx build source python icecube filter lowen rst warning autodoc failed to import module u icecube filter lowen the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube filter lowen py line in n from icecube sttools seededrt configuration services import nimporterror cannot import name n users kmeagher icecube combo release sphinx build source python icecube filter lowen rst warning autodoc failed to import module u icecube filter lowen master the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube filter lowen master py line in n from icecube filter lowen import n file users kmeagher icecube combo release lib icecube filter lowen py line in n from icecube sttools seededrt configuration services import nimporterror cannot import name n users kmeagher icecube combo release sphinx build source python icecube filter lowen rst warning autodoc failed to import module u icecube filter lowen segments the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube filter lowen segments py line in n from icecube sttools seededrt configuration services import nimporterror cannot import name n n reporter kjmeagher cc resolution invalid ts component combo reconstruction summary filter scripts refer to extinct projects in many places priority minor keywords documentation time milestone owner blaufuss type defect
1
49,968
12,439,282,568
IssuesEvent
2020-05-26 09:50:33
docascod/DocsAsCode
https://api.github.com/repos/docascod/DocsAsCode
opened
default theme slides : add default images
enhancement fct_build
Add default (blank) images into slides default theme : * title-page-background * title-page-logo * header-logo * footer-logo
1.0
default theme slides : add default images - Add default (blank) images into slides default theme : * title-page-background * title-page-logo * header-logo * footer-logo
non_defect
default theme slides add default images add default blank images into slides default theme title page background title page logo header logo footer logo
0
80,979
30,639,929,129
IssuesEvent
2023-07-24 20:58:22
scipy/scipy
https://api.github.com/repos/scipy/scipy
closed
BUG: Regression in SciPy mode return shape with 1.11.1
defect
### Describe your issue. Returns float and not array as expected. Shape of return data changed. Please see and REOPEN https://github.com/scipy/scipy/issues/16418 ### Reproducing Code Example ```python Please see https://github.com/rzellem/EXOTIC/issues/1206 ``` ### Error message ```shell Added Comparison Star #3 [324,365] from AAVSO Finding transformation 1 of 142 : /proj/survey-ws/source/EXOTIC_sampledata/HatP32Dec202017/HATP-32171220013343.FITS Traceback (most recent call last): File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 2641, in <module> main() File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 2124, in main aper_data["target"][i][a][an], aper_data["target_bg"][i][a][an] = aperPhot(imageData, 0, ^^^^^^^^^^^^^^^^^^^^^^ File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 1137, in aperPhot bgflux, sigmabg, Nbg = skybg_phot(data, starIndex, xc, yc, r + 2, dr) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 1204, in skybg_phot return mode(dat.flatten(), nan_policy='omit').mode[0], np.nanstd(dat.flatten()), np.sum(mask) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^ IndexError: invalid index to scalar variable. ``` ### SciPy/NumPy/Python version and system information ```shell Python 3.11, SciPy 3.11.1 (but works in SciPy 3.10.1) ```
1.0
BUG: Regression in SciPy mode return shape with 1.11.1 - ### Describe your issue. Returns float and not array as expected. Shape of return data changed. Please see and REOPEN https://github.com/scipy/scipy/issues/16418 ### Reproducing Code Example ```python Please see https://github.com/rzellem/EXOTIC/issues/1206 ``` ### Error message ```shell Added Comparison Star #3 [324,365] from AAVSO Finding transformation 1 of 142 : /proj/survey-ws/source/EXOTIC_sampledata/HatP32Dec202017/HATP-32171220013343.FITS Traceback (most recent call last): File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 2641, in <module> main() File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 2124, in main aper_data["target"][i][a][an], aper_data["target_bg"][i][a][an] = aperPhot(imageData, 0, ^^^^^^^^^^^^^^^^^^^^^^ File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 1137, in aperPhot bgflux, sigmabg, Nbg = skybg_phot(data, starIndex, xc, yc, r + 2, dr) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/System/Volumes/Data/proj/survey-ws/source/EXOTIC/exotic/exotic.py", line 1204, in skybg_phot return mode(dat.flatten(), nan_policy='omit').mode[0], np.nanstd(dat.flatten()), np.sum(mask) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^ IndexError: invalid index to scalar variable. ``` ### SciPy/NumPy/Python version and system information ```shell Python 3.11, SciPy 3.11.1 (but works in SciPy 3.10.1) ```
defect
bug regression in scipy mode return shape with describe your issue returns float and not array as expected shape of return data changed please see and reopen reproducing code example python please see error message shell added comparison star from aavso finding transformation of proj survey ws source exotic sampledata hatp fits traceback most recent call last file system volumes data proj survey ws source exotic exotic exotic py line in main file system volumes data proj survey ws source exotic exotic exotic py line in main aper data aper data aperphot imagedata file system volumes data proj survey ws source exotic exotic exotic py line in aperphot bgflux sigmabg nbg skybg phot data starindex xc yc r dr file system volumes data proj survey ws source exotic exotic exotic py line in skybg phot return mode dat flatten nan policy omit mode np nanstd dat flatten np sum mask indexerror invalid index to scalar variable scipy numpy python version and system information shell python scipy but works in scipy
1
289,134
8,855,183,584
IssuesEvent
2019-01-09 05:10:31
visit-dav/issues-test
https://api.github.com/repos/visit-dav/issues-test
closed
Add an option to the PLOT3D reader to autodetect the settings.
enhancement priority reviewed
Rick Angellini mentioned that it would be nice if the PLOT3D reader autodetected the settings for the file. I agree that would be nice. The VTK multiblock reader has an autodetect option so we could possibly steal some code from there. -----------------------REDMINE MIGRATION----------------------- This ticket was migrated from Redmine. As such, not all information was able to be captured in the transition. Below is a complete record of the original redmine ticket. Ticket number: 2255 Status: Resolved Project: VisIt Tracker: Feature Priority: High Subject: Add an option to the PLOT3D reader to autodetect the settings. Assigned to: Kathleen Biagas Category: - Target version: 2.10 Author: Eric Brugger Start: 05/06/2015 Due date: % Done: 100% Estimated time: Created: 05/06/2015 03:57 pm Updated: 06/17/2015 09:52 pm Likelihood: Severity: Found in version: 2.12.3 Impact: 3 - Medium Expected Use: 3 - Occasional OS: All Support Group: Any Description: Rick Angellini mentioned that it would be nice if the PLOT3D reader autodetected the settings for the file. I agree that would be nice. The VTK multiblock reader has an autodetect option so we could possibly steal some code from there. Comments: I modified our vtkPLOT3DReader to be a subclass of VTK's: vtkMultiBlockPLOT3DReader, in order to get the autodetection functionality.This also greatly reduced number of methods in the file, keeping only the few requiring reimplementation in order for only a single block at a time to be read. (As VisIt prefers).The autodetection functionality is always on, and works reasonably well for most binary files (it does not work at all for ascii files). If autodetection fails, the specified options (via .vp3d meta file or Read Options) are used as fallback, and a warning will be in VisIt's log files in case there are problems reading the data with the fallback options.M databases/PLOT3D/vtkPLOT3DReader.CM databases/PLOT3D/vtkPLOT3DReader.hM databases/PLOT3D/PLOT3D.xmlM databases/PLOT3D/avtPLOT3DFileFormat.CM databases/PLOT3D/CMakeLists.txt
1.0
Add an option to the PLOT3D reader to autodetect the settings. - Rick Angellini mentioned that it would be nice if the PLOT3D reader autodetected the settings for the file. I agree that would be nice. The VTK multiblock reader has an autodetect option so we could possibly steal some code from there. -----------------------REDMINE MIGRATION----------------------- This ticket was migrated from Redmine. As such, not all information was able to be captured in the transition. Below is a complete record of the original redmine ticket. Ticket number: 2255 Status: Resolved Project: VisIt Tracker: Feature Priority: High Subject: Add an option to the PLOT3D reader to autodetect the settings. Assigned to: Kathleen Biagas Category: - Target version: 2.10 Author: Eric Brugger Start: 05/06/2015 Due date: % Done: 100% Estimated time: Created: 05/06/2015 03:57 pm Updated: 06/17/2015 09:52 pm Likelihood: Severity: Found in version: 2.12.3 Impact: 3 - Medium Expected Use: 3 - Occasional OS: All Support Group: Any Description: Rick Angellini mentioned that it would be nice if the PLOT3D reader autodetected the settings for the file. I agree that would be nice. The VTK multiblock reader has an autodetect option so we could possibly steal some code from there. Comments: I modified our vtkPLOT3DReader to be a subclass of VTK's: vtkMultiBlockPLOT3DReader, in order to get the autodetection functionality.This also greatly reduced number of methods in the file, keeping only the few requiring reimplementation in order for only a single block at a time to be read. (As VisIt prefers).The autodetection functionality is always on, and works reasonably well for most binary files (it does not work at all for ascii files). If autodetection fails, the specified options (via .vp3d meta file or Read Options) are used as fallback, and a warning will be in VisIt's log files in case there are problems reading the data with the fallback options.M databases/PLOT3D/vtkPLOT3DReader.CM databases/PLOT3D/vtkPLOT3DReader.hM databases/PLOT3D/PLOT3D.xmlM databases/PLOT3D/avtPLOT3DFileFormat.CM databases/PLOT3D/CMakeLists.txt
non_defect
add an option to the reader to autodetect the settings rick angellini mentioned that it would be nice if the reader autodetected the settings for the file i agree that would be nice the vtk multiblock reader has an autodetect option so we could possibly steal some code from there redmine migration this ticket was migrated from redmine as such not all information was able to be captured in the transition below is a complete record of the original redmine ticket ticket number status resolved project visit tracker feature priority high subject add an option to the reader to autodetect the settings assigned to kathleen biagas category target version author eric brugger start due date done estimated time created pm updated pm likelihood severity found in version impact medium expected use occasional os all support group any description rick angellini mentioned that it would be nice if the reader autodetected the settings for the file i agree that would be nice the vtk multiblock reader has an autodetect option so we could possibly steal some code from there comments i modified our to be a subclass of vtk s in order to get the autodetection functionality this also greatly reduced number of methods in the file keeping only the few requiring reimplementation in order for only a single block at a time to be read as visit prefers the autodetection functionality is always on and works reasonably well for most binary files it does not work at all for ascii files if autodetection fails the specified options via meta file or read options are used as fallback and a warning will be in visit s log files in case there are problems reading the data with the fallback options m databases cm databases hm databases xmlm databases cm databases cmakelists txt
0
71,497
30,913,111,968
IssuesEvent
2023-08-05 01:06:30
Zahlungsmittel/Zahlungsmittel
https://api.github.com/repos/Zahlungsmittel/Zahlungsmittel
opened
[CLOSED] feat: Test Change Language in User Profile
service: wallet frontend feature imported
<a href="https://github.com/Mogge"><img src="https://avatars.githubusercontent.com/u/15882241?v=4" align="left" width="96" height="96" hspace="10"></img></a> **Issue by [Mogge](https://github.com/Mogge)** _Monday Sep 27, 2021 at 13:07 GMT_ _Originally opened as https://github.com/gradido/gradido/pull/904_ ---- ## 🍰 Pullrequest * Test all the functionality of the component to change the language in the user profile * line coverage of newsletter component to 100% * coverage of unit tests frontend to 69% ---- _**[Mogge](https://github.com/Mogge)** included the following code: https://github.com/gradido/gradido/pull/904/commits_
1.0
[CLOSED] feat: Test Change Language in User Profile - <a href="https://github.com/Mogge"><img src="https://avatars.githubusercontent.com/u/15882241?v=4" align="left" width="96" height="96" hspace="10"></img></a> **Issue by [Mogge](https://github.com/Mogge)** _Monday Sep 27, 2021 at 13:07 GMT_ _Originally opened as https://github.com/gradido/gradido/pull/904_ ---- ## 🍰 Pullrequest * Test all the functionality of the component to change the language in the user profile * line coverage of newsletter component to 100% * coverage of unit tests frontend to 69% ---- _**[Mogge](https://github.com/Mogge)** included the following code: https://github.com/gradido/gradido/pull/904/commits_
non_defect
feat test change language in user profile issue by monday sep at gmt originally opened as 🍰 pullrequest test all the functionality of the component to change the language in the user profile line coverage of newsletter component to coverage of unit tests frontend to included the following code
0
143,524
11,568,456,168
IssuesEvent
2020-02-20 15:54:31
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
closed
"explain_data_frame_analytics#Test non-empty data frame given body" doesn't match expected_memory_without_disk string
:ml >test-failure
Stumbled into this on a PR CI run, reproduces reliably for me on master. Will mute as well. ``` ./gradlew ':x-pack:plugin:integTestRunner' --tests "org.elasticsearch.xpack.test.rest.XPackRestIT.test {p0=ml/explain_data_frame_analytics/Test non-empty data frame given body}" -Dtests.seed=B99363F6D7C31F6F -Dtests.security.manager=true -Dtests.locale=ca -Dtests.timezone=America/Merida -Dcompiler.java=13 -Dtests.rest.blacklist=getting_started/10_monitor_cluster_health/* ``` ``` 2> java.lang.AssertionError: Failure at [ml/explain_data_frame_analytics:156]: memory_estimation.expected_memory_without_disk didn't match expected value: memory_estimation.expected_memory_without_disk: expected String [4kb] but was String [5kb] at __randomizedtesting.SeedInfo.seed([B99363F6D7C31F6F:31C75C2C793F7297]:0) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.executeSection(ESClientYamlSuiteTestCase.java:405) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.test(ESClientYamlSuiteTestCase.java:382) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) ``` ``` 1> [2020-02-20T09:08:31,091][INFO ][o.e.x.t.r.XPackRestIT ] [test] Stash dump on test failure [{ 1> "stash" : { 1> "body" : { 1> "field_selection" : [ 1> { 1> "name" : "x", 1> "mapping_types" : [ 1> "float" 1> ], 1> "is_included" : true, 1> "is_required" : false, 1> "feature_type" : "numerical" 1> }, 1> { 1> "name" : "y", 1> "mapping_types" : [ 1> "float" 1> ], 1> "is_included" : true, 1> "is_required" : false, 1> "feature_type" : "numerical" 1> } 1> ], 1> "memory_estimation" : { 1> "expected_memory_without_disk" : "5kb", 1> "expected_memory_with_disk" : "5kb" 1> } 1> } 1> } 1> }] ```
1.0
"explain_data_frame_analytics#Test non-empty data frame given body" doesn't match expected_memory_without_disk string - Stumbled into this on a PR CI run, reproduces reliably for me on master. Will mute as well. ``` ./gradlew ':x-pack:plugin:integTestRunner' --tests "org.elasticsearch.xpack.test.rest.XPackRestIT.test {p0=ml/explain_data_frame_analytics/Test non-empty data frame given body}" -Dtests.seed=B99363F6D7C31F6F -Dtests.security.manager=true -Dtests.locale=ca -Dtests.timezone=America/Merida -Dcompiler.java=13 -Dtests.rest.blacklist=getting_started/10_monitor_cluster_health/* ``` ``` 2> java.lang.AssertionError: Failure at [ml/explain_data_frame_analytics:156]: memory_estimation.expected_memory_without_disk didn't match expected value: memory_estimation.expected_memory_without_disk: expected String [4kb] but was String [5kb] at __randomizedtesting.SeedInfo.seed([B99363F6D7C31F6F:31C75C2C793F7297]:0) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.executeSection(ESClientYamlSuiteTestCase.java:405) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.test(ESClientYamlSuiteTestCase.java:382) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) ``` ``` 1> [2020-02-20T09:08:31,091][INFO ][o.e.x.t.r.XPackRestIT ] [test] Stash dump on test failure [{ 1> "stash" : { 1> "body" : { 1> "field_selection" : [ 1> { 1> "name" : "x", 1> "mapping_types" : [ 1> "float" 1> ], 1> "is_included" : true, 1> "is_required" : false, 1> "feature_type" : "numerical" 1> }, 1> { 1> "name" : "y", 1> "mapping_types" : [ 1> "float" 1> ], 1> "is_included" : true, 1> "is_required" : false, 1> "feature_type" : "numerical" 1> } 1> ], 1> "memory_estimation" : { 1> "expected_memory_without_disk" : "5kb", 1> "expected_memory_with_disk" : "5kb" 1> } 1> } 1> } 1> }] ```
non_defect
explain data frame analytics test non empty data frame given body doesn t match expected memory without disk string stumbled into this on a pr ci run reproduces reliably for me on master will mute as well gradlew x pack plugin integtestrunner tests org elasticsearch xpack test rest xpackrestit test ml explain data frame analytics test non empty data frame given body dtests seed dtests security manager true dtests locale ca dtests timezone america merida dcompiler java dtests rest blacklist getting started monitor cluster health java lang assertionerror failure at memory estimation expected memory without disk didn t match expected value memory estimation expected memory without disk expected string but was string at randomizedtesting seedinfo seed at org elasticsearch test rest yaml esclientyamlsuitetestcase executesection esclientyamlsuitetestcase java at org elasticsearch test rest yaml esclientyamlsuitetestcase test esclientyamlsuitetestcase java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at com carrotsearch randomizedtesting randomizedrunner invoke randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulesetupteardownchained evaluate testrulesetupteardownchained java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene util testrulethreadandtestname evaluate testrulethreadandtestname java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol forktimeoutingtask threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol evaluate threadleakcontrol java at com carrotsearch randomizedtesting randomizedrunner runsingletest randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java stash dump on test failure stash body field selection name x mapping types float is included true is required false feature type numerical name y mapping types float is included true is required false feature type numerical memory estimation expected memory without disk expected memory with disk
0
2,203
2,603,977,894
IssuesEvent
2015-02-24 19:02:00
chrsmith/nishazi6
https://api.github.com/repos/chrsmith/nishazi6
opened
沈阳男性生殖疱疹如何治疗
auto-migrated Priority-Medium Type-Defect
``` 沈阳男性生殖疱疹如何治疗〓沈陽軍區政治部醫院性病〓TEL�� �024-31023308〓成立于1946年,68年專注于性傳播疾病的研究和治� ��。位于沈陽市沈河區二緯路32號。是一所與新中國同建立共� ��煌的歷史悠久、設備精良、技術權威、專家云集,是預防、 保健、醫療、科研康復為一體的綜合性醫院。是國家首批公�� �甲等部隊醫院、全國首批醫療規范定點單位,是第四軍醫大� ��、東南大學等知名高等院校的教學醫院。曾被中國人民解放 軍空軍后勤部衛生部評為衛生工作先進單位,先后兩次榮立�� �體二等功。 ``` ----- Original issue reported on code.google.com by `q964105...@gmail.com` on 4 Jun 2014 at 8:29
1.0
沈阳男性生殖疱疹如何治疗 - ``` 沈阳男性生殖疱疹如何治疗〓沈陽軍區政治部醫院性病〓TEL�� �024-31023308〓成立于1946年,68年專注于性傳播疾病的研究和治� ��。位于沈陽市沈河區二緯路32號。是一所與新中國同建立共� ��煌的歷史悠久、設備精良、技術權威、專家云集,是預防、 保健、醫療、科研康復為一體的綜合性醫院。是國家首批公�� �甲等部隊醫院、全國首批醫療規范定點單位,是第四軍醫大� ��、東南大學等知名高等院校的教學醫院。曾被中國人民解放 軍空軍后勤部衛生部評為衛生工作先進單位,先后兩次榮立�� �體二等功。 ``` ----- Original issue reported on code.google.com by `q964105...@gmail.com` on 4 Jun 2014 at 8:29
defect
沈阳男性生殖疱疹如何治疗 沈阳男性生殖疱疹如何治疗〓沈陽軍區政治部醫院性病〓tel�� � 〓 , � ��。 。是一所與新中國同建立共� ��煌的歷史悠久、設備精良、技術權威、專家云集,是預防、 保健、醫療、科研康復為一體的綜合性醫院。是國家首批公�� �甲等部隊醫院、全國首批醫療規范定點單位,是第四軍醫大� ��、東南大學等知名高等院校的教學醫院。曾被中國人民解放 軍空軍后勤部衛生部評為衛生工作先進單位,先后兩次榮立�� �體二等功。 original issue reported on code google com by gmail com on jun at
1
488,978
14,100,009,707
IssuesEvent
2020-11-06 02:57:33
hydroshare/hydroshare
https://api.github.com/repos/hydroshare/hydroshare
closed
Discovery: moving map does not refresh list.
Discover Medium Priority bug geo and maps
When moving the map, the list of available resources does not get refreshed.
1.0
Discovery: moving map does not refresh list. - When moving the map, the list of available resources does not get refreshed.
non_defect
discovery moving map does not refresh list when moving the map the list of available resources does not get refreshed
0
505,900
14,654,089,511
IssuesEvent
2020-12-28 07:48:54
Kerosz/reddit-client
https://api.github.com/repos/Kerosz/reddit-client
closed
Rework the navigation avatar and theme switcher
Area-User Interface Cost-Low Idea-New Priority-2
Improve the display of the avatar and theme switcher on the top right navigation, possbily add a menu to show and hide those options.
1.0
Rework the navigation avatar and theme switcher - Improve the display of the avatar and theme switcher on the top right navigation, possbily add a menu to show and hide those options.
non_defect
rework the navigation avatar and theme switcher improve the display of the avatar and theme switcher on the top right navigation possbily add a menu to show and hide those options
0
156,906
13,656,154,595
IssuesEvent
2020-09-28 01:44:48
keepassxreboot/keepassxc
https://api.github.com/repos/keepassxreboot/keepassxc
closed
[Documentation] Keyboard shortcuts list should mention Cmd key for Mac users
documentation :bookmark_tabs: platform: macOS
[TIP]: # ( Provide a general summary of the issue in the title above ^^ ) [TIP]: # ( DO NOT include screenshots of your actual database! ) The keyboard shortcuts list at [KEYBINDS.md](https://github.com/keepassxreboot/keepassxc/blob/develop/docs/KEYBINDS.md) ought to have **a note to the effect that Mac users should use _Cmd_ where it says _Ctrl_**. For example, the shortcut to bring up the list is <kbd>Ctrl</kbd><kbd>/</kbd> on most platforms, but <kbd>Cmd</kbd><kbd>/</kbd> (aka <kbd>⌘</kbd><kbd>/</kbd>) on MacOS. **Motivation:** Macs do actually have a _Ctrl_ key, so the list is in fact incorrect for the platform. **History:** For some reason, it never happened in #3928. Makes sense not to change the whole list but just to add a quick note. (Mac users are used to it.) Thanks ## Debug Info [NOTE]: # ( Paste debug info from Help → About here ) KeePassXC - Version 2.5.3-snapshot Build Type: Snapshot Revision: e26063a Operating system: MacOS
1.0
[Documentation] Keyboard shortcuts list should mention Cmd key for Mac users - [TIP]: # ( Provide a general summary of the issue in the title above ^^ ) [TIP]: # ( DO NOT include screenshots of your actual database! ) The keyboard shortcuts list at [KEYBINDS.md](https://github.com/keepassxreboot/keepassxc/blob/develop/docs/KEYBINDS.md) ought to have **a note to the effect that Mac users should use _Cmd_ where it says _Ctrl_**. For example, the shortcut to bring up the list is <kbd>Ctrl</kbd><kbd>/</kbd> on most platforms, but <kbd>Cmd</kbd><kbd>/</kbd> (aka <kbd>⌘</kbd><kbd>/</kbd>) on MacOS. **Motivation:** Macs do actually have a _Ctrl_ key, so the list is in fact incorrect for the platform. **History:** For some reason, it never happened in #3928. Makes sense not to change the whole list but just to add a quick note. (Mac users are used to it.) Thanks ## Debug Info [NOTE]: # ( Paste debug info from Help → About here ) KeePassXC - Version 2.5.3-snapshot Build Type: Snapshot Revision: e26063a Operating system: MacOS
non_defect
keyboard shortcuts list should mention cmd key for mac users provide a general summary of the issue in the title above do not include screenshots of your actual database the keyboard shortcuts list at ought to have a note to the effect that mac users should use cmd where it says ctrl for example the shortcut to bring up the list is ctrl on most platforms but cmd aka ⌘ on macos motivation macs do actually have a ctrl key so the list is in fact incorrect for the platform history for some reason it never happened in makes sense not to change the whole list but just to add a quick note mac users are used to it thanks debug info paste debug info from help → about here keepassxc version snapshot build type snapshot revision operating system macos
0
223,394
17,597,903,055
IssuesEvent
2021-08-17 08:10:03
spacemeshos/go-spacemesh
https://api.github.com/repos/spacemeshos/go-spacemesh
opened
`overwriting sessionID in context` warning
bug testnet v0.2
## Description <!-- Please provide a clear and detailed description of the bug. --> It happens in `WithSessionID`, when `curSessionID != sessionID`, but `sessionID` is always a unique value (`uuid.New().String()`, called from `WithNewSessionID`) which cannot be equal to `curSessionID`
1.0
`overwriting sessionID in context` warning - ## Description <!-- Please provide a clear and detailed description of the bug. --> It happens in `WithSessionID`, when `curSessionID != sessionID`, but `sessionID` is always a unique value (`uuid.New().String()`, called from `WithNewSessionID`) which cannot be equal to `curSessionID`
non_defect
overwriting sessionid in context warning description it happens in withsessionid when cursessionid sessionid but sessionid is always a unique value uuid new string called from withnewsessionid which cannot be equal to cursessionid
0
266,009
8,361,686,146
IssuesEvent
2018-10-03 14:55:07
kubernetes/kubeadm
https://api.github.com/repos/kubernetes/kubeadm
closed
Apply means to search for existing KubeConfig files in standard locations across kubeadm
kind/feature lifecycle/stale priority/important-longterm
This PR added a method to `kubeadm token` to search in the current user home path and the environment variable KUBECONFIG for existing files if the user does not provide a --kubeconfig flag: https://github.com/kubernetes/kubernetes/pull/62850 Taskset: 1) Move `defaultKubeConfig = "/etc/kubernetes/admin.conf"` to `constants.go` 2) Refactor `token.go::findExistingKubeConfig()` into a utility in `cmd/kubeadm/app/cmd/util` 3) Use the utility in `config`, `upgrade` and some of the phases commands that use `--kubeconfig` 4) Apply relevant documentation in `stdout` and at `/website`. Possibly a couple of PRs would be enough to cover this - one being against `/website`. Requested by @luxas /assign @neolit123
1.0
Apply means to search for existing KubeConfig files in standard locations across kubeadm - This PR added a method to `kubeadm token` to search in the current user home path and the environment variable KUBECONFIG for existing files if the user does not provide a --kubeconfig flag: https://github.com/kubernetes/kubernetes/pull/62850 Taskset: 1) Move `defaultKubeConfig = "/etc/kubernetes/admin.conf"` to `constants.go` 2) Refactor `token.go::findExistingKubeConfig()` into a utility in `cmd/kubeadm/app/cmd/util` 3) Use the utility in `config`, `upgrade` and some of the phases commands that use `--kubeconfig` 4) Apply relevant documentation in `stdout` and at `/website`. Possibly a couple of PRs would be enough to cover this - one being against `/website`. Requested by @luxas /assign @neolit123
non_defect
apply means to search for existing kubeconfig files in standard locations across kubeadm this pr added a method to kubeadm token to search in the current user home path and the environment variable kubeconfig for existing files if the user does not provide a kubeconfig flag taskset move defaultkubeconfig etc kubernetes admin conf to constants go refactor token go findexistingkubeconfig into a utility in cmd kubeadm app cmd util use the utility in config upgrade and some of the phases commands that use kubeconfig apply relevant documentation in stdout and at website possibly a couple of prs would be enough to cover this one being against website requested by luxas assign
0
551,419
16,167,520,929
IssuesEvent
2021-05-01 20:04:56
uwblueprint/shoe-project
https://api.github.com/repos/uwblueprint/shoe-project
closed
Filter showing countries not visible on map
bug priority: high
The filter on the main map currently shows countries that are attached stories that are set to be not visible. ![image](https://user-images.githubusercontent.com/50289930/116630543-5af5b880-a921-11eb-9b16-f396bd5e59f4.png)
1.0
Filter showing countries not visible on map - The filter on the main map currently shows countries that are attached stories that are set to be not visible. ![image](https://user-images.githubusercontent.com/50289930/116630543-5af5b880-a921-11eb-9b16-f396bd5e59f4.png)
non_defect
filter showing countries not visible on map the filter on the main map currently shows countries that are attached stories that are set to be not visible
0
1,011
2,697,612,484
IssuesEvent
2015-04-02 21:02:00
flynn/flynn
https://api.github.com/repos/flynn/flynn
closed
Support removing Flynn from a system
enhancement install-script usability
There should be a simple way to remove all Flynn binaries, config and images from a system.
True
Support removing Flynn from a system - There should be a simple way to remove all Flynn binaries, config and images from a system.
non_defect
support removing flynn from a system there should be a simple way to remove all flynn binaries config and images from a system
0
8,250
2,611,473,427
IssuesEvent
2015-02-27 05:17:33
chrsmith/hedgewars
https://api.github.com/repos/chrsmith/hedgewars
closed
Top CMakeLists.txt assumes debug configuration if CMAKE_BUILD_TYPE is not Release
auto-migrated Priority-Medium Type-Defect
``` Currently, CMakeLists.txt thinks that if the build type is not Release, then debug configuration should be used: if(CMAKE_BUILD_TYPE MATCHES RELEASE OR CMAKE_BUILD_TYPE MATCHES "Release") message(STATUS "Building Release") set(Optz true) else() message(STATUS "Building Debug") set(Optz false) endif() But this doesn't account for custom build types, like the one used in Gentoo (see http://sources.gentoo.org/cgi-bin/viewvc.cgi/gentoo-x86/eclass/cmake-utils.eclas s?view=markup for details on CMake project handling). The relevant code could be changed to if(CMAKE_BUILD_TYPE MATCHES DEBUG OR CMAKE_BUILD_TYPE MATCHES "Debug") message(STATUS "Building Debug") set(Optz false) else() message(STATUS "Building Release") set(Optz true) endif() ``` Original issue reported on code.google.com by `andrey.vihrov` on 24 Sep 2011 at 9:03
1.0
Top CMakeLists.txt assumes debug configuration if CMAKE_BUILD_TYPE is not Release - ``` Currently, CMakeLists.txt thinks that if the build type is not Release, then debug configuration should be used: if(CMAKE_BUILD_TYPE MATCHES RELEASE OR CMAKE_BUILD_TYPE MATCHES "Release") message(STATUS "Building Release") set(Optz true) else() message(STATUS "Building Debug") set(Optz false) endif() But this doesn't account for custom build types, like the one used in Gentoo (see http://sources.gentoo.org/cgi-bin/viewvc.cgi/gentoo-x86/eclass/cmake-utils.eclas s?view=markup for details on CMake project handling). The relevant code could be changed to if(CMAKE_BUILD_TYPE MATCHES DEBUG OR CMAKE_BUILD_TYPE MATCHES "Debug") message(STATUS "Building Debug") set(Optz false) else() message(STATUS "Building Release") set(Optz true) endif() ``` Original issue reported on code.google.com by `andrey.vihrov` on 24 Sep 2011 at 9:03
defect
top cmakelists txt assumes debug configuration if cmake build type is not release currently cmakelists txt thinks that if the build type is not release then debug configuration should be used if cmake build type matches release or cmake build type matches release message status building release set optz true else message status building debug set optz false endif but this doesn t account for custom build types like the one used in gentoo see s view markup for details on cmake project handling the relevant code could be changed to if cmake build type matches debug or cmake build type matches debug message status building debug set optz false else message status building release set optz true endif original issue reported on code google com by andrey vihrov on sep at
1
21,936
3,587,215,180
IssuesEvent
2016-01-30 05:06:11
mash99/crypto-js
https://api.github.com/repos/mash99/crypto-js
closed
Adding a rollup for all components
auto-migrated Priority-Medium Type-Defect
``` Hi - the patch provide a all component rollup for users who just want to include a single script. ``` Original issue reported on code.google.com by `yinso.c...@gmail.com` on 5 Apr 2013 at 7:51 Attachments: * [rollup_all.diff](https://storage.googleapis.com/google-code-attachments/crypto-js/issue-87/comment-0/rollup_all.diff)
1.0
Adding a rollup for all components - ``` Hi - the patch provide a all component rollup for users who just want to include a single script. ``` Original issue reported on code.google.com by `yinso.c...@gmail.com` on 5 Apr 2013 at 7:51 Attachments: * [rollup_all.diff](https://storage.googleapis.com/google-code-attachments/crypto-js/issue-87/comment-0/rollup_all.diff)
defect
adding a rollup for all components hi the patch provide a all component rollup for users who just want to include a single script original issue reported on code google com by yinso c gmail com on apr at attachments
1
29,560
11,759,837,614
IssuesEvent
2020-03-13 18:06:36
01binary/elevator
https://api.github.com/repos/01binary/elevator
opened
WS-2019-0331 (Medium) detected in handlebars-4.0.12.tgz
security vulnerability
## WS-2019-0331 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.12.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/elevator/ClientApp/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/elevator/ClientApp/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - react-scripts-1.1.5.tgz (Root Library) - jest-20.0.4.tgz - jest-cli-20.0.4.tgz - istanbul-api-1.3.7.tgz - istanbul-reports-1.5.1.tgz - :x: **handlebars-4.0.12.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/01binary/elevator/commit/c03855450ce69cbe684e2d0017a95692e42f929f">c03855450ce69cbe684e2d0017a95692e42f929f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Arbitrary Code Execution vulnerability found in handlebars before 4.5.2. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system. <p>Publish Date: 2019-12-05 <p>URL: <a href=https://github.com/wycats/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e>WS-2019-0331</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1316">https://www.npmjs.com/advisories/1316</a></p> <p>Release Date: 2019-12-05</p> <p>Fix Resolution: handlebars - 4.5.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2019-0331 (Medium) detected in handlebars-4.0.12.tgz - ## WS-2019-0331 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.12.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/elevator/ClientApp/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/elevator/ClientApp/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - react-scripts-1.1.5.tgz (Root Library) - jest-20.0.4.tgz - jest-cli-20.0.4.tgz - istanbul-api-1.3.7.tgz - istanbul-reports-1.5.1.tgz - :x: **handlebars-4.0.12.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/01binary/elevator/commit/c03855450ce69cbe684e2d0017a95692e42f929f">c03855450ce69cbe684e2d0017a95692e42f929f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Arbitrary Code Execution vulnerability found in handlebars before 4.5.2. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system. <p>Publish Date: 2019-12-05 <p>URL: <a href=https://github.com/wycats/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e>WS-2019-0331</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1316">https://www.npmjs.com/advisories/1316</a></p> <p>Release Date: 2019-12-05</p> <p>Fix Resolution: handlebars - 4.5.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_defect
ws medium detected in handlebars tgz ws medium severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file tmp ws scm elevator clientapp package json path to vulnerable library tmp ws scm elevator clientapp node modules handlebars package json dependency hierarchy react scripts tgz root library jest tgz jest cli tgz istanbul api tgz istanbul reports tgz x handlebars tgz vulnerable library found in head commit a href vulnerability details arbitrary code execution vulnerability found in handlebars before lookup helper fails to validate templates attack may submit templates that execute arbitrary javascript in the system publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution handlebars step up your open source security game with whitesource
0
31,468
6,534,093,514
IssuesEvent
2017-08-31 09:21:23
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
[cache] Cache.cacheManager may be overwritten with a different CacheManager
Team: Core Type: Defect
Non-standard creation of `Cache`s or a crafted `URI`/`Classloader` combination may overwrite an existing `Cache`'s `cacheManager` field with a `CacheManager` that did not create the `Cache`. ``` HazelcastInstance hz = Hazelcast.newHazelcastInstance(); HazelcastInstance client = HazelcastClient.newHazelcastClient(); Properties properties = HazelcastCachingProvider.propertiesByInstanceItself(client); CachingProvider caching = Caching.getCachingProvider("com.hazelcast.client.cache.impl.HazelcastClientCachingProvider"); CacheManager cacheManagerFoo = caching.getCacheManager(new URI("foo"), null, properties); CacheManager cacheManagerBar = caching.getCacheManager(null, new MaliciousClassLoader(Bootstrap.class.getClassLoader()), properties); CacheConfig cacheConfig = new CacheConfig("the-cache"); Cache cache1 = cacheManagerFoo.createCache("the-cache", cacheConfig); // will print false, cache1.cacheManager is cacheManagerFoo System.out.println(cache1.getCacheManager() == cacheManagerBar); Cache cache2 = cacheManagerBar.getCache("the-cache"); // both statements below will print true System.out.println(cache1 == cache2); System.out.println(cache1.getCacheManager() == cacheManagerBar); public static class MaliciousClassLoader extends ClassLoader { @Override public String toString() { return "foo"; } } ```
1.0
[cache] Cache.cacheManager may be overwritten with a different CacheManager - Non-standard creation of `Cache`s or a crafted `URI`/`Classloader` combination may overwrite an existing `Cache`'s `cacheManager` field with a `CacheManager` that did not create the `Cache`. ``` HazelcastInstance hz = Hazelcast.newHazelcastInstance(); HazelcastInstance client = HazelcastClient.newHazelcastClient(); Properties properties = HazelcastCachingProvider.propertiesByInstanceItself(client); CachingProvider caching = Caching.getCachingProvider("com.hazelcast.client.cache.impl.HazelcastClientCachingProvider"); CacheManager cacheManagerFoo = caching.getCacheManager(new URI("foo"), null, properties); CacheManager cacheManagerBar = caching.getCacheManager(null, new MaliciousClassLoader(Bootstrap.class.getClassLoader()), properties); CacheConfig cacheConfig = new CacheConfig("the-cache"); Cache cache1 = cacheManagerFoo.createCache("the-cache", cacheConfig); // will print false, cache1.cacheManager is cacheManagerFoo System.out.println(cache1.getCacheManager() == cacheManagerBar); Cache cache2 = cacheManagerBar.getCache("the-cache"); // both statements below will print true System.out.println(cache1 == cache2); System.out.println(cache1.getCacheManager() == cacheManagerBar); public static class MaliciousClassLoader extends ClassLoader { @Override public String toString() { return "foo"; } } ```
defect
cache cachemanager may be overwritten with a different cachemanager non standard creation of cache s or a crafted uri classloader combination may overwrite an existing cache s cachemanager field with a cachemanager that did not create the cache hazelcastinstance hz hazelcast newhazelcastinstance hazelcastinstance client hazelcastclient newhazelcastclient properties properties hazelcastcachingprovider propertiesbyinstanceitself client cachingprovider caching caching getcachingprovider com hazelcast client cache impl hazelcastclientcachingprovider cachemanager cachemanagerfoo caching getcachemanager new uri foo null properties cachemanager cachemanagerbar caching getcachemanager null new maliciousclassloader bootstrap class getclassloader properties cacheconfig cacheconfig new cacheconfig the cache cache cachemanagerfoo createcache the cache cacheconfig will print false cachemanager is cachemanagerfoo system out println getcachemanager cachemanagerbar cache cachemanagerbar getcache the cache both statements below will print true system out println system out println getcachemanager cachemanagerbar public static class maliciousclassloader extends classloader override public string tostring return foo
1
29,689
5,661,370,159
IssuesEvent
2017-04-10 17:12:07
mit-crpg/openmc
https://api.github.com/repos/mit-crpg/openmc
opened
Normalization of pin powers in MGXS Libraries example
Documentation First-Timers-Only MGXS
When normalizing the pin powers before plotting them in openmc (and openmoc), the 0 power at the guide tubes is making the mean wrong.
1.0
Normalization of pin powers in MGXS Libraries example - When normalizing the pin powers before plotting them in openmc (and openmoc), the 0 power at the guide tubes is making the mean wrong.
non_defect
normalization of pin powers in mgxs libraries example when normalizing the pin powers before plotting them in openmc and openmoc the power at the guide tubes is making the mean wrong
0
72,058
23,911,475,304
IssuesEvent
2022-09-09 08:37:35
vector-im/element-ios
https://api.github.com/repos/vector-im/element-ios
closed
[Start DM] the composer doesn't expand vertically as you enter the message in deferred mode
T-Defect A-Composer A-Message S-Minor O-Frequent A-Create-Room Z-PS-Request
### Steps to reproduce 1- Select "Start Chat" option 2- Select a contact for which there is no DM yet 3- Compose a message with multi-lines -> the composer doesn't expand vertically NOK ### Outcome the composer should expand vertically like it does in an actual room ### Your phone model _No response_ ### Operating system version _No response_ ### Application version v1.9.3 RC ### Homeserver _No response_ ### Will you send logs? No
1.0
[Start DM] the composer doesn't expand vertically as you enter the message in deferred mode - ### Steps to reproduce 1- Select "Start Chat" option 2- Select a contact for which there is no DM yet 3- Compose a message with multi-lines -> the composer doesn't expand vertically NOK ### Outcome the composer should expand vertically like it does in an actual room ### Your phone model _No response_ ### Operating system version _No response_ ### Application version v1.9.3 RC ### Homeserver _No response_ ### Will you send logs? No
defect
the composer doesn t expand vertically as you enter the message in deferred mode steps to reproduce select start chat option select a contact for which there is no dm yet compose a message with multi lines the composer doesn t expand vertically nok outcome the composer should expand vertically like it does in an actual room your phone model no response operating system version no response application version rc homeserver no response will you send logs no
1
76,486
9,457,958,410
IssuesEvent
2019-04-17 02:53:06
aspnet/AspNetCore
https://api.github.com/repos/aspnet/AspNetCore
closed
Default files does not consider Accept header
area-middleware by design enhancement
### Describe the bug If you have an API that returns json from a `/` path but you also have a index.html in `wwwroot` folder and make a request with `Accept:application/json` you will get back HTML. ### To Reproduce Steps to reproduce the behavior: 1. Using this version of ASP.NET Core 2.2 2. ``` app.UseDefaultFiles(); app.UseStaticFiles(); app.UseCarter() OR app.UseMvc() OR app.UseRouting() ``` 3. `curl -H "Accept:application/json" localhost:5000` 4. See index.html response from wwwroot folder ### Expected behavior To see the json response
1.0
Default files does not consider Accept header - ### Describe the bug If you have an API that returns json from a `/` path but you also have a index.html in `wwwroot` folder and make a request with `Accept:application/json` you will get back HTML. ### To Reproduce Steps to reproduce the behavior: 1. Using this version of ASP.NET Core 2.2 2. ``` app.UseDefaultFiles(); app.UseStaticFiles(); app.UseCarter() OR app.UseMvc() OR app.UseRouting() ``` 3. `curl -H "Accept:application/json" localhost:5000` 4. See index.html response from wwwroot folder ### Expected behavior To see the json response
non_defect
default files does not consider accept header describe the bug if you have an api that returns json from a path but you also have a index html in wwwroot folder and make a request with accept application json you will get back html to reproduce steps to reproduce the behavior using this version of asp net core app usedefaultfiles app usestaticfiles app usecarter or app usemvc or app userouting curl h accept application json localhost see index html response from wwwroot folder expected behavior to see the json response
0
49,621
13,187,241,293
IssuesEvent
2020-08-13 02:47:41
icecube-trac/tix3
https://api.github.com/repos/icecube-trac/tix3
opened
cascade extension applied twice (Trac #1763)
Incomplete Migration Migrated from Trac combo simulation defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1763">https://code.icecube.wisc.edu/ticket/1763</a>, reported by claudio.kopper and owned by jvansanten</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:12:47", "description": "From slack:\n\n @musner just presented on the cascade call that there is an issue in simprod with cascade simulation. Short summary is: cascasde extension is simulated twice, first by CMC, then by clsim. So it needs to be disabled in clsim. I added an option to\n simprod-scripts/python/segments/HybridPhotonicsCLSim.py / the PropagatePhotons segment needs to be used with UseCascadeExtension = False when the PropagateMuons() segment is used in the chain before it.\n\n I think this needs to be added somewhere in\n simprod-scripts/python/modules/*.py, but I am unsure where\n\n it also must not be set if you don\u2019t use CMC (i.e. for Genie/Geant4 I guess)\n\n could you add the relevant options to the various scripts? We could also make\n UseCascadeExtension = False the default\n\nThis requires a bugfix release of simulation.", "reporter": "claudio.kopper", "cc": "olivas", "resolution": "fixed", "_ts": "1550067167842669", "component": "combo simulation", "summary": "cascade extension applied twice", "priority": "blocker", "keywords": "", "time": "2016-06-27T18:53:20", "milestone": "", "owner": "jvansanten", "type": "defect" } ``` </p> </details>
1.0
cascade extension applied twice (Trac #1763) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1763">https://code.icecube.wisc.edu/ticket/1763</a>, reported by claudio.kopper and owned by jvansanten</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:12:47", "description": "From slack:\n\n @musner just presented on the cascade call that there is an issue in simprod with cascade simulation. Short summary is: cascasde extension is simulated twice, first by CMC, then by clsim. So it needs to be disabled in clsim. I added an option to\n simprod-scripts/python/segments/HybridPhotonicsCLSim.py / the PropagatePhotons segment needs to be used with UseCascadeExtension = False when the PropagateMuons() segment is used in the chain before it.\n\n I think this needs to be added somewhere in\n simprod-scripts/python/modules/*.py, but I am unsure where\n\n it also must not be set if you don\u2019t use CMC (i.e. for Genie/Geant4 I guess)\n\n could you add the relevant options to the various scripts? We could also make\n UseCascadeExtension = False the default\n\nThis requires a bugfix release of simulation.", "reporter": "claudio.kopper", "cc": "olivas", "resolution": "fixed", "_ts": "1550067167842669", "component": "combo simulation", "summary": "cascade extension applied twice", "priority": "blocker", "keywords": "", "time": "2016-06-27T18:53:20", "milestone": "", "owner": "jvansanten", "type": "defect" } ``` </p> </details>
defect
cascade extension applied twice trac migrated from json status closed changetime description from slack n n musner just presented on the cascade call that there is an issue in simprod with cascade simulation short summary is cascasde extension is simulated twice first by cmc then by clsim so it needs to be disabled in clsim i added an option to n simprod scripts python segments hybridphotonicsclsim py the propagatephotons segment needs to be used with usecascadeextension false when the propagatemuons segment is used in the chain before it n n i think this needs to be added somewhere in n simprod scripts python modules py but i am unsure where n n it also must not be set if you don use cmc i e for genie i guess n n could you add the relevant options to the various scripts we could also make n usecascadeextension false the default n nthis requires a bugfix release of simulation reporter claudio kopper cc olivas resolution fixed ts component combo simulation summary cascade extension applied twice priority blocker keywords time milestone owner jvansanten type defect
1
79,570
28,375,915,534
IssuesEvent
2023-04-12 20:51:16
JohnAustinDev/xulsword
https://api.github.com/repos/JohnAustinDev/xulsword
closed
Verse markup (blue) in multiple columns
Type-Defect Priority-Medium auto-migrated
``` What steps will reproduce the problem? Markup (in blue) of the active verse in Bible text in multiple columns (e.g. ESV/MAT): Working with the next tools on the toolbar: - Next chapter (4.11 -> 5.1): no verse blue - Next verse (5.1 -> 5.2): 5.1 blue - Next verse (5.2 -> 5.3): 5.3 blue ... - Next verse (5.14 -> 5.15): 5.15 blue - Next verse (5.15 -> 5.16): 5.15 stays blue - Next verse (5.16 -> 5.17): 5.15 blue ... - Next verse (5.25 -> 5.26): 5.15 blue - Next verse (5.26 -> 5.27): no verse blue ... - Next verse (6.13 -> 6.14): still no verse blue Working with the previous tools on the toolbar: - Previous chapter (6.14 -> 5.1): 5.1 not blue - Previous verse (5.1 -> 4.25): 4.25 blue - Previous verse (4.25 -> 4.24): 4.25 stays blue - Previous verse (4.16 -> 4.15): 4.25 blue - Previous verse (4.15 -> 4.14): no verse blue ... - Previous verse (4.12 -> 4.11): no verse blue - Previous verse (4.11 -> 4.10): 4.10 blue - Previous verse (4.10 -> 4.9): 4.9 blue ... - Previous verse (4.2 -> 4.1): 4.1 blue - Previous verse (4.1 -> 3.17): 3.17 blue - Previous verse (4.17 -> 3.16): 3.17 stays blue ... There are three behaviors of marking verses in blue - marking the active verse (I think this is what this markup is for) - mark-up stops at previous/following verse - no markup What is the expected output? What do you see instead? -> If would be nice if the same method could be used, this makes it easier to work with XS. What version of the product are you using? On what operating system? xulSword Portable-2.15 in WinXP Please provide any additional information below. This does not occur, if Bible text is not spread over several windows. ``` Original issue reported on code.google.com by `wolfgang...@gmx.de` on 9 Sep 2010 at 11:06
1.0
Verse markup (blue) in multiple columns - ``` What steps will reproduce the problem? Markup (in blue) of the active verse in Bible text in multiple columns (e.g. ESV/MAT): Working with the next tools on the toolbar: - Next chapter (4.11 -> 5.1): no verse blue - Next verse (5.1 -> 5.2): 5.1 blue - Next verse (5.2 -> 5.3): 5.3 blue ... - Next verse (5.14 -> 5.15): 5.15 blue - Next verse (5.15 -> 5.16): 5.15 stays blue - Next verse (5.16 -> 5.17): 5.15 blue ... - Next verse (5.25 -> 5.26): 5.15 blue - Next verse (5.26 -> 5.27): no verse blue ... - Next verse (6.13 -> 6.14): still no verse blue Working with the previous tools on the toolbar: - Previous chapter (6.14 -> 5.1): 5.1 not blue - Previous verse (5.1 -> 4.25): 4.25 blue - Previous verse (4.25 -> 4.24): 4.25 stays blue - Previous verse (4.16 -> 4.15): 4.25 blue - Previous verse (4.15 -> 4.14): no verse blue ... - Previous verse (4.12 -> 4.11): no verse blue - Previous verse (4.11 -> 4.10): 4.10 blue - Previous verse (4.10 -> 4.9): 4.9 blue ... - Previous verse (4.2 -> 4.1): 4.1 blue - Previous verse (4.1 -> 3.17): 3.17 blue - Previous verse (4.17 -> 3.16): 3.17 stays blue ... There are three behaviors of marking verses in blue - marking the active verse (I think this is what this markup is for) - mark-up stops at previous/following verse - no markup What is the expected output? What do you see instead? -> If would be nice if the same method could be used, this makes it easier to work with XS. What version of the product are you using? On what operating system? xulSword Portable-2.15 in WinXP Please provide any additional information below. This does not occur, if Bible text is not spread over several windows. ``` Original issue reported on code.google.com by `wolfgang...@gmx.de` on 9 Sep 2010 at 11:06
defect
verse markup blue in multiple columns what steps will reproduce the problem markup in blue of the active verse in bible text in multiple columns e g esv mat working with the next tools on the toolbar next chapter no verse blue next verse blue next verse blue next verse blue next verse stays blue next verse blue next verse blue next verse no verse blue next verse still no verse blue working with the previous tools on the toolbar previous chapter not blue previous verse blue previous verse stays blue previous verse blue previous verse no verse blue previous verse no verse blue previous verse blue previous verse blue previous verse blue previous verse blue previous verse stays blue there are three behaviors of marking verses in blue marking the active verse i think this is what this markup is for mark up stops at previous following verse no markup what is the expected output what do you see instead if would be nice if the same method could be used this makes it easier to work with xs what version of the product are you using on what operating system xulsword portable in winxp please provide any additional information below this does not occur if bible text is not spread over several windows original issue reported on code google com by wolfgang gmx de on sep at
1
84,825
10,566,406,803
IssuesEvent
2019-10-05 18:31:44
cornell-dti/samwise
https://api.github.com/repos/cornell-dti/samwise
opened
"Last Day of Class" buttons need styling
bug design enhancement frontend good first issue
**Describe the bug** The two buttons on the date picker when creating a new repeating task needs styling. **To Reproduce** Try to create a repeating task and set an end date **Screenshots** ![image](https://user-images.githubusercontent.com/6147405/66259229-897ad400-e77c-11e9-8dc0-cc1b814e5a8b.png) **Styles** We don't have official designs yet, but just to get something going, model the buttons after the "specify time" button from this design on Zeplin for now: ![image](https://user-images.githubusercontent.com/6147405/66259257-d1016000-e77c-11e9-8f7a-25442d1a9684.png)
1.0
"Last Day of Class" buttons need styling - **Describe the bug** The two buttons on the date picker when creating a new repeating task needs styling. **To Reproduce** Try to create a repeating task and set an end date **Screenshots** ![image](https://user-images.githubusercontent.com/6147405/66259229-897ad400-e77c-11e9-8dc0-cc1b814e5a8b.png) **Styles** We don't have official designs yet, but just to get something going, model the buttons after the "specify time" button from this design on Zeplin for now: ![image](https://user-images.githubusercontent.com/6147405/66259257-d1016000-e77c-11e9-8f7a-25442d1a9684.png)
non_defect
last day of class buttons need styling describe the bug the two buttons on the date picker when creating a new repeating task needs styling to reproduce try to create a repeating task and set an end date screenshots styles we don t have official designs yet but just to get something going model the buttons after the specify time button from this design on zeplin for now
0
662,048
22,101,622,856
IssuesEvent
2022-06-01 14:08:06
Tomas-Kraus/metro-jax-ws
https://api.github.com/repos/Tomas-Kraus/metro-jax-ws
opened
Duplicate WS-A namespace declarations in headers
Priority: Major Component: runtime Type: Improvement ERR: Assignee
When WS-Addressing headers are written over the wire, each header declares it's own namespace. To get a performance boost, it would be required to declare the namespace at Envelope or Header level and then reuse the prefix in the header implementation. Currently, headers are not aware of the information in their parent element. So this may require making our XMLStreamWriter smart where they do not write any namespace declarations if they are already defined in the scope. This fix will reduce the overall message size. #### Environment Operating System: All Platform: All #### Affected Versions [JAXWS 2.1 EA2]
1.0
Duplicate WS-A namespace declarations in headers - When WS-Addressing headers are written over the wire, each header declares it's own namespace. To get a performance boost, it would be required to declare the namespace at Envelope or Header level and then reuse the prefix in the header implementation. Currently, headers are not aware of the information in their parent element. So this may require making our XMLStreamWriter smart where they do not write any namespace declarations if they are already defined in the scope. This fix will reduce the overall message size. #### Environment Operating System: All Platform: All #### Affected Versions [JAXWS 2.1 EA2]
non_defect
duplicate ws a namespace declarations in headers when ws addressing headers are written over the wire each header declares it s own namespace to get a performance boost it would be required to declare the namespace at envelope or header level and then reuse the prefix in the header implementation currently headers are not aware of the information in their parent element so this may require making our xmlstreamwriter smart where they do not write any namespace declarations if they are already defined in the scope this fix will reduce the overall message size environment operating system all platform all affected versions
0
73,773
24,795,449,478
IssuesEvent
2022-10-24 16:50:51
primefaces/primeng
https://api.github.com/repos/primefaces/primeng
closed
Component level styles not working
defect
### Describe the bug I am trying to overwrite the p-menu padding, So I tried adding CSS to my header.componentscss. Unfortunately that was not working. So I tried adding the same CSS in my style.scss file and that is working perfectly. `.p-menu { padding: vars.$main-padding !important; }` This is my CSS. ### Environment Angular version 14.2.0 ### Angular version 14.2.0 ### PrimeNG version 14.1.2 ### Build / Runtime Angular CLI App ### Language TypeScript ### Node version (for AoT issues node --version) 18.10.0 ### Browser(s) Google Chrome - Version 106.0.5249.119 (Official Build) (64-bit)
1.0
Component level styles not working - ### Describe the bug I am trying to overwrite the p-menu padding, So I tried adding CSS to my header.componentscss. Unfortunately that was not working. So I tried adding the same CSS in my style.scss file and that is working perfectly. `.p-menu { padding: vars.$main-padding !important; }` This is my CSS. ### Environment Angular version 14.2.0 ### Angular version 14.2.0 ### PrimeNG version 14.1.2 ### Build / Runtime Angular CLI App ### Language TypeScript ### Node version (for AoT issues node --version) 18.10.0 ### Browser(s) Google Chrome - Version 106.0.5249.119 (Official Build) (64-bit)
defect
component level styles not working describe the bug i am trying to overwrite the p menu padding so i tried adding css to my header componentscss unfortunately that was not working so i tried adding the same css in my style scss file and that is working perfectly p menu padding vars main padding important this is my css environment angular version angular version primeng version build runtime angular cli app language typescript node version for aot issues node version browser s google chrome version official build bit
1
256,088
27,552,612,512
IssuesEvent
2023-03-07 15:51:19
BrianMcDonaldWS/genie
https://api.github.com/repos/BrianMcDonaldWS/genie
opened
CVE-2019-11358 (Medium) detected in jquery-3.3.1.tgz, jquery-1.7.2.min.js
security vulnerability
## CVE-2019-11358 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-3.3.1.tgz</b>, <b>jquery-1.7.2.min.js</b></p></summary> <p> <details><summary><b>jquery-3.3.1.tgz</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://registry.npmjs.org/jquery/-/jquery-3.3.1.tgz">https://registry.npmjs.org/jquery/-/jquery-3.3.1.tgz</a></p> <p>Path to dependency file: /genie-ui/package.json</p> <p>Path to vulnerable library: /genie-ui/node_modules/jquery/package.json</p> <p> Dependency Hierarchy: - :x: **jquery-3.3.1.tgz** (Vulnerable Library) </details> <details><summary><b>jquery-1.7.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js</a></p> <p>Path to dependency file: /genie-ui/node_modules/js-base64/test/index.html</p> <p>Path to vulnerable library: /odules/js-base64/test/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.2.min.js** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype. <p>Publish Date: 2019-04-20 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-11358>CVE-2019-11358</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p> <p>Release Date: 2019-04-20</p> <p>Fix Resolution: 3.4.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
True
CVE-2019-11358 (Medium) detected in jquery-3.3.1.tgz, jquery-1.7.2.min.js - ## CVE-2019-11358 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-3.3.1.tgz</b>, <b>jquery-1.7.2.min.js</b></p></summary> <p> <details><summary><b>jquery-3.3.1.tgz</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://registry.npmjs.org/jquery/-/jquery-3.3.1.tgz">https://registry.npmjs.org/jquery/-/jquery-3.3.1.tgz</a></p> <p>Path to dependency file: /genie-ui/package.json</p> <p>Path to vulnerable library: /genie-ui/node_modules/jquery/package.json</p> <p> Dependency Hierarchy: - :x: **jquery-3.3.1.tgz** (Vulnerable Library) </details> <details><summary><b>jquery-1.7.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js</a></p> <p>Path to dependency file: /genie-ui/node_modules/js-base64/test/index.html</p> <p>Path to vulnerable library: /odules/js-base64/test/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.2.min.js** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype. <p>Publish Date: 2019-04-20 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-11358>CVE-2019-11358</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p> <p>Release Date: 2019-04-20</p> <p>Fix Resolution: 3.4.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
non_defect
cve medium detected in jquery tgz jquery min js cve medium severity vulnerability vulnerable libraries jquery tgz jquery min js jquery tgz javascript library for dom operations library home page a href path to dependency file genie ui package json path to vulnerable library genie ui node modules jquery package json dependency hierarchy x jquery tgz vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file genie ui node modules js test index html path to vulnerable library odules js test index html dependency hierarchy x jquery min js vulnerable library vulnerability details jquery before as used in drupal backdrop cms and other products mishandles jquery extend true because of object prototype pollution if an unsanitized source object contained an enumerable proto property it could extend the native object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution check this box to open an automated fix pr
0
72,526
24,164,644,564
IssuesEvent
2022-09-22 14:11:59
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
opened
Stickers aren't displayed in Safari
T-Defect
### Steps to reproduce 1. Open the "More options" menu 2. Select Sticker 3. Click "Add some now" 4. Add the packs you'd like 5. Close the modal 6. Repeat step 1 & 2 7. There is no stickers ### Outcome #### What did you expect? Stickers should appear in the small modal after being added. #### What happened instead? No stickers appears and you are looped back to the setup steps. ### Operating system macOS v12.6 ### Browser information Safari v16 (17614.1.25.9.10, 17614) ### URL for webapp Private server, Element v1.11 ### Application version XWikiRiot version: 1.11.0, Olm version: 3.2.8 ### Homeserver Synapse (unkown version) ### Will you send logs? Yes
1.0
Stickers aren't displayed in Safari - ### Steps to reproduce 1. Open the "More options" menu 2. Select Sticker 3. Click "Add some now" 4. Add the packs you'd like 5. Close the modal 6. Repeat step 1 & 2 7. There is no stickers ### Outcome #### What did you expect? Stickers should appear in the small modal after being added. #### What happened instead? No stickers appears and you are looped back to the setup steps. ### Operating system macOS v12.6 ### Browser information Safari v16 (17614.1.25.9.10, 17614) ### URL for webapp Private server, Element v1.11 ### Application version XWikiRiot version: 1.11.0, Olm version: 3.2.8 ### Homeserver Synapse (unkown version) ### Will you send logs? Yes
defect
stickers aren t displayed in safari steps to reproduce open the more options menu select sticker click add some now add the packs you d like close the modal repeat step there is no stickers outcome what did you expect stickers should appear in the small modal after being added what happened instead no stickers appears and you are looped back to the setup steps operating system macos browser information safari url for webapp private server element application version xwikiriot version olm version homeserver synapse unkown version will you send logs yes
1
89,874
15,855,950,195
IssuesEvent
2021-04-08 01:09:15
rsoreq/django-DefectDojo
https://api.github.com/repos/rsoreq/django-DefectDojo
reopened
CVE-2018-14040 (Medium) detected in bootstrap-3.3.5.min.js, bootstrap-3.3.4.min.js
security vulnerability
## CVE-2018-14040 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.4.min.js</b>, <b>bootstrap-3.3.5.min.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.4.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js</a></p> <p>Path to dependency file: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/index.html</p> <p>Path to vulnerable library: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/index.html,django-DefectDojo/components/node_modules/startbootstrap-sb-admin-2/pages/../bower_components/bootstrap/dist/js/bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.4.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.5.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p> <p>Path to dependency file: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/examples/events.html</p> <p>Path to vulnerable library: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/examples/events.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.5.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/rsoreq/django-DefectDojo/commit/778bcf0b3400f30c71d722f50e221c2eec64ea95">778bcf0b3400f30c71d722f50e221c2eec64ea95</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute. <p>Publish Date: 2018-07-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040>CVE-2018-14040</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p> <p>Release Date: 2018-07-13</p> <p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.4","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.5","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"}],"vulnerabilityIdentifier":"CVE-2018-14040","vulnerabilityDetails":"In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2018-14040 (Medium) detected in bootstrap-3.3.5.min.js, bootstrap-3.3.4.min.js - ## CVE-2018-14040 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.4.min.js</b>, <b>bootstrap-3.3.5.min.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.4.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js</a></p> <p>Path to dependency file: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/index.html</p> <p>Path to vulnerable library: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/index.html,django-DefectDojo/components/node_modules/startbootstrap-sb-admin-2/pages/../bower_components/bootstrap/dist/js/bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.4.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.5.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p> <p>Path to dependency file: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/examples/events.html</p> <p>Path to vulnerable library: django-DefectDojo/components/node_modules/bootstrap-wysiwyg/examples/events.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.5.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/rsoreq/django-DefectDojo/commit/778bcf0b3400f30c71d722f50e221c2eec64ea95">778bcf0b3400f30c71d722f50e221c2eec64ea95</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute. <p>Publish Date: 2018-07-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040>CVE-2018-14040</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p> <p>Release Date: 2018-07-13</p> <p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.4","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.5","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"}],"vulnerabilityIdentifier":"CVE-2018-14040","vulnerabilityDetails":"In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_defect
cve medium detected in bootstrap min js bootstrap min js cve medium severity vulnerability vulnerable libraries bootstrap min js bootstrap min js bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file django defectdojo components node modules bootstrap wysiwyg index html path to vulnerable library django defectdojo components node modules bootstrap wysiwyg index html django defectdojo components node modules startbootstrap sb admin pages bower components bootstrap dist js bootstrap min js dependency hierarchy x bootstrap min js vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file django defectdojo components node modules bootstrap wysiwyg examples events html path to vulnerable library django defectdojo components node modules bootstrap wysiwyg examples events html dependency hierarchy x bootstrap min js vulnerable library found in head commit a href found in base branch master vulnerability details in bootstrap before xss is possible in the collapse data parent attribute publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org webjars npm bootstrap org webjars bootstrap isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in bootstrap before xss is possible in the collapse data parent attribute vulnerabilityurl
0
554,933
16,442,912,181
IssuesEvent
2021-05-20 16:08:58
meerk40t/meerk40t
https://api.github.com/repos/meerk40t/meerk40t
closed
Resample Issue - RasterWizard, Resample and DitherType Disabled
Context: UI/UX Priority: Low Status: Postponed Type: Enhancement Work: Obvious
why can't I set the exact size? I can only set the size with the mouse ![12](https://user-images.githubusercontent.com/61805701/96395278-85bdc200-11c4-11eb-80ec-ee8fad80351e.jpg) dithering doesn't work either ![13](https://user-images.githubusercontent.com/61805701/96395500-12688000-11c5-11eb-9c36-c636887283e6.jpg)
1.0
Resample Issue - RasterWizard, Resample and DitherType Disabled - why can't I set the exact size? I can only set the size with the mouse ![12](https://user-images.githubusercontent.com/61805701/96395278-85bdc200-11c4-11eb-80ec-ee8fad80351e.jpg) dithering doesn't work either ![13](https://user-images.githubusercontent.com/61805701/96395500-12688000-11c5-11eb-9c36-c636887283e6.jpg)
non_defect
resample issue rasterwizard resample and dithertype disabled why can t i set the exact size i can only set the size with the mouse dithering doesn t work either
0
346,024
30,860,196,563
IssuesEvent
2023-08-03 01:53:19
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
sql: TestDistSQLFlowsVirtualTables failed
C-test-failure O-robot branch-master T-sql-queries
sql.TestDistSQLFlowsVirtualTables [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11093936?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11093936?buildTab=artifacts#/) on master @ [51aa6257c063f60fb76044f9cccc47e259930d00](https://github.com/cockroachdb/cockroach/commits/51aa6257c063f60fb76044f9cccc47e259930d00): Fatal error: ``` panic: test timed out after 14m55s ``` Stack: ``` goroutine 59003 [running]: testing.(*M).startAlarm.func1() GOROOT/src/testing/testing.go:2036 +0x88 created by time.goFunc GOROOT/src/time/sleep.go:176 +0x38 ``` <details><summary>Log preceding fatal error</summary> <p> ``` * created by github.com/cockroachdb/pebble/record.NewLogWriter * github.com/cockroachdb/pebble/record/external/com_github_cockroachdb_pebble/record/log_writer.go:354 +0x3f0 * * goroutine 59018 [chan receive, 2 minutes]: * github.com/cockroachdb/cockroach/pkg/sql_test.TestDistSQLFlowsVirtualTables.func4({0x403bb6c5a0?, 0x739b280?}, 0x40354fbec0?) * github.com/cockroachdb/cockroach/pkg/sql_test/pkg/sql/crdb_internal_test.go:564 +0x184 * github.com/cockroachdb/cockroach/pkg/kv/kvserver.(*Replica).SendWithWriteBytes(0x4019093900, {0x739b280?, 0x40354fbe90?}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/kv/kvserver/pkg/kv/kvserver/replica_send.go:173 +0x3fc * github.com/cockroachdb/cockroach/pkg/kv/kvserver.(*Store).SendWithWriteBytes(0x4018d34000, {0x739b280?, 0x40354fbe60?}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/kv/kvserver/pkg/kv/kvserver/store_send.go:193 +0x44c * github.com/cockroachdb/cockroach/pkg/kv/kvserver.(*Stores).SendWithWriteBytes(0x402dc9f9d0?, {0x739b280, 0x40354fbe60}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/kv/kvserver/pkg/kv/kvserver/stores.go:202 +0x9c * github.com/cockroachdb/cockroach/pkg/server.(*Node).batchInternal(0x4035ae1c00, {0x739b280?, 0x40354fbe30?}, {0x401a382360?}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/server/node.go:1301 +0x2e4 * github.com/cockroachdb/cockroach/pkg/server.(*Node).Batch(0x4035ae1c00, {0x739b280, 0x40354fbd70}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/server/node.go:1432 +0x234 * github.com/cockroachdb/cockroach/pkg/kv/kvpb._Internal_Batch_Handler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0?, 0x403bb6c5a0}) * github.com/cockroachdb/cockroach/pkg/kv/kvpb/bazel-out/aarch64-fastbuild/bin/pkg/kv/kvpb/kvpb_go_proto_/github.com/cockroachdb/cockroach/pkg/kv/kvpb/api.pb.go:10277 +0x74 * github.com/cockroachdb/cockroach/pkg/util/tracing/grpcinterceptor.ServerInterceptor.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x40062dc120, 0x401f9102a0) * github.com/cockroachdb/cockroach/pkg/util/tracing/grpcinterceptor/grpc_interceptor.go:97 +0x1b4 * google.golang.org/grpc.getChainUnaryHandler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1163 +0xa0 * github.com/cockroachdb/cockroach/pkg/rpc.NewServerEx.func3({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x40062dc120?, 0x4005cfb1c0) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/context.go:167 +0x88 * google.golang.org/grpc.getChainUnaryHandler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1163 +0xa0 * github.com/cockroachdb/cockroach/pkg/rpc.kvAuth.unaryInterceptor({0x4005c84000?, {{0x5780080?}, {0x73d9a20?, 0x4019016030?}}}, {0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x40062dc120, 0x4005cfb140) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/auth.go:105 +0x1ec * google.golang.org/grpc.getChainUnaryHandler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1163 +0xa0 * github.com/cockroachdb/cockroach/pkg/rpc.NewServerEx.func1.1({0x739b280?, 0x40354fbd70?}) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/context.go:134 +0x3c * github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunTaskWithErr(0x40039fd7c0, {0x739b280, 0x40354fbd70}, {0xc04ddc?, 0x38?}, 0x400b5b5910) * github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:336 +0xac * github.com/cockroachdb/cockroach/pkg/rpc.NewServerEx.func1({0x739b280?, 0x40354fbd70?}, {0x5af93a0?, 0x403bb6c5a0?}, 0x40062dc120?, 0x401f9102a0?) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/context.go:132 +0x7c * google.golang.org/grpc.chainUnaryInterceptors.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x400a1979f8?, 0xc05988?) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1154 +0x88 * github.com/cockroachdb/cockroach/pkg/kv/kvpb._Internal_Batch_Handler({0x5acbda0?, 0x4035ae1c00}, {0x739b280, 0x40354fbd70}, 0x400cf18070, 0x40334532a0) * github.com/cockroachdb/cockroach/pkg/kv/kvpb/bazel-out/aarch64-fastbuild/bin/pkg/kv/kvpb/kvpb_go_proto_/github.com/cockroachdb/cockroach/pkg/kv/kvpb/api.pb.go:10279 +0x134 * google.golang.org/grpc.(*Server).processUnaryRPC(0x4001f3ad20, {0x73ed9e0, 0x40367ac820}, 0x403bb6c480, 0x401960df50, 0xa820800, 0x0) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1336 +0xb68 * google.golang.org/grpc.(*Server).handleStream(0x4001f3ad20, {0x73ed9e0, 0x40367ac820}, 0x403bb6c480, 0x0) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1704 +0x840 * google.golang.org/grpc.(*Server).serveStreams.func1.2() * google.golang.org/grpc/external/org_golang_google_grpc/server.go:965 +0x84 * created by google.golang.org/grpc.(*Server).serveStreams.func1 * google.golang.org/grpc/external/org_golang_google_grpc/server.go:963 +0x294 * * ``` </p> </details> <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/sql-queries <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestDistSQLFlowsVirtualTables.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-30186
1.0
sql: TestDistSQLFlowsVirtualTables failed - sql.TestDistSQLFlowsVirtualTables [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11093936?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11093936?buildTab=artifacts#/) on master @ [51aa6257c063f60fb76044f9cccc47e259930d00](https://github.com/cockroachdb/cockroach/commits/51aa6257c063f60fb76044f9cccc47e259930d00): Fatal error: ``` panic: test timed out after 14m55s ``` Stack: ``` goroutine 59003 [running]: testing.(*M).startAlarm.func1() GOROOT/src/testing/testing.go:2036 +0x88 created by time.goFunc GOROOT/src/time/sleep.go:176 +0x38 ``` <details><summary>Log preceding fatal error</summary> <p> ``` * created by github.com/cockroachdb/pebble/record.NewLogWriter * github.com/cockroachdb/pebble/record/external/com_github_cockroachdb_pebble/record/log_writer.go:354 +0x3f0 * * goroutine 59018 [chan receive, 2 minutes]: * github.com/cockroachdb/cockroach/pkg/sql_test.TestDistSQLFlowsVirtualTables.func4({0x403bb6c5a0?, 0x739b280?}, 0x40354fbec0?) * github.com/cockroachdb/cockroach/pkg/sql_test/pkg/sql/crdb_internal_test.go:564 +0x184 * github.com/cockroachdb/cockroach/pkg/kv/kvserver.(*Replica).SendWithWriteBytes(0x4019093900, {0x739b280?, 0x40354fbe90?}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/kv/kvserver/pkg/kv/kvserver/replica_send.go:173 +0x3fc * github.com/cockroachdb/cockroach/pkg/kv/kvserver.(*Store).SendWithWriteBytes(0x4018d34000, {0x739b280?, 0x40354fbe60?}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/kv/kvserver/pkg/kv/kvserver/store_send.go:193 +0x44c * github.com/cockroachdb/cockroach/pkg/kv/kvserver.(*Stores).SendWithWriteBytes(0x402dc9f9d0?, {0x739b280, 0x40354fbe60}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/kv/kvserver/pkg/kv/kvserver/stores.go:202 +0x9c * github.com/cockroachdb/cockroach/pkg/server.(*Node).batchInternal(0x4035ae1c00, {0x739b280?, 0x40354fbe30?}, {0x401a382360?}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/server/node.go:1301 +0x2e4 * github.com/cockroachdb/cockroach/pkg/server.(*Node).Batch(0x4035ae1c00, {0x739b280, 0x40354fbd70}, 0x403bb6c5a0) * github.com/cockroachdb/cockroach/pkg/server/node.go:1432 +0x234 * github.com/cockroachdb/cockroach/pkg/kv/kvpb._Internal_Batch_Handler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0?, 0x403bb6c5a0}) * github.com/cockroachdb/cockroach/pkg/kv/kvpb/bazel-out/aarch64-fastbuild/bin/pkg/kv/kvpb/kvpb_go_proto_/github.com/cockroachdb/cockroach/pkg/kv/kvpb/api.pb.go:10277 +0x74 * github.com/cockroachdb/cockroach/pkg/util/tracing/grpcinterceptor.ServerInterceptor.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x40062dc120, 0x401f9102a0) * github.com/cockroachdb/cockroach/pkg/util/tracing/grpcinterceptor/grpc_interceptor.go:97 +0x1b4 * google.golang.org/grpc.getChainUnaryHandler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1163 +0xa0 * github.com/cockroachdb/cockroach/pkg/rpc.NewServerEx.func3({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x40062dc120?, 0x4005cfb1c0) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/context.go:167 +0x88 * google.golang.org/grpc.getChainUnaryHandler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1163 +0xa0 * github.com/cockroachdb/cockroach/pkg/rpc.kvAuth.unaryInterceptor({0x4005c84000?, {{0x5780080?}, {0x73d9a20?, 0x4019016030?}}}, {0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x40062dc120, 0x4005cfb140) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/auth.go:105 +0x1ec * google.golang.org/grpc.getChainUnaryHandler.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1163 +0xa0 * github.com/cockroachdb/cockroach/pkg/rpc.NewServerEx.func1.1({0x739b280?, 0x40354fbd70?}) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/context.go:134 +0x3c * github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunTaskWithErr(0x40039fd7c0, {0x739b280, 0x40354fbd70}, {0xc04ddc?, 0x38?}, 0x400b5b5910) * github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:336 +0xac * github.com/cockroachdb/cockroach/pkg/rpc.NewServerEx.func1({0x739b280?, 0x40354fbd70?}, {0x5af93a0?, 0x403bb6c5a0?}, 0x40062dc120?, 0x401f9102a0?) * github.com/cockroachdb/cockroach/pkg/rpc/pkg/rpc/context.go:132 +0x7c * google.golang.org/grpc.chainUnaryInterceptors.func1({0x739b280, 0x40354fbd70}, {0x5af93a0, 0x403bb6c5a0}, 0x400a1979f8?, 0xc05988?) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1154 +0x88 * github.com/cockroachdb/cockroach/pkg/kv/kvpb._Internal_Batch_Handler({0x5acbda0?, 0x4035ae1c00}, {0x739b280, 0x40354fbd70}, 0x400cf18070, 0x40334532a0) * github.com/cockroachdb/cockroach/pkg/kv/kvpb/bazel-out/aarch64-fastbuild/bin/pkg/kv/kvpb/kvpb_go_proto_/github.com/cockroachdb/cockroach/pkg/kv/kvpb/api.pb.go:10279 +0x134 * google.golang.org/grpc.(*Server).processUnaryRPC(0x4001f3ad20, {0x73ed9e0, 0x40367ac820}, 0x403bb6c480, 0x401960df50, 0xa820800, 0x0) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1336 +0xb68 * google.golang.org/grpc.(*Server).handleStream(0x4001f3ad20, {0x73ed9e0, 0x40367ac820}, 0x403bb6c480, 0x0) * google.golang.org/grpc/external/org_golang_google_grpc/server.go:1704 +0x840 * google.golang.org/grpc.(*Server).serveStreams.func1.2() * google.golang.org/grpc/external/org_golang_google_grpc/server.go:965 +0x84 * created by google.golang.org/grpc.(*Server).serveStreams.func1 * google.golang.org/grpc/external/org_golang_google_grpc/server.go:963 +0x294 * * ``` </p> </details> <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/sql-queries <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestDistSQLFlowsVirtualTables.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-30186
non_defect
sql testdistsqlflowsvirtualtables failed sql testdistsqlflowsvirtualtables with on master fatal error panic test timed out after stack goroutine testing m startalarm goroot src testing testing go created by time gofunc goroot src time sleep go log preceding fatal error created by github com cockroachdb pebble record newlogwriter github com cockroachdb pebble record external com github cockroachdb pebble record log writer go goroutine github com cockroachdb cockroach pkg sql test testdistsqlflowsvirtualtables github com cockroachdb cockroach pkg sql test pkg sql crdb internal test go github com cockroachdb cockroach pkg kv kvserver replica sendwithwritebytes github com cockroachdb cockroach pkg kv kvserver pkg kv kvserver replica send go github com cockroachdb cockroach pkg kv kvserver store sendwithwritebytes github com cockroachdb cockroach pkg kv kvserver pkg kv kvserver store send go github com cockroachdb cockroach pkg kv kvserver stores sendwithwritebytes github com cockroachdb cockroach pkg kv kvserver pkg kv kvserver stores go github com cockroachdb cockroach pkg server node batchinternal github com cockroachdb cockroach pkg server node go github com cockroachdb cockroach pkg server node batch github com cockroachdb cockroach pkg server node go github com cockroachdb cockroach pkg kv kvpb internal batch handler github com cockroachdb cockroach pkg kv kvpb bazel out fastbuild bin pkg kv kvpb kvpb go proto github com cockroachdb cockroach pkg kv kvpb api pb go github com cockroachdb cockroach pkg util tracing grpcinterceptor serverinterceptor github com cockroachdb cockroach pkg util tracing grpcinterceptor grpc interceptor go google golang org grpc getchainunaryhandler google golang org grpc external org golang google grpc server go github com cockroachdb cockroach pkg rpc newserverex github com cockroachdb cockroach pkg rpc pkg rpc context go google golang org grpc getchainunaryhandler google golang org grpc external org golang google grpc server go github com cockroachdb cockroach pkg rpc kvauth unaryinterceptor github com cockroachdb cockroach pkg rpc pkg rpc auth go google golang org grpc getchainunaryhandler google golang org grpc external org golang google grpc server go github com cockroachdb cockroach pkg rpc newserverex github com cockroachdb cockroach pkg rpc pkg rpc context go github com cockroachdb cockroach pkg util stop stopper runtaskwitherr github com cockroachdb cockroach pkg util stop stopper go github com cockroachdb cockroach pkg rpc newserverex github com cockroachdb cockroach pkg rpc pkg rpc context go google golang org grpc chainunaryinterceptors google golang org grpc external org golang google grpc server go github com cockroachdb cockroach pkg kv kvpb internal batch handler github com cockroachdb cockroach pkg kv kvpb bazel out fastbuild bin pkg kv kvpb kvpb go proto github com cockroachdb cockroach pkg kv kvpb api pb go google golang org grpc server processunaryrpc google golang org grpc external org golang google grpc server go google golang org grpc server handlestream google golang org grpc external org golang google grpc server go google golang org grpc server servestreams google golang org grpc external org golang google grpc server go created by google golang org grpc server servestreams google golang org grpc external org golang google grpc server go help see also cc cockroachdb sql queries jira issue crdb
0
155,842
5,961,778,109
IssuesEvent
2017-05-29 19:00:48
heartsucker/rust-tuf
https://api.github.com/repos/heartsucker/rust-tuf
closed
A target does not need to have exactly `length`, just use it as an upper bound
Bug :: Minor Priority :: Low
If a target is listed as having 100KB and only 90KB are download, check the hash and continue. Relevant converation: https://github.com/theupdateframework/tuf/commit/6236878eb1f280e2d5bce79678c7b92cf7ac2b9e#commitcomment-21921164
1.0
A target does not need to have exactly `length`, just use it as an upper bound - If a target is listed as having 100KB and only 90KB are download, check the hash and continue. Relevant converation: https://github.com/theupdateframework/tuf/commit/6236878eb1f280e2d5bce79678c7b92cf7ac2b9e#commitcomment-21921164
non_defect
a target does not need to have exactly length just use it as an upper bound if a target is listed as having and only are download check the hash and continue relevant converation
0
51,843
13,211,322,648
IssuesEvent
2020-08-15 22:18:18
icecube-trac/tix4
https://api.github.com/repos/icecube-trac/tix4
opened
CoincSuite split_recombine.py example doesn't run (Trac #1162)
Incomplete Migration Migrated from Trac combo reconstruction defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1162">https://code.icecube.wisc.edu/projects/icecube/ticket/1162</a>, reported by jtatarand owned by mzoll</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:11:57", "_ts": "1550067117911749", "description": "Here is the error message:\n\n\n{{{\nINFO (Python): Using CoincSuite Recombinations (coincsuite.py:82 in Complete)\nTraceback (most recent call last):\n File \"split_recombine.py\", line 77, in <module>\n Split_Recombine( tray, \"Split_Recombine\", params)\n File \"split_recombine.py\", line 58, in Split_Recombine\n SplitPulses = \"MaskedOfflinePulses\")\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/I3Tray.py\", line 204, in AddSegment\n return _segment(self, _name, **kwargs)\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/icecube/CoincSuite/coincsuite.py\", line 169, in Complete\n mininame = lilliput.add_minuit_simplex_minimizer_service(tray)\nNameError: global name 'lilliput' is not defined\n\n}}}\n\n* Default input file does not exist.\n* Please add a better summary of what script does at top of script.", "reporter": "jtatar", "cc": "", "resolution": "fixed", "time": "2015-08-18T18:26:13", "component": "combo reconstruction", "summary": "CoincSuite split_recombine.py example doesn't run", "priority": "blocker", "keywords": "", "milestone": "", "owner": "mzoll", "type": "defect" } ``` </p> </details>
1.0
CoincSuite split_recombine.py example doesn't run (Trac #1162) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1162">https://code.icecube.wisc.edu/projects/icecube/ticket/1162</a>, reported by jtatarand owned by mzoll</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:11:57", "_ts": "1550067117911749", "description": "Here is the error message:\n\n\n{{{\nINFO (Python): Using CoincSuite Recombinations (coincsuite.py:82 in Complete)\nTraceback (most recent call last):\n File \"split_recombine.py\", line 77, in <module>\n Split_Recombine( tray, \"Split_Recombine\", params)\n File \"split_recombine.py\", line 58, in Split_Recombine\n SplitPulses = \"MaskedOfflinePulses\")\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/I3Tray.py\", line 204, in AddSegment\n return _segment(self, _name, **kwargs)\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/icecube/CoincSuite/coincsuite.py\", line 169, in Complete\n mininame = lilliput.add_minuit_simplex_minimizer_service(tray)\nNameError: global name 'lilliput' is not defined\n\n}}}\n\n* Default input file does not exist.\n* Please add a better summary of what script does at top of script.", "reporter": "jtatar", "cc": "", "resolution": "fixed", "time": "2015-08-18T18:26:13", "component": "combo reconstruction", "summary": "CoincSuite split_recombine.py example doesn't run", "priority": "blocker", "keywords": "", "milestone": "", "owner": "mzoll", "type": "defect" } ``` </p> </details>
defect
coincsuite split recombine py example doesn t run trac migrated from json status closed changetime ts description here is the error message n n n ninfo python using coincsuite recombinations coincsuite py in complete ntraceback most recent call last n file split recombine py line in n split recombine tray split recombine params n file split recombine py line in split recombine n splitpulses maskedofflinepulses n file home jtatar striketeam icerec build lib py line in addsegment n return segment self name kwargs n file home jtatar striketeam icerec build lib icecube coincsuite coincsuite py line in complete n mininame lilliput add minuit simplex minimizer service tray nnameerror global name lilliput is not defined n n n n default input file does not exist n please add a better summary of what script does at top of script reporter jtatar cc resolution fixed time component combo reconstruction summary coincsuite split recombine py example doesn t run priority blocker keywords milestone owner mzoll type defect
1
232,162
17,774,601,236
IssuesEvent
2021-08-30 17:31:25
binbashar/le-ref-architecture-doc
https://api.github.com/repos/binbashar/le-ref-architecture-doc
closed
Feature | 1st Steps section min content
documentation enhancement 2021 Q2 feature patch
## What? - Create a 1st https://leverage.binbash.com.ar/first-steps/ version ## Why? - Simplify the first Leverage Reference Architecture adoption steps
1.0
Feature | 1st Steps section min content - ## What? - Create a 1st https://leverage.binbash.com.ar/first-steps/ version ## Why? - Simplify the first Leverage Reference Architecture adoption steps
non_defect
feature steps section min content what create a version why simplify the first leverage reference architecture adoption steps
0
5,679
2,610,193,264
IssuesEvent
2015-02-26 19:01:02
chrsmith/quchuseban
https://api.github.com/repos/chrsmith/quchuseban
opened
解谜脸上长色斑应吃什么好
auto-migrated Priority-Medium Type-Defect
``` 《摘要》 随缘,是一种胸怀,是一种成熟,是对自我内心的一种自信�� �把握。读懂随缘的人,总能在风云变幻、艰难坎坷的生活中� ��收放自如、游刃有余;总能在逆境中,找寻到前行的方向, 保持坦然愉快的心情。随缘,是对现实正确、清醒的认识,�� �对人生彻悟之后的精神自由,是“聚散离合本是缘”的达观� ��“得即高歌失即休”的超然,更是“一蓑烟雨任平生”的从 容。拥有一份随缘之心,你就会发现,天空中无论是阴云密�� �,还是阳光灿烂;生活的道路上无论是坎坷还是畅达,心中� ��是会拥有一份平静和恬淡。随缘,也让我认识了黛芙薇尔, 最后去掉脸上的斑。脸上长色斑应吃什么好, 《客户案例》   有一句话怎么说来着,“十个女人九个斑”,可是走在�� �街上我也没有发现多少个女孩有斑啊,偏偏就我不幸,从小� ��在脸上长了挺多色斑,到现在我还是弄不明白为什么自己会 那么倒霉。爱美之心人皆有之,我想方设法寻找祛斑方法。�� �论是走到街上,还是看报纸,看电视,或是上网,我都会关� ��祛斑的方法。美容祛斑、激光祛斑、美白祛斑产品,我都看 到过无数广告了,也做过几次美容,还有一次激光,皮肤白�� �一点,我还高兴了好长一段时间呢。但是角质层变薄了,被� ��阳晒过没多久,色斑又出现了,而且比以前更多,色素沉着 更严重。我就再也不敢做美容或激光了,就连在美容院买的�� �膜我也很少用。<br>   去年冬天,我想在网上找一些祛斑的小偏方,在百度上�� �索祛斑产品的时候,简直让人眼花缭乱啊,太多了,一时间� ��也不知道应该选择哪个产品。我就在论坛上发了篇文章,关 于自己祛斑的艰难过程,也希望有祛斑经验的网友能给我一�� �指引。没几天,就有好多回复信息,各种各样的说法都有,� ��的很感谢这些朋友们。其中有两条信息引起我的注意,都是 说「黛芙薇尔精华液」的,她们都用过,而且效果非常好。�� �前我也在百度上搜到我,但是没太在意。网友的建议让我有� ��重新了解「黛芙薇尔精华液」的念头。<br>   我上了「黛芙薇尔精华液」主网站,了解到这公司各方�� �实力确实可以,而且这个产品卖了很多年效果反映普遍不错� ��而且还得知「黛芙薇尔精华液」是纯植物萃取精华,而且使 用纳米技术。经过专家的介绍和分析,让我对「黛芙薇尔精�� �液」有了很透彻的了解,我就决定使用了。<br>   一个月过去后,不瞒你说,我的皮肤真的变白了,色斑�� �化了,到公司的时候,同事们都说我是不是什么美容养颜的� ��西了呢。嘿嘿,东西确实是了,但是我要等彻底把色斑去除 后才告诉她们我的什么。所以,暂时保密了三个月。她们看�� �我皮肤一天天好转,惊讶极了呢。我呢,在心底偷着乐呢。� ��用完第三个周期之后,我脸上的斑点就差不多没有了,而且 皮肤好滋润。这会我才告诉她们,我使用了「黛芙薇尔精华�� �」。嘻嘻,真是一个让人激动又快乐的治疗过程,而且使用� ��间没有任何不舒服的感觉,还很方便,最重要的是效果出乎 意料的好。我现在再也不用去想那些头疼的事情了,关于祛�� �的消息我可以一笑而过喽。因为,我已经不再是一个“斑点� ��孩”,而是一个皮肤超好的白领喽! 阅读了脸上长色斑应吃什么好,再看脸上容易长斑的原因: 《色斑形成原因》   内部因素   一、压力   当人受到压力时,就会分泌肾上腺素,为对付压力而做�� �备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏� ��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃 。   二、荷尔蒙分泌失调   避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞�� �分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在� ��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕 中因女性荷尔蒙雌激素的增加,从怀孕4—5个月开始会容易出 现斑,这时候出现的斑点在产后大部分会消失。可是,新陈�� �谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等� ��因,都会使斑加深。有时新长出的斑,产后也不会消失,所 以需要更加注意。   三、新陈代谢缓慢   肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑�� �因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态� ��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是 内分泌失调导致过敏体质而形成的。另外,身体状态不正常�� �时候,紫外线的照射也会加速斑的形成。   四、错误的使用化妆品   使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在�� �疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵� ��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的 问题。   外部因素   一、紫外线   照射紫外线的时候,人体为了保护皮肤,会在基底层产�� �很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更� ��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化, 还会引起黑斑、雀斑等色素沉着的皮肤疾患。   二、不良的清洁习惯   因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。�� �皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦� ��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的 问题。   三、遗传基因   父母中有长斑的,则本人长斑的概率就很高,这种情况�� �一定程度上就可判定是遗传基因的作用。所以家里特别是长� ��有长斑的人,要注意避免引发长斑的重要因素之一——紫外 线照射,这是预防斑必须注意的。 《有疑问帮你解决》   1,黛芙薇尔精华液真的有效果吗?真的可以把脸上的黄褐�� �去掉吗?   答:黛芙薇尔精华液DNA精华能够有效的修复周围难以触�� �的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必� ��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑 ,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时�� �,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的� ��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显 而易见。自产品上市以来,老顾客纷纷介绍新顾客,71%的新�� �客都是通过老顾客介绍而来,口碑由此而来!   2,服用黛芙薇尔美白,会伤身体吗?有副作用吗?   答:黛芙薇尔精华液应用了精纯复合配方和领先的分类�� �斑科技,并将“DNA美肤系统”疗法应用到了该产品中,能彻� ��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有 效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾�� �地的专家通力协作,超过10年的研究以全新的DNA肌肤修复技�� �,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽� ��迹,令每一位爱美的女性都能享受到科技创新所带来的自然 之美。 专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数�� �百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖!   3,去除黄褐斑之后,会反弹吗?   答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔�� �白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家� ��据斑的形成原因精心研制而成用事实说话,让消费者打分。 树立权威品牌!我们的很多新客户都是老客户介绍而来,请问� ��如果效果不好,会有客户转介绍吗?   4,你们的价格有点贵,能不能便宜一点?   答:如果您使用西药最少需要2000元,煎服的药最少需要3 000元,做手术最少是5000元,而这些毫无疑问,不会对彻底去� ��你的斑点有任何帮助!一分价钱,一份价值,我们现在做的�� �是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的� ��褐斑彻底去除,你还会觉得贵吗?你还会再去花那么多冤枉�� �,不但斑没去掉,还把自己的皮肤弄的越来越糟吗   5,我适合用黛芙薇尔精华液吗?   答:黛芙薇尔适用人群:   1、生理紊乱引起的黄褐斑人群   2、生育引起的妊娠斑人群   3、年纪增长引起的老年斑人群   4、化妆品色素沉积、辐射斑人群   5、长期日照引起的日晒斑人群   6、肌肤暗淡急需美白的人群 《祛斑小方法》 脸上长色斑应吃什么好,同时为您分享祛斑小方法 去斑方法,牛奶柠檬汁:每周2次,在晚上用牛奶加柠檬汁混合 液搽脸,可以增白皮肤,减淡斑点。 ``` ----- Original issue reported on code.google.com by `additive...@gmail.com` on 1 Jul 2014 at 5:40
1.0
解谜脸上长色斑应吃什么好 - ``` 《摘要》 随缘,是一种胸怀,是一种成熟,是对自我内心的一种自信�� �把握。读懂随缘的人,总能在风云变幻、艰难坎坷的生活中� ��收放自如、游刃有余;总能在逆境中,找寻到前行的方向, 保持坦然愉快的心情。随缘,是对现实正确、清醒的认识,�� �对人生彻悟之后的精神自由,是“聚散离合本是缘”的达观� ��“得即高歌失即休”的超然,更是“一蓑烟雨任平生”的从 容。拥有一份随缘之心,你就会发现,天空中无论是阴云密�� �,还是阳光灿烂;生活的道路上无论是坎坷还是畅达,心中� ��是会拥有一份平静和恬淡。随缘,也让我认识了黛芙薇尔, 最后去掉脸上的斑。脸上长色斑应吃什么好, 《客户案例》   有一句话怎么说来着,“十个女人九个斑”,可是走在�� �街上我也没有发现多少个女孩有斑啊,偏偏就我不幸,从小� ��在脸上长了挺多色斑,到现在我还是弄不明白为什么自己会 那么倒霉。爱美之心人皆有之,我想方设法寻找祛斑方法。�� �论是走到街上,还是看报纸,看电视,或是上网,我都会关� ��祛斑的方法。美容祛斑、激光祛斑、美白祛斑产品,我都看 到过无数广告了,也做过几次美容,还有一次激光,皮肤白�� �一点,我还高兴了好长一段时间呢。但是角质层变薄了,被� ��阳晒过没多久,色斑又出现了,而且比以前更多,色素沉着 更严重。我就再也不敢做美容或激光了,就连在美容院买的�� �膜我也很少用。<br>   去年冬天,我想在网上找一些祛斑的小偏方,在百度上�� �索祛斑产品的时候,简直让人眼花缭乱啊,太多了,一时间� ��也不知道应该选择哪个产品。我就在论坛上发了篇文章,关 于自己祛斑的艰难过程,也希望有祛斑经验的网友能给我一�� �指引。没几天,就有好多回复信息,各种各样的说法都有,� ��的很感谢这些朋友们。其中有两条信息引起我的注意,都是 说「黛芙薇尔精华液」的,她们都用过,而且效果非常好。�� �前我也在百度上搜到我,但是没太在意。网友的建议让我有� ��重新了解「黛芙薇尔精华液」的念头。<br>   我上了「黛芙薇尔精华液」主网站,了解到这公司各方�� �实力确实可以,而且这个产品卖了很多年效果反映普遍不错� ��而且还得知「黛芙薇尔精华液」是纯植物萃取精华,而且使 用纳米技术。经过专家的介绍和分析,让我对「黛芙薇尔精�� �液」有了很透彻的了解,我就决定使用了。<br>   一个月过去后,不瞒你说,我的皮肤真的变白了,色斑�� �化了,到公司的时候,同事们都说我是不是什么美容养颜的� ��西了呢。嘿嘿,东西确实是了,但是我要等彻底把色斑去除 后才告诉她们我的什么。所以,暂时保密了三个月。她们看�� �我皮肤一天天好转,惊讶极了呢。我呢,在心底偷着乐呢。� ��用完第三个周期之后,我脸上的斑点就差不多没有了,而且 皮肤好滋润。这会我才告诉她们,我使用了「黛芙薇尔精华�� �」。嘻嘻,真是一个让人激动又快乐的治疗过程,而且使用� ��间没有任何不舒服的感觉,还很方便,最重要的是效果出乎 意料的好。我现在再也不用去想那些头疼的事情了,关于祛�� �的消息我可以一笑而过喽。因为,我已经不再是一个“斑点� ��孩”,而是一个皮肤超好的白领喽! 阅读了脸上长色斑应吃什么好,再看脸上容易长斑的原因: 《色斑形成原因》   内部因素   一、压力   当人受到压力时,就会分泌肾上腺素,为对付压力而做�� �备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏� ��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃 。   二、荷尔蒙分泌失调   避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞�� �分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在� ��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕 中因女性荷尔蒙雌激素的增加,从怀孕4—5个月开始会容易出 现斑,这时候出现的斑点在产后大部分会消失。可是,新陈�� �谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等� ��因,都会使斑加深。有时新长出的斑,产后也不会消失,所 以需要更加注意。   三、新陈代谢缓慢   肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑�� �因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态� ��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是 内分泌失调导致过敏体质而形成的。另外,身体状态不正常�� �时候,紫外线的照射也会加速斑的形成。   四、错误的使用化妆品   使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在�� �疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵� ��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的 问题。   外部因素   一、紫外线   照射紫外线的时候,人体为了保护皮肤,会在基底层产�� �很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更� ��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化, 还会引起黑斑、雀斑等色素沉着的皮肤疾患。   二、不良的清洁习惯   因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。�� �皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦� ��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的 问题。   三、遗传基因   父母中有长斑的,则本人长斑的概率就很高,这种情况�� �一定程度上就可判定是遗传基因的作用。所以家里特别是长� ��有长斑的人,要注意避免引发长斑的重要因素之一——紫外 线照射,这是预防斑必须注意的。 《有疑问帮你解决》   1,黛芙薇尔精华液真的有效果吗?真的可以把脸上的黄褐�� �去掉吗?   答:黛芙薇尔精华液DNA精华能够有效的修复周围难以触�� �的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必� ��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑 ,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时�� �,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的� ��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显 而易见。自产品上市以来,老顾客纷纷介绍新顾客,71%的新�� �客都是通过老顾客介绍而来,口碑由此而来!   2,服用黛芙薇尔美白,会伤身体吗?有副作用吗?   答:黛芙薇尔精华液应用了精纯复合配方和领先的分类�� �斑科技,并将“DNA美肤系统”疗法应用到了该产品中,能彻� ��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有 效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾�� �地的专家通力协作,超过10年的研究以全新的DNA肌肤修复技�� �,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽� ��迹,令每一位爱美的女性都能享受到科技创新所带来的自然 之美。 专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数�� �百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖!   3,去除黄褐斑之后,会反弹吗?   答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔�� �白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家� ��据斑的形成原因精心研制而成用事实说话,让消费者打分。 树立权威品牌!我们的很多新客户都是老客户介绍而来,请问� ��如果效果不好,会有客户转介绍吗?   4,你们的价格有点贵,能不能便宜一点?   答:如果您使用西药最少需要2000元,煎服的药最少需要3 000元,做手术最少是5000元,而这些毫无疑问,不会对彻底去� ��你的斑点有任何帮助!一分价钱,一份价值,我们现在做的�� �是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的� ��褐斑彻底去除,你还会觉得贵吗?你还会再去花那么多冤枉�� �,不但斑没去掉,还把自己的皮肤弄的越来越糟吗   5,我适合用黛芙薇尔精华液吗?   答:黛芙薇尔适用人群:   1、生理紊乱引起的黄褐斑人群   2、生育引起的妊娠斑人群   3、年纪增长引起的老年斑人群   4、化妆品色素沉积、辐射斑人群   5、长期日照引起的日晒斑人群   6、肌肤暗淡急需美白的人群 《祛斑小方法》 脸上长色斑应吃什么好,同时为您分享祛斑小方法 去斑方法,牛奶柠檬汁:每周2次,在晚上用牛奶加柠檬汁混合 液搽脸,可以增白皮肤,减淡斑点。 ``` ----- Original issue reported on code.google.com by `additive...@gmail.com` on 1 Jul 2014 at 5:40
defect
解谜脸上长色斑应吃什么好 《摘要》 随缘,是一种胸怀,是一种成熟,是对自我内心的一种自信�� �把握。读懂随缘的人,总能在风云变幻、艰难坎坷的生活中� ��收放自如、游刃有余;总能在逆境中,找寻到前行的方向, 保持坦然愉快的心情。随缘,是对现实正确、清醒的认识,�� �对人生彻悟之后的精神自由,是“聚散离合本是缘”的达观� ��“得即高歌失即休”的超然,更是“一蓑烟雨任平生”的从 容。拥有一份随缘之心,你就会发现,天空中无论是阴云密�� �,还是阳光灿烂;生活的道路上无论是坎坷还是畅达,心中� ��是会拥有一份平静和恬淡。随缘,也让我认识了黛芙薇尔, 最后去掉脸上的斑。脸上长色斑应吃什么好, 《客户案例》   有一句话怎么说来着,“十个女人九个斑”,可是走在�� �街上我也没有发现多少个女孩有斑啊,偏偏就我不幸,从小� ��在脸上长了挺多色斑,到现在我还是弄不明白为什么自己会 那么倒霉。爱美之心人皆有之,我想方设法寻找祛斑方法。�� �论是走到街上,还是看报纸,看电视,或是上网,我都会关� ��祛斑的方法。美容祛斑、激光祛斑、美白祛斑产品,我都看 到过无数广告了,也做过几次美容,还有一次激光,皮肤白�� �一点,我还高兴了好长一段时间呢。但是角质层变薄了,被� ��阳晒过没多久,色斑又出现了,而且比以前更多,色素沉着 更严重。我就再也不敢做美容或激光了,就连在美容院买的�� �膜我也很少用。   去年冬天,我想在网上找一些祛斑的小偏方,在百度上�� �索祛斑产品的时候,简直让人眼花缭乱啊,太多了,一时间� ��也不知道应该选择哪个产品。我就在论坛上发了篇文章,关 于自己祛斑的艰难过程,也希望有祛斑经验的网友能给我一�� �指引。没几天,就有好多回复信息,各种各样的说法都有,� ��的很感谢这些朋友们。其中有两条信息引起我的注意,都是 说「黛芙薇尔精华液」的,她们都用过,而且效果非常好。�� �前我也在百度上搜到我,但是没太在意。网友的建议让我有� ��重新了解「黛芙薇尔精华液」的念头。   我上了「黛芙薇尔精华液」主网站,了解到这公司各方�� �实力确实可以,而且这个产品卖了很多年效果反映普遍不错� ��而且还得知「黛芙薇尔精华液」是纯植物萃取精华,而且使 用纳米技术。经过专家的介绍和分析,让我对「黛芙薇尔精�� �液」有了很透彻的了解,我就决定使用了。   一个月过去后,不瞒你说,我的皮肤真的变白了,色斑�� �化了,到公司的时候,同事们都说我是不是什么美容养颜的� ��西了呢。嘿嘿,东西确实是了,但是我要等彻底把色斑去除 后才告诉她们我的什么。所以,暂时保密了三个月。她们看�� �我皮肤一天天好转,惊讶极了呢。我呢,在心底偷着乐呢。� ��用完第三个周期之后,我脸上的斑点就差不多没有了,而且 皮肤好滋润。这会我才告诉她们,我使用了「黛芙薇尔精华�� �」。嘻嘻,真是一个让人激动又快乐的治疗过程,而且使用� ��间没有任何不舒服的感觉,还很方便,最重要的是效果出乎 意料的好。我现在再也不用去想那些头疼的事情了,关于祛�� �的消息我可以一笑而过喽。因为,我已经不再是一个“斑点� ��孩”,而是一个皮肤超好的白领喽 阅读了脸上长色斑应吃什么好,再看脸上容易长斑的原因: 《色斑形成原因》   内部因素   一、压力   当人受到压力时,就会分泌肾上腺素,为对付压力而做�� �备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏� ��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃 。   二、荷尔蒙分泌失调   避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞�� �分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在� ��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕 中因女性荷尔蒙雌激素的增加, — 现斑,这时候出现的斑点在产后大部分会消失。可是,新陈�� �谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等� ��因,都会使斑加深。有时新长出的斑,产后也不会消失,所 以需要更加注意。   三、新陈代谢缓慢   肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑�� �因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态� ��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是 内分泌失调导致过敏体质而形成的。另外,身体状态不正常�� �时候,紫外线的照射也会加速斑的形成。   四、错误的使用化妆品   使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在�� �疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵� ��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的 问题。   外部因素   一、紫外线   照射紫外线的时候,人体为了保护皮肤,会在基底层产�� �很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更� ��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化, 还会引起黑斑、雀斑等色素沉着的皮肤疾患。   二、不良的清洁习惯   因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。�� �皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦� ��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的 问题。   三、遗传基因   父母中有长斑的,则本人长斑的概率就很高,这种情况�� �一定程度上就可判定是遗传基因的作用。所以家里特别是长� ��有长斑的人,要注意避免引发长斑的重要因素之一——紫外 线照射,这是预防斑必须注意的。 《有疑问帮你解决》    黛芙薇尔精华液真的有效果吗 真的可以把脸上的黄褐�� �去掉吗   答:黛芙薇尔精华液dna精华能够有效的修复周围难以触�� �的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必� ��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑 ,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时�� �,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的� ��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显 而易见。自产品上市以来,老顾客纷纷介绍新顾客, 的新�� �客都是通过老顾客介绍而来,口碑由此而来    ,服用黛芙薇尔美白,会伤身体吗 有副作用吗   答:黛芙薇尔精华液应用了精纯复合配方和领先的分类�� �斑科技,并将“dna美肤系统”疗法应用到了该产品中,能彻� ��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有 效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾�� �地的专家通力协作, �� �,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽� ��迹,令每一位爱美的女性都能享受到科技创新所带来的自然 之美。 专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数�� �百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖    ,去除黄褐斑之后,会反弹吗   答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔�� �白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家� ��据斑的形成原因精心研制而成用事实说话,让消费者打分。 树立权威品牌 我们的很多新客户都是老客户介绍而来,请问� ��如果效果不好,会有客户转介绍吗    ,你们的价格有点贵,能不能便宜一点   答: , , ,而这些毫无疑问,不会对彻底去� ��你的斑点有任何帮助 一分价钱,一份价值,我们现在做的�� �是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的� ��褐斑彻底去除,你还会觉得贵吗 你还会再去花那么多冤枉�� �,不但斑没去掉,还把自己的皮肤弄的越来越糟吗    ,我适合用黛芙薇尔精华液吗   答:黛芙薇尔适用人群:    、生理紊乱引起的黄褐斑人群    、生育引起的妊娠斑人群    、年纪增长引起的老年斑人群    、化妆品色素沉积、辐射斑人群    、长期日照引起的日晒斑人群    、肌肤暗淡急需美白的人群 《祛斑小方法》 脸上长色斑应吃什么好,同时为您分享祛斑小方法 去斑方法 牛奶柠檬汁: ,在晚上用牛奶加柠檬汁混合 液搽脸,可以增白皮肤,减淡斑点。 original issue reported on code google com by additive gmail com on jul at
1
57,955
11,810,582,677
IssuesEvent
2020-03-19 16:43:13
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
opened
Remove redundant configuration from [DllImport] declaration
api-suggestion area-System.Runtime.InteropServices code-analyzer
Flag places where an attribute is set on a DllImport that's already the default value. **Category**: Style
1.0
Remove redundant configuration from [DllImport] declaration - Flag places where an attribute is set on a DllImport that's already the default value. **Category**: Style
non_defect
remove redundant configuration from declaration flag places where an attribute is set on a dllimport that s already the default value category style
0
251,987
18,983,949,530
IssuesEvent
2021-11-21 11:47:19
codeidea-korea/greenpass
https://api.github.com/repos/codeidea-korea/greenpass
reopened
NDEF(NFC)데이터 타입이 정확히 무엇인지 확인 안됨
documentation help wanted
![KakaoTalk_20211121_182705100](https://user-images.githubusercontent.com/17987845/142760432-32c84764-2a48-44df-b179-ac16b6185b68.jpg) - [ ] byte 데이터인가? - [ ] unsigned char 형인가? - [ ] 인코딩은 어떻게 되는가? - [ ] 혹시 구조체인데 seperator 가 있는가?
1.0
NDEF(NFC)데이터 타입이 정확히 무엇인지 확인 안됨 - ![KakaoTalk_20211121_182705100](https://user-images.githubusercontent.com/17987845/142760432-32c84764-2a48-44df-b179-ac16b6185b68.jpg) - [ ] byte 데이터인가? - [ ] unsigned char 형인가? - [ ] 인코딩은 어떻게 되는가? - [ ] 혹시 구조체인데 seperator 가 있는가?
non_defect
ndef nfc 데이터 타입이 정확히 무엇인지 확인 안됨 byte 데이터인가 unsigned char 형인가 인코딩은 어떻게 되는가 혹시 구조체인데 seperator 가 있는가
0
426,471
29,520,645,465
IssuesEvent
2023-06-05 01:15:17
Unirep/Unirep
https://api.github.com/repos/Unirep/Unirep
closed
docs: add "Getting Started" section
documentation
The "Getting Started" section aims to provide a brief introduction to Unirep and to guide developers through the process of using `create-unirep-app`. This section should include a step-by-step tutorial, which covers the installation of `create-unirep-app` and important functionality of the repository. Additionally, the tutorial should explain how Unirep works to help developers understand the system better. Examples of attesters can be a separate section with comments on how the example attester is using the unirep functionality.
1.0
docs: add "Getting Started" section - The "Getting Started" section aims to provide a brief introduction to Unirep and to guide developers through the process of using `create-unirep-app`. This section should include a step-by-step tutorial, which covers the installation of `create-unirep-app` and important functionality of the repository. Additionally, the tutorial should explain how Unirep works to help developers understand the system better. Examples of attesters can be a separate section with comments on how the example attester is using the unirep functionality.
non_defect
docs add getting started section the getting started section aims to provide a brief introduction to unirep and to guide developers through the process of using create unirep app this section should include a step by step tutorial which covers the installation of create unirep app and important functionality of the repository additionally the tutorial should explain how unirep works to help developers understand the system better examples of attesters can be a separate section with comments on how the example attester is using the unirep functionality
0
447,162
31,625,959,056
IssuesEvent
2023-09-06 05:23:27
arkedge/c2a-core
https://api.github.com/repos/arkedge/c2a-core
opened
CHANGELOG ほしい
documentation enhancement priority::high
## 詳細 - 現状,c2a-core に対する変更はリリースの間の Pull Request を眺めてユーザ側で何が起きたかを察する運用になっている - 実際の C2A user への変更は `examples/mobc` の diff を見て察している - これは「対応する項目の一覧」としては機能するが,大変すぎる - 単に diff だけを見ていて何のために何をどう入れたのかという議論まで追われず,曖昧な対応がされていることもしばしばある - c2a-core 内部の変更(これが現状では少ないというのもあるが)と C2A user まで含んだ変更を区別し,ユーザに対して明示したい ## close条件 CHANGELOG を用意したら
1.0
CHANGELOG ほしい - ## 詳細 - 現状,c2a-core に対する変更はリリースの間の Pull Request を眺めてユーザ側で何が起きたかを察する運用になっている - 実際の C2A user への変更は `examples/mobc` の diff を見て察している - これは「対応する項目の一覧」としては機能するが,大変すぎる - 単に diff だけを見ていて何のために何をどう入れたのかという議論まで追われず,曖昧な対応がされていることもしばしばある - c2a-core 内部の変更(これが現状では少ないというのもあるが)と C2A user まで含んだ変更を区別し,ユーザに対して明示したい ## close条件 CHANGELOG を用意したら
non_defect
changelog ほしい 詳細 現状, core に対する変更はリリースの間の pull request を眺めてユーザ側で何が起きたかを察する運用になっている 実際の user への変更は examples mobc の diff を見て察している これは「対応する項目の一覧」としては機能するが,大変すぎる 単に diff だけを見ていて何のために何をどう入れたのかという議論まで追われず,曖昧な対応がされていることもしばしばある core 内部の変更(これが現状では少ないというのもあるが)と user まで含んだ変更を区別し,ユーザに対して明示したい close条件 changelog を用意したら
0
26,611
13,066,985,577
IssuesEvent
2020-07-30 23:04:15
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Canvas loads slowly which takes more than 3 minutes
Team:Canvas bug loe:needs-research performance
**Kibana version:** 6.5.0 **Elasticsearch version:** 6.5.0 **Server OS Version** MacOS **Browser version:** Chrome **Browser OS version:** Version 70.0.3538.102 (Official Build) (64-bit) **Original install method (e.g. download page, yum, from source, etc.):** yum **Describe the bug:** Canvas Loads slowly. It takes more than 3 minutes to load the page. **Steps to reproduce:** 1.Load sample data like flights or web logging data 2.Open Canvas **Expected behavior:** It works well when I install canvas through plugin. The loading speed is about 5 seconds. **Screenshots (if relevant):** ![canvas_pro2](https://user-images.githubusercontent.com/1431033/48680801-50278d00-ebd9-11e8-909e-465b1d945498.jpg) **Errors in browser console (if relevant):** No error
True
Canvas loads slowly which takes more than 3 minutes - **Kibana version:** 6.5.0 **Elasticsearch version:** 6.5.0 **Server OS Version** MacOS **Browser version:** Chrome **Browser OS version:** Version 70.0.3538.102 (Official Build) (64-bit) **Original install method (e.g. download page, yum, from source, etc.):** yum **Describe the bug:** Canvas Loads slowly. It takes more than 3 minutes to load the page. **Steps to reproduce:** 1.Load sample data like flights or web logging data 2.Open Canvas **Expected behavior:** It works well when I install canvas through plugin. The loading speed is about 5 seconds. **Screenshots (if relevant):** ![canvas_pro2](https://user-images.githubusercontent.com/1431033/48680801-50278d00-ebd9-11e8-909e-465b1d945498.jpg) **Errors in browser console (if relevant):** No error
non_defect
canvas loads slowly which takes more than minutes kibana version elasticsearch version server os version macos browser version chrome browser os version version official build bit original install method e g download page yum from source etc yum describe the bug canvas loads slowly it takes more than minutes to load the page steps to reproduce load sample data like flights or web logging data open canvas expected behavior it works well when i install canvas through plugin the loading speed is about seconds screenshots if relevant errors in browser console if relevant no error
0
368,002
25,772,857,817
IssuesEvent
2022-12-09 09:29:12
dagger/dagger
https://api.github.com/repos/dagger/dagger
closed
NodeJS reference name for the default client should be more explicit
area/documentation sdk/nodejs
### What is the issue? Right now, the name of it is: `default`. When I was doing some demo, I couldn't find the API for the client, because it was not shown, except under `default` which was not clear at all. maybe `defaultClient` at least? cc @slumbering
1.0
NodeJS reference name for the default client should be more explicit - ### What is the issue? Right now, the name of it is: `default`. When I was doing some demo, I couldn't find the API for the client, because it was not shown, except under `default` which was not clear at all. maybe `defaultClient` at least? cc @slumbering
non_defect
nodejs reference name for the default client should be more explicit what is the issue right now the name of it is default when i was doing some demo i couldn t find the api for the client because it was not shown except under default which was not clear at all maybe defaultclient at least cc slumbering
0
4,747
2,610,154,482
IssuesEvent
2015-02-26 18:49:08
chrsmith/republic-at-war
https://api.github.com/repos/chrsmith/republic-at-war
closed
Map Issue
auto-migrated Priority-Medium Type-Defect
``` Touchup Minntooine passability for radar Green???? ``` ----- Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 2:13
1.0
Map Issue - ``` Touchup Minntooine passability for radar Green???? ``` ----- Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 2:13
defect
map issue touchup minntooine passability for radar green original issue reported on code google com by gmail com on jan at
1
26,556
4,757,884,667
IssuesEvent
2016-10-24 17:53:52
cakephp/cakephp
https://api.github.com/repos/cakephp/cakephp
closed
longitude disallows 0
Defect
This is a (multiple allowed): * [x] bug * [ ] enhancement * [ ] feature-discussion (RFC) * CakePHP Version: 3.3 * Platform and Target: nor relevant ### What you did Look here: http://api.cakephp.org/3.3/source-class-Cake.Validation.Validation.html#49 https://regexper.com/#%5B-%2B%5D%3F(180(%5C.0%2B)%3F%7C((1%5B0-7%5D%5Cd)%7C(%5B1-9%5D%3F%5Cd))(%5C.%5Cd%2B)%3F) ### What happened I was reading up the API. ### Expected Behavior The greenwhich meridian is a valid location on this planet. Suggestion: /^(\+|-)?(?:180(?:(?:\.0{1,10})?)|(?:[0-9]|[1-9][0-9]|1[0-7][0-9])(?:(?:\.[0-9]{1,10})?))$/ https://regexper.com/#%2F%5E(%5C%2B%7C-)%3F(%3F%3A180(%3F%3A(%3F%3A%5C.0%7B1%2C10%7D)%3F)%7C(%3F%3A%5B0-9%5D%7C%5B1-9%5D%5B0-9%5D%7C1%5B0-7%5D%5B0-9%5D)(%3F%3A(%3F%3A%5C.%5B0-9%5D%7B1%2C10%7D)%3F))%24%2F
1.0
longitude disallows 0 - This is a (multiple allowed): * [x] bug * [ ] enhancement * [ ] feature-discussion (RFC) * CakePHP Version: 3.3 * Platform and Target: nor relevant ### What you did Look here: http://api.cakephp.org/3.3/source-class-Cake.Validation.Validation.html#49 https://regexper.com/#%5B-%2B%5D%3F(180(%5C.0%2B)%3F%7C((1%5B0-7%5D%5Cd)%7C(%5B1-9%5D%3F%5Cd))(%5C.%5Cd%2B)%3F) ### What happened I was reading up the API. ### Expected Behavior The greenwhich meridian is a valid location on this planet. Suggestion: /^(\+|-)?(?:180(?:(?:\.0{1,10})?)|(?:[0-9]|[1-9][0-9]|1[0-7][0-9])(?:(?:\.[0-9]{1,10})?))$/ https://regexper.com/#%2F%5E(%5C%2B%7C-)%3F(%3F%3A180(%3F%3A(%3F%3A%5C.0%7B1%2C10%7D)%3F)%7C(%3F%3A%5B0-9%5D%7C%5B1-9%5D%5B0-9%5D%7C1%5B0-7%5D%5B0-9%5D)(%3F%3A(%3F%3A%5C.%5B0-9%5D%7B1%2C10%7D)%3F))%24%2F
defect
longitude disallows this is a multiple allowed bug enhancement feature discussion rfc cakephp version platform and target nor relevant what you did look here what happened i was reading up the api expected behavior the greenwhich meridian is a valid location on this planet suggestion
1
104,783
9,009,426,483
IssuesEvent
2019-02-05 09:01:56
owncloud/client
https://api.github.com/repos/owncloud/client
closed
[Windows] [OwnCloud 2.5.1 build 10807] prevents explorer "recently used" items from opening
ReadyToTest bug
### Expected behaviour Right clicking on the explorer icon in the taskbar shows recently used folders. Clicking on them should open window with item as a location. ### Actual behaviour Nothing happened. ### Steps to reproduce 1. Open some folder synchronized using OwnCloud to appear in the recently used folders explorer taskbar menu. 2. Click at the folder name in the recently used menu - nothing happens. 3. Using [instruction](https://superuser.com/a/976515/223696) and **Shell Extensions Manager by NirSoft** locate _OCContextMenuHandler Class_ for _Context Menu_ type, that points to ```C:\Program Files (x86)\ownCloud\shellext_x64\OCContextMenu.dll``` file. 4. Disable shell extension. 5. Restart explorer (kill and run again) 6. Click at the folder name in recently used menu - window with right location opens. 7. Enabling shell extension again makes the menu useless - `goto 2.` ### Client configuration Client version: `2.5.1 build 10807` Operating system: `Windows 10 Pro 64-bit` OS language: `Polish` Installation path of client: `C:\Program Files (x86)\ownCloud\` ### Logs ```01-17 12:33:05:953 [ info gui.socketapi ]: New connection QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path0>" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path1>" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path2>" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Received SocketAPI message <-- "GET_STRINGS:CONTEXT_MENU_TITLE" from QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_STRINGS:BEGIN" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "STRING:CONTEXT_MENU_TITLE:ownCloud" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_STRINGS:END" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Received SocketAPI message <-- "GET_MENU_ITEMS:<owncloud-dir-path>\\<frequent-dir-path>" from QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_MENU_ITEMS:BEGIN" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ debug sync.database.sql ] [ OCC::SqlQuery::bindValue ]: SQL bind 1 QVariant(qlonglong, -4254138760872654692) 01-17 12:33:05:969 [ debug sync.database.sql ] [ OCC::SqlQuery::exec ]: SQL exec "SELECT path, inode, modtime, type, md5, fileid, remotePerm, filesize, ignoredChildrenRemote, contentchecksumtype.name || ':' || contentChecksum FROM metadata LEFT JOIN checksumtype as contentchecksumtype ON metadata.contentChecksumTypeId == contentchecksumtype.id WHERE phash=?1" 01-17 12:33:05:969 [ debug sync.database.sql ] [ OCC::SqlQuery::bindValue ]: SQL bind 1 QVariant(qlonglong, -4254138760872654692) 01-17 12:33:05:969 [ debug sync.database.sql ] [ OCC::SqlQuery::exec ]: SQL exec "SELECT path, inode, modtime, type, md5, fileid, remotePerm, filesize, ignoredChildrenRemote, contentchecksumtype.name || ':' || contentChecksum FROM metadata LEFT JOIN checksumtype as contentchecksumtype ON metadata.contentChecksumTypeId == contentchecksumtype.id WHERE phash=?1" 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:SHARE::Udostępnij..." to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:COPY_PUBLIC_LINK::Kopiuj link publiczny do schowka" to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:COPY_PRIVATE_LINK::Kopiuj link prywatny do schowka" to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:OPEN_PRIVATE_LINK::Otwórz w przeglądarce" to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_MENU_ITEMS:END" to QLocalSocket(0x8eb8418) 01-17 12:33:06:025 [ info gui.socketapi ]: Lost connection QLocalSocket(0x8eb8418) 01-17 12:33:06:125 [ info gui.socketapi ]: New connection QLocalSocket(0x8eb84b8) 01-17 12:33:06:125 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path0>" to QLocalSocket(0x8eb84b8) 01-17 12:33:06:125 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path1>" to QLocalSocket(0x8eb84b8) 01-17 12:33:06:125 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path2>" to QLocalSocket(0x8eb84b8) 01-17 12:33:06:140 [ info gui.socketapi ]: Lost connection QLocalSocket(0x8eb84b8) 01-17 12:33:06:140 [ info gui.socketapi ]: Received SocketAPI message <-- "opennewwindow:<owncloud-dir-path2>\\<frequent-dir-path>" from QLocalSocket(0x8eb84b8) 01-17 12:33:06:140 [ warning gui.socketapi ]: The command is not supported by this version of the client: "opennewwindow" with argument: "<owncloud-dir-path2>\\<frequent-dir-path>" ```
1.0
[Windows] [OwnCloud 2.5.1 build 10807] prevents explorer "recently used" items from opening - ### Expected behaviour Right clicking on the explorer icon in the taskbar shows recently used folders. Clicking on them should open window with item as a location. ### Actual behaviour Nothing happened. ### Steps to reproduce 1. Open some folder synchronized using OwnCloud to appear in the recently used folders explorer taskbar menu. 2. Click at the folder name in the recently used menu - nothing happens. 3. Using [instruction](https://superuser.com/a/976515/223696) and **Shell Extensions Manager by NirSoft** locate _OCContextMenuHandler Class_ for _Context Menu_ type, that points to ```C:\Program Files (x86)\ownCloud\shellext_x64\OCContextMenu.dll``` file. 4. Disable shell extension. 5. Restart explorer (kill and run again) 6. Click at the folder name in recently used menu - window with right location opens. 7. Enabling shell extension again makes the menu useless - `goto 2.` ### Client configuration Client version: `2.5.1 build 10807` Operating system: `Windows 10 Pro 64-bit` OS language: `Polish` Installation path of client: `C:\Program Files (x86)\ownCloud\` ### Logs ```01-17 12:33:05:953 [ info gui.socketapi ]: New connection QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path0>" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path1>" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path2>" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Received SocketAPI message <-- "GET_STRINGS:CONTEXT_MENU_TITLE" from QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_STRINGS:BEGIN" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "STRING:CONTEXT_MENU_TITLE:ownCloud" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_STRINGS:END" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Received SocketAPI message <-- "GET_MENU_ITEMS:<owncloud-dir-path>\\<frequent-dir-path>" from QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_MENU_ITEMS:BEGIN" to QLocalSocket(0x8eb8418) 01-17 12:33:05:953 [ debug sync.database.sql ] [ OCC::SqlQuery::bindValue ]: SQL bind 1 QVariant(qlonglong, -4254138760872654692) 01-17 12:33:05:969 [ debug sync.database.sql ] [ OCC::SqlQuery::exec ]: SQL exec "SELECT path, inode, modtime, type, md5, fileid, remotePerm, filesize, ignoredChildrenRemote, contentchecksumtype.name || ':' || contentChecksum FROM metadata LEFT JOIN checksumtype as contentchecksumtype ON metadata.contentChecksumTypeId == contentchecksumtype.id WHERE phash=?1" 01-17 12:33:05:969 [ debug sync.database.sql ] [ OCC::SqlQuery::bindValue ]: SQL bind 1 QVariant(qlonglong, -4254138760872654692) 01-17 12:33:05:969 [ debug sync.database.sql ] [ OCC::SqlQuery::exec ]: SQL exec "SELECT path, inode, modtime, type, md5, fileid, remotePerm, filesize, ignoredChildrenRemote, contentchecksumtype.name || ':' || contentChecksum FROM metadata LEFT JOIN checksumtype as contentchecksumtype ON metadata.contentChecksumTypeId == contentchecksumtype.id WHERE phash=?1" 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:SHARE::Udostępnij..." to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:COPY_PUBLIC_LINK::Kopiuj link publiczny do schowka" to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:COPY_PRIVATE_LINK::Kopiuj link prywatny do schowka" to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "MENU_ITEM:OPEN_PRIVATE_LINK::Otwórz w przeglądarce" to QLocalSocket(0x8eb8418) 01-17 12:33:05:969 [ info gui.socketapi ]: Sending SocketAPI message --> "GET_MENU_ITEMS:END" to QLocalSocket(0x8eb8418) 01-17 12:33:06:025 [ info gui.socketapi ]: Lost connection QLocalSocket(0x8eb8418) 01-17 12:33:06:125 [ info gui.socketapi ]: New connection QLocalSocket(0x8eb84b8) 01-17 12:33:06:125 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path0>" to QLocalSocket(0x8eb84b8) 01-17 12:33:06:125 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path1>" to QLocalSocket(0x8eb84b8) 01-17 12:33:06:125 [ info gui.socketapi ]: Sending SocketAPI message --> "REGISTER_PATH:<owncloud-dir-path2>" to QLocalSocket(0x8eb84b8) 01-17 12:33:06:140 [ info gui.socketapi ]: Lost connection QLocalSocket(0x8eb84b8) 01-17 12:33:06:140 [ info gui.socketapi ]: Received SocketAPI message <-- "opennewwindow:<owncloud-dir-path2>\\<frequent-dir-path>" from QLocalSocket(0x8eb84b8) 01-17 12:33:06:140 [ warning gui.socketapi ]: The command is not supported by this version of the client: "opennewwindow" with argument: "<owncloud-dir-path2>\\<frequent-dir-path>" ```
non_defect
prevents explorer recently used items from opening expected behaviour right clicking on the explorer icon in the taskbar shows recently used folders clicking on them should open window with item as a location actual behaviour nothing happened steps to reproduce open some folder synchronized using owncloud to appear in the recently used folders explorer taskbar menu click at the folder name in the recently used menu nothing happens using and shell extensions manager by nirsoft locate occontextmenuhandler class for context menu type that points to c program files owncloud shellext occontextmenu dll file disable shell extension restart explorer kill and run again click at the folder name in recently used menu window with right location opens enabling shell extension again makes the menu useless goto client configuration client version build operating system windows pro bit os language polish installation path of client c program files owncloud logs new connection qlocalsocket sending socketapi message register path to qlocalsocket sending socketapi message register path to qlocalsocket sending socketapi message register path to qlocalsocket received socketapi message get strings context menu title from qlocalsocket sending socketapi message get strings begin to qlocalsocket sending socketapi message string context menu title owncloud to qlocalsocket sending socketapi message get strings end to qlocalsocket received socketapi message from qlocalsocket sending socketapi message get menu items begin to qlocalsocket sql bind qvariant qlonglong sql exec select path inode modtime type fileid remoteperm filesize ignoredchildrenremote contentchecksumtype name contentchecksum from metadata left join checksumtype as contentchecksumtype on metadata contentchecksumtypeid contentchecksumtype id where phash sql bind qvariant qlonglong sql exec select path inode modtime type fileid remoteperm filesize ignoredchildrenremote contentchecksumtype name contentchecksum from metadata left join checksumtype as contentchecksumtype on metadata contentchecksumtypeid contentchecksumtype id where phash sending socketapi message menu item share udostępnij to qlocalsocket sending socketapi message menu item copy public link kopiuj link publiczny do schowka to qlocalsocket sending socketapi message menu item copy private link kopiuj link prywatny do schowka to qlocalsocket sending socketapi message menu item open private link otwórz w przeglądarce to qlocalsocket sending socketapi message get menu items end to qlocalsocket lost connection qlocalsocket new connection qlocalsocket sending socketapi message register path to qlocalsocket sending socketapi message register path to qlocalsocket sending socketapi message register path to qlocalsocket lost connection qlocalsocket received socketapi message from qlocalsocket the command is not supported by this version of the client opennewwindow with argument
0
60,075
17,023,327,722
IssuesEvent
2021-07-03 01:27:13
tomhughes/trac-tickets
https://api.github.com/repos/tomhughes/trac-tickets
closed
Revert function not functioning
Component: potlatch (flash editor) Priority: major Resolution: fixed Type: defect
**[Submitted to the original trac issue database at 10.31am, Thursday, 27th November 2008]** While trying to undo changes made by other user I discovered that the revert function is not working properly. For example I reverted changes made to ways ID 9891826, 24673690, 24673678. But they didn't change. At the same time the author of the way change, and the revert function is proposing an older recovery point to revert to.
1.0
Revert function not functioning - **[Submitted to the original trac issue database at 10.31am, Thursday, 27th November 2008]** While trying to undo changes made by other user I discovered that the revert function is not working properly. For example I reverted changes made to ways ID 9891826, 24673690, 24673678. But they didn't change. At the same time the author of the way change, and the revert function is proposing an older recovery point to revert to.
defect
revert function not functioning while trying to undo changes made by other user i discovered that the revert function is not working properly for example i reverted changes made to ways id but they didn t change at the same time the author of the way change and the revert function is proposing an older recovery point to revert to
1
10,295
8,875,176,737
IssuesEvent
2019-01-12 01:02:10
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Why is there a difference between the Iteration Prediction and Quick Test Prediction
cognitive-services/svc cxp in-progress product-question triaged
I was looking through the Training images in my Iteration page. An image was supposed to be predicted as 'check' but it predicted it as 'striped'(the 'striped' probability was 83%) So, I copied the image url of that image, and ran it trough the 'Quick Test'. On the 'Quick Test', it correctly predicted the image as 'check' and the probability was 99%. Why is it that even if I use the same image, the probability differs depending on which part the image is trained on? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: e898e972-43eb-601b-fc5c-1b0199315473 * Version Independent ID: e8504165-4688-e91b-1ee3-3da10560e583 * Content: [Test and retrain a model - Custom Vision Service](https://docs.microsoft.com/en-us/azure/cognitive-services/custom-vision-service/test-your-model) * Content Source: [articles/cognitive-services/Custom-Vision-Service/test-your-model.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cognitive-services/Custom-Vision-Service/test-your-model.md) * Service: **cognitive-services** * GitHub Login: @anrothMSFT * Microsoft Alias: **anroth**
1.0
Why is there a difference between the Iteration Prediction and Quick Test Prediction - I was looking through the Training images in my Iteration page. An image was supposed to be predicted as 'check' but it predicted it as 'striped'(the 'striped' probability was 83%) So, I copied the image url of that image, and ran it trough the 'Quick Test'. On the 'Quick Test', it correctly predicted the image as 'check' and the probability was 99%. Why is it that even if I use the same image, the probability differs depending on which part the image is trained on? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: e898e972-43eb-601b-fc5c-1b0199315473 * Version Independent ID: e8504165-4688-e91b-1ee3-3da10560e583 * Content: [Test and retrain a model - Custom Vision Service](https://docs.microsoft.com/en-us/azure/cognitive-services/custom-vision-service/test-your-model) * Content Source: [articles/cognitive-services/Custom-Vision-Service/test-your-model.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cognitive-services/Custom-Vision-Service/test-your-model.md) * Service: **cognitive-services** * GitHub Login: @anrothMSFT * Microsoft Alias: **anroth**
non_defect
why is there a difference between the iteration prediction and quick test prediction i was looking through the training images in my iteration page an image was supposed to be predicted as check but it predicted it as striped the striped probability was so i copied the image url of that image and ran it trough the quick test on the quick test it correctly predicted the image as check and the probability was why is it that even if i use the same image the probability differs depending on which part the image is trained on document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cognitive services github login anrothmsft microsoft alias anroth
0
51,237
13,207,399,433
IssuesEvent
2020-08-14 22:57:30
icecube-trac/tix4
https://api.github.com/repos/icecube-trac/tix4
opened
Bad error message when reading in non .i3 file (Trac #78)
Incomplete Migration Migrated from Trac dataio defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/78">https://code.icecube.wisc.edu/projects/icecube/ticket/78</a>, reported by prothand owned by troy</em></summary> <p> ```json { "status": "closed", "changetime": "2007-09-07T17:13:20", "_ts": "1189185200000000", "description": "I was using python's glob() in order to get a list of input files together. Unfortunately, a text file leaked into the list. I got this error message:\n/local/proth/work/jeb/V00-00-05-src/dataio/private/dataio/FrameIO.cxx:253: FATAL: Frame in file is version 1684107084, this software can read only up to 3\n\nFortunately, someone recognized that error and saved me lots of headaches. Maybe there could be a warning if a file you're reading doesn't end in \".i3\" or \".i3.gz\". Or maybe dataio could recognize a bad file format and give a meaningful error. I know for sure this issue exists in the latest release of offline-software, so forgive me if it's solved on the trunk already.", "reporter": "proth", "cc": "", "resolution": "fixed", "time": "2007-07-18T16:35:31", "component": "dataio", "summary": "Bad error message when reading in non .i3 file", "priority": "major", "keywords": "", "milestone": "", "owner": "troy", "type": "defect" } ``` </p> </details>
1.0
Bad error message when reading in non .i3 file (Trac #78) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/78">https://code.icecube.wisc.edu/projects/icecube/ticket/78</a>, reported by prothand owned by troy</em></summary> <p> ```json { "status": "closed", "changetime": "2007-09-07T17:13:20", "_ts": "1189185200000000", "description": "I was using python's glob() in order to get a list of input files together. Unfortunately, a text file leaked into the list. I got this error message:\n/local/proth/work/jeb/V00-00-05-src/dataio/private/dataio/FrameIO.cxx:253: FATAL: Frame in file is version 1684107084, this software can read only up to 3\n\nFortunately, someone recognized that error and saved me lots of headaches. Maybe there could be a warning if a file you're reading doesn't end in \".i3\" or \".i3.gz\". Or maybe dataio could recognize a bad file format and give a meaningful error. I know for sure this issue exists in the latest release of offline-software, so forgive me if it's solved on the trunk already.", "reporter": "proth", "cc": "", "resolution": "fixed", "time": "2007-07-18T16:35:31", "component": "dataio", "summary": "Bad error message when reading in non .i3 file", "priority": "major", "keywords": "", "milestone": "", "owner": "troy", "type": "defect" } ``` </p> </details>
defect
bad error message when reading in non file trac migrated from json status closed changetime ts description i was using python s glob in order to get a list of input files together unfortunately a text file leaked into the list i got this error message n local proth work jeb src dataio private dataio frameio cxx fatal frame in file is version this software can read only up to n nfortunately someone recognized that error and saved me lots of headaches maybe there could be a warning if a file you re reading doesn t end in or gz or maybe dataio could recognize a bad file format and give a meaningful error i know for sure this issue exists in the latest release of offline software so forgive me if it s solved on the trunk already reporter proth cc resolution fixed time component dataio summary bad error message when reading in non file priority major keywords milestone owner troy type defect
1
64,619
18,770,126,951
IssuesEvent
2021-11-06 17:33:17
hazelcast/hazelcast-go-client
https://api.github.com/repos/hazelcast/hazelcast-go-client
closed
Client should not open multiple connections to same member
Type: Defect Source: Internal to-jira
Go client opens connections in 3 paths: 1-> member added 2-> fix connection 3-> respawn connection on connection close. Java client only does fix connection every second.
1.0
Client should not open multiple connections to same member - Go client opens connections in 3 paths: 1-> member added 2-> fix connection 3-> respawn connection on connection close. Java client only does fix connection every second.
defect
client should not open multiple connections to same member go client opens connections in paths member added fix connection respawn connection on connection close java client only does fix connection every second
1
5,264
2,610,184,341
IssuesEvent
2015-02-26 18:58:36
chrsmith/quchuseban
https://api.github.com/repos/chrsmith/quchuseban
opened
分享遗传性色斑能去掉吗
auto-migrated Priority-Medium Type-Defect
``` 《摘要》 人生如戏,可又有别于戏。它没有预演的机会,一旦拉开了�� �幕,不管你如何怯场,都得演到戏的结尾。因为,人生是没� ��草稿的。面对人生,有人小心谨慎,三思而后行,以求尽可 能有一个完美的人生;有的人却漠不关心,乱冲乱撞,直到�� �己无力地在生死边缘挣扎时,才懂得流泪。乱涂乱画的人生� ��注定逃不过被丢进纸篓的命运,成为一张毫无用处的废纸。 细心描绘的人生,尽管可能它并不是完美的,但它却可以得�� �命运的垂青和怜爱,成为上帝的宠儿。祛斑也是如此,遗传� ��色斑能去掉吗, 《客户案例》   我是个导游,做我们这行的挺辛苦的,整体带着团东奔�� �跑的,遇到风吹日晒,别人躲到空调房了,我们还得坚守岗� ��,几年下来不仅腿都跑细了,脸上还被晒了很多的斑斑点点 ,不仅影响形象,工作的时候总觉得别人拿异样的眼光看我�� �感觉挺不舒服的。<br>   后来听同事说「黛芙薇尔精华液」不错,就在网上买了�� �个周期,刚开始没什么效果,我还以为是骗人的呢,在用了� ��近一个月的时候,感觉斑淡了点,我这才稍微放点心,后来 情况就好了,斑越来越淡了,现在已经看不出来了,这个还�� �不错。现在我再也不担心别人的眼光了,这样真好。 阅读了遗传性色斑能去掉吗,再看脸上容易长斑的原因: 《色斑形成原因》   内部因素   一、压力   当人受到压力时,就会分泌肾上腺素,为对付压力而做�� �备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏� ��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃 。   二、荷尔蒙分泌失调   避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞�� �分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在� ��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕 中因女性荷尔蒙雌激素的增加,从怀孕4—5个月开始会容易出 现斑,这时候出现的斑点在产后大部分会消失。可是,新陈�� �谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等� ��因,都会使斑加深。有时新长出的斑,产后也不会消失,所 以需要更加注意。   三、新陈代谢缓慢   肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑�� �因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态� ��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是 内分泌失调导致过敏体质而形成的。另外,身体状态不正常�� �时候,紫外线的照射也会加速斑的形成。   四、错误的使用化妆品   使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在�� �疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵� ��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的 问题。   外部因素   一、紫外线   照射紫外线的时候,人体为了保护皮肤,会在基底层产�� �很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更� ��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化, 还会引起黑斑、雀斑等色素沉着的皮肤疾患。   二、不良的清洁习惯   因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。�� �皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦� ��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的 问题。   三、遗传基因   父母中有长斑的,则本人长斑的概率就很高,这种情况�� �一定程度上就可判定是遗传基因的作用。所以家里特别是长� ��有长斑的人,要注意避免引发长斑的重要因素之一——紫外 线照射,这是预防斑必须注意的。 《有疑问帮你解决》   1,黛芙薇尔精华液真的有效果吗?真的可以把脸上的黄褐�� �去掉吗?   答:黛芙薇尔精华液DNA精华能够有效的修复周围难以触�� �的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必� ��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑 ,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时�� �,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的� ��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显 而易见。自产品上市以来,老顾客纷纷介绍新顾客,71%的新�� �客都是通过老顾客介绍而来,口碑由此而来!   2,服用黛芙薇尔美白,会伤身体吗?有副作用吗?   答:黛芙薇尔精华液应用了精纯复合配方和领先的分类�� �斑科技,并将“DNA美肤系统”疗法应用到了该产品中,能彻� ��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有 效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾�� �地的专家通力协作,超过10年的研究以全新的DNA肌肤修复技�� �,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽� ��迹,令每一位爱美的女性都能享受到科技创新所带来的自然 之美。 专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数�� �百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖!   3,去除黄褐斑之后,会反弹吗?   答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔�� �白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家� ��据斑的形成原因精心研制而成用事实说话,让消费者打分。 树立权威品牌!我们的很多新客户都是老客户介绍而来,请问� ��如果效果不好,会有客户转介绍吗?   4,你们的价格有点贵,能不能便宜一点?   答:如果您使用西药最少需要2000元,煎服的药最少需要3 000元,做手术最少是5000元,而这些毫无疑问,不会对彻底去� ��你的斑点有任何帮助!一分价钱,一份价值,我们现在做的�� �是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的� ��褐斑彻底去除,你还会觉得贵吗?你还会再去花那么多冤枉�� �,不但斑没去掉,还把自己的皮肤弄的越来越糟吗   5,我适合用黛芙薇尔精华液吗?   答:黛芙薇尔适用人群:   1、生理紊乱引起的黄褐斑人群   2、生育引起的妊娠斑人群   3、年纪增长引起的老年斑人群   4、化妆品色素沉积、辐射斑人群   5、长期日照引起的日晒斑人群   6、肌肤暗淡急需美白的人群 《祛斑小方法》 遗传性色斑能去掉吗,同时为您分享祛斑小方法 西瓜面膜 西瓜可去油脂,改善皮肤出油状况。 材料:吃剩的西瓜(几片)。 做法:把西瓜的果肉剔除,露出青色的果皮,敷在脸上5-10�� �钟再洗干净。 注意: 敷完脸后记得洗干净脸皮上西瓜留下的甜味,否则可能会吸�� �小蚂蚁来野餐。 南瓜番茄胡萝卜汤 用料:南瓜220克,番茄110克,胡萝卜110克,瘦猪肉100克,生�� �15克,调料适量。 做法:上料洗净后加水煮汤,瓜熟烂即可。 食法:吃菜喝汤,每日1剂。30日为一疗程。 功效:除黑斑,美肌肤。 ``` ----- Original issue reported on code.google.com by `additive...@gmail.com` on 1 Jul 2014 at 3:19
1.0
分享遗传性色斑能去掉吗 - ``` 《摘要》 人生如戏,可又有别于戏。它没有预演的机会,一旦拉开了�� �幕,不管你如何怯场,都得演到戏的结尾。因为,人生是没� ��草稿的。面对人生,有人小心谨慎,三思而后行,以求尽可 能有一个完美的人生;有的人却漠不关心,乱冲乱撞,直到�� �己无力地在生死边缘挣扎时,才懂得流泪。乱涂乱画的人生� ��注定逃不过被丢进纸篓的命运,成为一张毫无用处的废纸。 细心描绘的人生,尽管可能它并不是完美的,但它却可以得�� �命运的垂青和怜爱,成为上帝的宠儿。祛斑也是如此,遗传� ��色斑能去掉吗, 《客户案例》   我是个导游,做我们这行的挺辛苦的,整体带着团东奔�� �跑的,遇到风吹日晒,别人躲到空调房了,我们还得坚守岗� ��,几年下来不仅腿都跑细了,脸上还被晒了很多的斑斑点点 ,不仅影响形象,工作的时候总觉得别人拿异样的眼光看我�� �感觉挺不舒服的。<br>   后来听同事说「黛芙薇尔精华液」不错,就在网上买了�� �个周期,刚开始没什么效果,我还以为是骗人的呢,在用了� ��近一个月的时候,感觉斑淡了点,我这才稍微放点心,后来 情况就好了,斑越来越淡了,现在已经看不出来了,这个还�� �不错。现在我再也不担心别人的眼光了,这样真好。 阅读了遗传性色斑能去掉吗,再看脸上容易长斑的原因: 《色斑形成原因》   内部因素   一、压力   当人受到压力时,就会分泌肾上腺素,为对付压力而做�� �备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏� ��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃 。   二、荷尔蒙分泌失调   避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞�� �分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在� ��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕 中因女性荷尔蒙雌激素的增加,从怀孕4—5个月开始会容易出 现斑,这时候出现的斑点在产后大部分会消失。可是,新陈�� �谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等� ��因,都会使斑加深。有时新长出的斑,产后也不会消失,所 以需要更加注意。   三、新陈代谢缓慢   肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑�� �因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态� ��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是 内分泌失调导致过敏体质而形成的。另外,身体状态不正常�� �时候,紫外线的照射也会加速斑的形成。   四、错误的使用化妆品   使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在�� �疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵� ��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的 问题。   外部因素   一、紫外线   照射紫外线的时候,人体为了保护皮肤,会在基底层产�� �很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更� ��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化, 还会引起黑斑、雀斑等色素沉着的皮肤疾患。   二、不良的清洁习惯   因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。�� �皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦� ��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的 问题。   三、遗传基因   父母中有长斑的,则本人长斑的概率就很高,这种情况�� �一定程度上就可判定是遗传基因的作用。所以家里特别是长� ��有长斑的人,要注意避免引发长斑的重要因素之一——紫外 线照射,这是预防斑必须注意的。 《有疑问帮你解决》   1,黛芙薇尔精华液真的有效果吗?真的可以把脸上的黄褐�� �去掉吗?   答:黛芙薇尔精华液DNA精华能够有效的修复周围难以触�� �的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必� ��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑 ,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时�� �,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的� ��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显 而易见。自产品上市以来,老顾客纷纷介绍新顾客,71%的新�� �客都是通过老顾客介绍而来,口碑由此而来!   2,服用黛芙薇尔美白,会伤身体吗?有副作用吗?   答:黛芙薇尔精华液应用了精纯复合配方和领先的分类�� �斑科技,并将“DNA美肤系统”疗法应用到了该产品中,能彻� ��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有 效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾�� �地的专家通力协作,超过10年的研究以全新的DNA肌肤修复技�� �,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽� ��迹,令每一位爱美的女性都能享受到科技创新所带来的自然 之美。 专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数�� �百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖!   3,去除黄褐斑之后,会反弹吗?   答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔�� �白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家� ��据斑的形成原因精心研制而成用事实说话,让消费者打分。 树立权威品牌!我们的很多新客户都是老客户介绍而来,请问� ��如果效果不好,会有客户转介绍吗?   4,你们的价格有点贵,能不能便宜一点?   答:如果您使用西药最少需要2000元,煎服的药最少需要3 000元,做手术最少是5000元,而这些毫无疑问,不会对彻底去� ��你的斑点有任何帮助!一分价钱,一份价值,我们现在做的�� �是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的� ��褐斑彻底去除,你还会觉得贵吗?你还会再去花那么多冤枉�� �,不但斑没去掉,还把自己的皮肤弄的越来越糟吗   5,我适合用黛芙薇尔精华液吗?   答:黛芙薇尔适用人群:   1、生理紊乱引起的黄褐斑人群   2、生育引起的妊娠斑人群   3、年纪增长引起的老年斑人群   4、化妆品色素沉积、辐射斑人群   5、长期日照引起的日晒斑人群   6、肌肤暗淡急需美白的人群 《祛斑小方法》 遗传性色斑能去掉吗,同时为您分享祛斑小方法 西瓜面膜 西瓜可去油脂,改善皮肤出油状况。 材料:吃剩的西瓜(几片)。 做法:把西瓜的果肉剔除,露出青色的果皮,敷在脸上5-10�� �钟再洗干净。 注意: 敷完脸后记得洗干净脸皮上西瓜留下的甜味,否则可能会吸�� �小蚂蚁来野餐。 南瓜番茄胡萝卜汤 用料:南瓜220克,番茄110克,胡萝卜110克,瘦猪肉100克,生�� �15克,调料适量。 做法:上料洗净后加水煮汤,瓜熟烂即可。 食法:吃菜喝汤,每日1剂。30日为一疗程。 功效:除黑斑,美肌肤。 ``` ----- Original issue reported on code.google.com by `additive...@gmail.com` on 1 Jul 2014 at 3:19
defect
分享遗传性色斑能去掉吗 《摘要》 人生如戏,可又有别于戏。它没有预演的机会,一旦拉开了�� �幕,不管你如何怯场,都得演到戏的结尾。因为,人生是没� ��草稿的。面对人生,有人小心谨慎,三思而后行,以求尽可 能有一个完美的人生;有的人却漠不关心,乱冲乱撞,直到�� �己无力地在生死边缘挣扎时,才懂得流泪。乱涂乱画的人生� ��注定逃不过被丢进纸篓的命运,成为一张毫无用处的废纸。 细心描绘的人生,尽管可能它并不是完美的,但它却可以得�� �命运的垂青和怜爱,成为上帝的宠儿。祛斑也是如此,遗传� ��色斑能去掉吗, 《客户案例》   我是个导游,做我们这行的挺辛苦的,整体带着团东奔�� �跑的,遇到风吹日晒,别人躲到空调房了,我们还得坚守岗� ��,几年下来不仅腿都跑细了,脸上还被晒了很多的斑斑点点 ,不仅影响形象,工作的时候总觉得别人拿异样的眼光看我�� �感觉挺不舒服的。   后来听同事说「黛芙薇尔精华液」不错,就在网上买了�� �个周期,刚开始没什么效果,我还以为是骗人的呢,在用了� ��近一个月的时候,感觉斑淡了点,我这才稍微放点心,后来 情况就好了,斑越来越淡了,现在已经看不出来了,这个还�� �不错。现在我再也不担心别人的眼光了,这样真好。 阅读了遗传性色斑能去掉吗,再看脸上容易长斑的原因: 《色斑形成原因》   内部因素   一、压力   当人受到压力时,就会分泌肾上腺素,为对付压力而做�� �备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏� ��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃 。   二、荷尔蒙分泌失调   避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞�� �分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在� ��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕 中因女性荷尔蒙雌激素的增加, — 现斑,这时候出现的斑点在产后大部分会消失。可是,新陈�� �谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等� ��因,都会使斑加深。有时新长出的斑,产后也不会消失,所 以需要更加注意。   三、新陈代谢缓慢   肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑�� �因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态� ��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是 内分泌失调导致过敏体质而形成的。另外,身体状态不正常�� �时候,紫外线的照射也会加速斑的形成。   四、错误的使用化妆品   使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在�� �疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵� ��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的 问题。   外部因素   一、紫外线   照射紫外线的时候,人体为了保护皮肤,会在基底层产�� �很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更� ��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化, 还会引起黑斑、雀斑等色素沉着的皮肤疾患。   二、不良的清洁习惯   因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。�� �皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦� ��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的 问题。   三、遗传基因   父母中有长斑的,则本人长斑的概率就很高,这种情况�� �一定程度上就可判定是遗传基因的作用。所以家里特别是长� ��有长斑的人,要注意避免引发长斑的重要因素之一——紫外 线照射,这是预防斑必须注意的。 《有疑问帮你解决》    黛芙薇尔精华液真的有效果吗 真的可以把脸上的黄褐�� �去掉吗   答:黛芙薇尔精华液dna精华能够有效的修复周围难以触�� �的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必� ��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑 ,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时�� �,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的� ��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显 而易见。自产品上市以来,老顾客纷纷介绍新顾客, 的新�� �客都是通过老顾客介绍而来,口碑由此而来    ,服用黛芙薇尔美白,会伤身体吗 有副作用吗   答:黛芙薇尔精华液应用了精纯复合配方和领先的分类�� �斑科技,并将“dna美肤系统”疗法应用到了该产品中,能彻� ��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有 效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾�� �地的专家通力协作, �� �,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽� ��迹,令每一位爱美的女性都能享受到科技创新所带来的自然 之美。 专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数�� �百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖    ,去除黄褐斑之后,会反弹吗   答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔�� �白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家� ��据斑的形成原因精心研制而成用事实说话,让消费者打分。 树立权威品牌 我们的很多新客户都是老客户介绍而来,请问� ��如果效果不好,会有客户转介绍吗    ,你们的价格有点贵,能不能便宜一点   答: , , ,而这些毫无疑问,不会对彻底去� ��你的斑点有任何帮助 一分价钱,一份价值,我们现在做的�� �是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的� ��褐斑彻底去除,你还会觉得贵吗 你还会再去花那么多冤枉�� �,不但斑没去掉,还把自己的皮肤弄的越来越糟吗    ,我适合用黛芙薇尔精华液吗   答:黛芙薇尔适用人群:    、生理紊乱引起的黄褐斑人群    、生育引起的妊娠斑人群    、年纪增长引起的老年斑人群    、化妆品色素沉积、辐射斑人群    、长期日照引起的日晒斑人群    、肌肤暗淡急需美白的人群 《祛斑小方法》 遗传性色斑能去掉吗,同时为您分享祛斑小方法 西瓜面膜 西瓜可去油脂,改善皮肤出油状况。 材料:吃剩的西瓜(几片)。 做法:把西瓜的果肉剔除,露出青色的果皮, - �� �钟再洗干净。 注意: 敷完脸后记得洗干净脸皮上西瓜留下的甜味,否则可能会吸�� �小蚂蚁来野餐。 南瓜番茄胡萝卜汤 用料: , , , ,生�� � ,调料适量。 做法:上料洗净后加水煮汤,瓜熟烂即可。 食法:吃菜喝汤, 。 。 功效:除黑斑,美肌肤。 original issue reported on code google com by additive gmail com on jul at
1
711,247
24,455,291,986
IssuesEvent
2022-10-07 05:57:14
rtCamp/login-with-google
https://api.github.com/repos/rtCamp/login-with-google
closed
QA for 1.3.0 release
priority/critical Ready for QA
## Issue Description Please conduct a thorough testing of the plugin for upcoming 1.3.0 release. Below are the features, fixes and additions made to the plugin: = V 1.3.0 = * Feature: Gutenberg block for Login button (https://github.com/rtCamp/login-with-google/issues/77) * Feature: Save user first name and last name on registration (https://github.com/rtCamp/login-with-google/issues/101) * Add: Added hook after user authentication (https://github.com/rtCamp/login-with-google/issues/89) * Add: Added hook after user is logged-in (https://github.com/rtCamp/login-with-google/issues/110) * Fix: set login cookie with shortcode display - https://github.com/rtCamp/rtcamp.com/issues/1415 * Updated npm packages and laravel-mix * Fixes https://github.com/rtCamp/login-with-google/issues/112
1.0
QA for 1.3.0 release - ## Issue Description Please conduct a thorough testing of the plugin for upcoming 1.3.0 release. Below are the features, fixes and additions made to the plugin: = V 1.3.0 = * Feature: Gutenberg block for Login button (https://github.com/rtCamp/login-with-google/issues/77) * Feature: Save user first name and last name on registration (https://github.com/rtCamp/login-with-google/issues/101) * Add: Added hook after user authentication (https://github.com/rtCamp/login-with-google/issues/89) * Add: Added hook after user is logged-in (https://github.com/rtCamp/login-with-google/issues/110) * Fix: set login cookie with shortcode display - https://github.com/rtCamp/rtcamp.com/issues/1415 * Updated npm packages and laravel-mix * Fixes https://github.com/rtCamp/login-with-google/issues/112
non_defect
qa for release issue description please conduct a thorough testing of the plugin for upcoming release below are the features fixes and additions made to the plugin v feature gutenberg block for login button feature save user first name and last name on registration add added hook after user authentication add added hook after user is logged in fix set login cookie with shortcode display updated npm packages and laravel mix fixes
0