Column schema:

| Column | Kind | Range / values |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 – 19 |
| repo | stringlengths | 4 – 112 |
| repo_url | stringlengths | 33 – 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 – 1.02k |
| labels | stringlengths | 4 – 1.54k |
| body | stringlengths | 1 – 262k |
| index | stringclasses | 17 values |
| text_combine | stringlengths | 95 – 262k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 – 252k |
| binary_label | int64 | 0 – 1 |
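The schema above suggests records of GitHub `IssuesEvent` data with a string `label` and a derived integer `binary_label`. A minimal sketch of that shape as a pandas DataFrame, assuming (not shown in this dump) that `binary_label` simply encodes `label` as `"test" -> 1`, `"non_test" -> 0`; the values below are copied from the example records:

```python
import pandas as pd

# Toy frame shaped like the schema above (subset of columns, illustrative rows).
df = pd.DataFrame(
    {
        "type": ["IssuesEvent", "IssuesEvent"],
        "created_at": ["2022-10-10 18:24:23", "2021-08-25 00:04:09"],
        "repo": ["helidon-io/helidon", "kubernetes/ingress-nginx"],
        "action": ["closed", "closed"],
        "label": ["test", "non_test"],
    }
)

# Assumed mapping from the two label classes to the int64 binary_label column.
df["binary_label"] = (df["label"] == "test").astype("int64")
print(df[["label", "binary_label"]])
```

If this assumption holds, the 0/1 range and int64 dtype reported in the schema follow directly.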
Example record 1:
- Unnamed: 0: 282,962
- id: 24,508,097,722
- type: IssuesEvent
- created_at: 2022-10-10 18:24:23
- repo: helidon-io/helidon
- repo_url: https://api.github.com/repos/helidon-io/helidon
- action: closed
- title: 2.x: Intermittent HelloWorldAsyncResponseTest.testAsyncWithArg
- labels: 2.x testing intermittent
- body:
``` java.lang.AssertionError: Synthetic SimpleTimer count Expected: is <3L> but: was <2L> at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20) at io.helidon.microprofile.metrics.HelloWorldAsyncResponseTest.testAsyncWithArg(HelloWorldAsyncResponseTest.java:142) at io.helidon.microprofile.metrics.HelloWorldAsyncResponseTest$Proxy$_$$_WeldClientProxy.testAsyncWithArg(Unknown Source) ``` ``` Standard Output 2022.09.28 00:01:39 INFO org.jboss.weld.Bootstrap Thread[main,5,main]: WELD-ENV-000014: Falling back to Java Reflection for bean-discovery-mode="annotated" discovery. Add org.jboss:jandex to the classpath to speed-up startup. 2022.09.28 00:01:39 INFO org.jboss.weld.Bootstrap Thread[main,5,main]: WELD-000101: Transactional services not available. Injection of @Inject UserTransaction not available. Transactional observers will be invoked synchronously. 2022.09.28 00:01:39 INFO org.jboss.weld.Event Thread[main,5,main]: WELD-000411: Observer method [BackedAnnotatedMethod] public org.glassfish.jersey.ext.cdi1x.internal.ProcessAllAnnotatedTypes.processAnnotatedType(@Observes ProcessAnnotatedType<?>, BeanManager) receives events for all annotated types. Consider restricting events using @WithAnnotations or a generic type with bounds. 
2022.09.28 00:01:39 WARNING io.helidon.microprofile.metrics.MetricUtil Thread[main,5,main]: Attribute 'absolute=true' in metric annotation ignored at class level 2022.09.28 00:01:39 WARNING io.helidon.microprofile.metrics.MetricUtil Thread[main,5,main]: Attribute 'absolute=true' in metric annotation ignored at class level 2022.09.28 00:01:39 INFO io.helidon.microprofile.server.ServerCdiExtension Thread[main,5,main]: Registering JAX-RS Application: HelloWorldApp 2022.09.28 00:01:39 INFO io.helidon.webserver.NettyWebServer Thread[nioEventLoopGroup-4-1,10,main]: Channel '@default' started: [id: 0x922626cc, L:/0.0.0.0:41493] 2022.09.28 00:01:39 INFO io.helidon.microprofile.server.ServerCdiExtension Thread[main,5,main]: Server started on http://localhost:41493/ (and all other host addresses) in 8107 milliseconds (since JVM startup). 2022.09.28 00:01:49 INFO io.helidon.webserver.NettyWebServer Thread[nioEventLoopGroup-4-1,10,main]: Channel '@default' closed: [id: 0x922626cc, L:/0.0.0.0:41493] 2022.09.28 00:01:50 INFO io.helidon.microprofile.server.ServerCdiExtension Thread[main,5,main]: Server stopped in 35 milliseconds. 2022.09.28 00:01:50 INFO org.jboss.weld.Bootstrap Thread[main,5,main]: WELD-ENV-002001: Weld SE container 5d9f9493-4f61-4b35-97af-30705ae2152f shut down ```
- index: 1.0
2.x: Intermittent HelloWorldAsyncResponseTest.testAsyncWithArg - ``` java.lang.AssertionError: Synthetic SimpleTimer count Expected: is <3L> but: was <2L> at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20) at io.helidon.microprofile.metrics.HelloWorldAsyncResponseTest.testAsyncWithArg(HelloWorldAsyncResponseTest.java:142) at io.helidon.microprofile.metrics.HelloWorldAsyncResponseTest$Proxy$_$$_WeldClientProxy.testAsyncWithArg(Unknown Source) ``` ``` Standard Output 2022.09.28 00:01:39 INFO org.jboss.weld.Bootstrap Thread[main,5,main]: WELD-ENV-000014: Falling back to Java Reflection for bean-discovery-mode="annotated" discovery. Add org.jboss:jandex to the classpath to speed-up startup. 2022.09.28 00:01:39 INFO org.jboss.weld.Bootstrap Thread[main,5,main]: WELD-000101: Transactional services not available. Injection of @Inject UserTransaction not available. Transactional observers will be invoked synchronously. 2022.09.28 00:01:39 INFO org.jboss.weld.Event Thread[main,5,main]: WELD-000411: Observer method [BackedAnnotatedMethod] public org.glassfish.jersey.ext.cdi1x.internal.ProcessAllAnnotatedTypes.processAnnotatedType(@Observes ProcessAnnotatedType<?>, BeanManager) receives events for all annotated types. Consider restricting events using @WithAnnotations or a generic type with bounds. 
2022.09.28 00:01:39 WARNING io.helidon.microprofile.metrics.MetricUtil Thread[main,5,main]: Attribute 'absolute=true' in metric annotation ignored at class level 2022.09.28 00:01:39 WARNING io.helidon.microprofile.metrics.MetricUtil Thread[main,5,main]: Attribute 'absolute=true' in metric annotation ignored at class level 2022.09.28 00:01:39 INFO io.helidon.microprofile.server.ServerCdiExtension Thread[main,5,main]: Registering JAX-RS Application: HelloWorldApp 2022.09.28 00:01:39 INFO io.helidon.webserver.NettyWebServer Thread[nioEventLoopGroup-4-1,10,main]: Channel '@default' started: [id: 0x922626cc, L:/0.0.0.0:41493] 2022.09.28 00:01:39 INFO io.helidon.microprofile.server.ServerCdiExtension Thread[main,5,main]: Server started on http://localhost:41493/ (and all other host addresses) in 8107 milliseconds (since JVM startup). 2022.09.28 00:01:49 INFO io.helidon.webserver.NettyWebServer Thread[nioEventLoopGroup-4-1,10,main]: Channel '@default' closed: [id: 0x922626cc, L:/0.0.0.0:41493] 2022.09.28 00:01:50 INFO io.helidon.microprofile.server.ServerCdiExtension Thread[main,5,main]: Server stopped in 35 milliseconds. 2022.09.28 00:01:50 INFO org.jboss.weld.Bootstrap Thread[main,5,main]: WELD-ENV-002001: Weld SE container 5d9f9493-4f61-4b35-97af-30705ae2152f shut down ```
- label: test
x intermittent helloworldasyncresponsetest testasyncwitharg java lang assertionerror synthetic simpletimer count expected is but was at org hamcrest matcherassert assertthat matcherassert java at io helidon microprofile metrics helloworldasyncresponsetest testasyncwitharg helloworldasyncresponsetest java at io helidon microprofile metrics helloworldasyncresponsetest proxy weldclientproxy testasyncwitharg unknown source standard output info org jboss weld bootstrap thread weld env falling back to java reflection for bean discovery mode annotated discovery add org jboss jandex to the classpath to speed up startup info org jboss weld bootstrap thread weld transactional services not available injection of inject usertransaction not available transactional observers will be invoked synchronously info org jboss weld event thread weld observer method public org glassfish jersey ext internal processallannotatedtypes processannotatedtype observes processannotatedtype beanmanager receives events for all annotated types consider restricting events using withannotations or a generic type with bounds warning io helidon microprofile metrics metricutil thread attribute absolute true in metric annotation ignored at class level warning io helidon microprofile metrics metricutil thread attribute absolute true in metric annotation ignored at class level info io helidon microprofile server servercdiextension thread registering jax rs application helloworldapp info io helidon webserver nettywebserver thread channel default started info io helidon microprofile server servercdiextension thread server started on and all other host addresses in milliseconds since jvm startup info io helidon webserver nettywebserver thread channel default closed info io helidon microprofile server servercdiextension thread server stopped in milliseconds info org jboss weld bootstrap thread weld env weld se container shut down
- binary_label: 1
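The `text` column in this record looks like the combined title and body lowercased with punctuation and digits stripped. A sketch of that normalization, assuming a simple letters-only regex (the actual preprocessing pipeline is not shown in this dump):

```python
import re

def normalize(s: str) -> str:
    """Lowercase, keep only ASCII letters, collapse whitespace.

    A guess at the preprocessing behind the `text` column; the real
    pipeline used to build this dataset may differ.
    """
    s = s.lower()
    s = re.sub(r"[^a-z]+", " ", s)   # replace runs of punctuation/digits with a space
    return re.sub(r"\s+", " ", s).strip()

print(normalize("2.x: Intermittent HelloWorldAsyncResponseTest.testAsyncWithArg"))
# -> "x intermittent helloworldasyncresponsetest testasyncwitharg"
```

The output matches the start of the `text` field above, including the leading "x" left over from "2.x", which supports the letters-only guess.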
Example record 2:
- Unnamed: 0: 231,631
- id: 17,702,100,827
- type: IssuesEvent
- created_at: 2021-08-25 00:04:09
- repo: kubernetes/ingress-nginx
- repo_url: https://api.github.com/repos/kubernetes/ingress-nginx
- action: closed
- title: docs for migration to apiVersion networking.k8s.io v1
- labels: area/docs priority/important-soon kind/documentation needs-triage
- body:
**NGINX Ingress controller version**: all builds once k8s v1.22 is released **Kubernetes version** (use `kubectl version`): v1.22 and above **Environment**: all supported environs - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): - **Kernel** (e.g. `uname -a`): - **Install tools**: - **Others**: **What happened**: As per #7341, we need docs for the migration to apiVersion networking.k8s.io/v1 **What you expected to happen**: Required docs for the migration to apiVersion networking.k8s.io/v1 becomes available **How to reproduce it**: Currently the docs are not present **Anything else we need to know**: Discussion is in #7341 <!-- If this is actually about documentation, add `/kind documentation` below --> /area docs
- index: 1.0
docs for migration to apiVersion networking.k8s.io v1 - **NGINX Ingress controller version**: all builds once k8s v1.22 is released **Kubernetes version** (use `kubectl version`): v1.22 and above **Environment**: all supported environs - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): - **Kernel** (e.g. `uname -a`): - **Install tools**: - **Others**: **What happened**: As per #7341, we need docs for the migration to apiVersion networking.k8s.io/v1 **What you expected to happen**: Required docs for the migration to apiVersion networking.k8s.io/v1 becomes available **How to reproduce it**: Currently the docs are not present **Anything else we need to know**: Discussion is in #7341 <!-- If this is actually about documentation, add `/kind documentation` below --> /area docs
- label: non_test
docs for migration to apiversion networking io nginx ingress controller version all builds once is released kubernetes version use kubectl version and above environment all supported environs cloud provider or hardware configuration os e g from etc os release kernel e g uname a install tools others what happened as per we need docs for the migration to apiversion networking io what you expected to happen required docs for the migration to apiversion networking io becomes available how to reproduce it currently the docs are not present anything else we need to know discussion is in area docs
- binary_label: 0
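In both records so far, `text_combine` is the `title` and `body` joined with " - ". A one-line sketch of that step, using the column names from the schema (the join separator is inferred from the examples, not documented here):

```python
def combine(title: str, body: str) -> str:
    # text_combine appears to be: "<title> - <body>"
    return f"{title} - {body}"

print(combine(
    "docs for migration to apiVersion networking.k8s.io v1",
    "**NGINX Ingress controller version**: all builds once k8s v1.22 is released",
))
```

Applying this to record 2's `title` and `body` reproduces the start of its `text_combine` field.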
Example record 3:
- Unnamed: 0: 55,574
- id: 13,643,674,122
- type: IssuesEvent
- created_at: 2020-09-25 17:32:11
- repo: GoogleCloudPlatform/python-docs-samples
- repo_url: https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
- action: closed
- title: memorystore.redis.cloud_run_deployment.e2e_test: test_end_to_end failed
- labels: :rotating_light: buildcop: issue priority: p1 samples type: bug
- body:
Note: #4514 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 5ae6bbe4f18eb97de865a64d045fef48af3abecc buildURL: [Build Status](https://source.cloud.google.com/results/invocations/3027d3dc-6713-42e7-ab0e-b078f573436e), [Sponge](http://sponge2/3027d3dc-6713-42e7-ab0e-b078f573436e) status: failed <details><summary>Test output</summary><br><pre>Traceback (most recent call last): File "/workspace/memorystore/redis/cloud_run_deployment/e2e_test.py", line 70, in services subprocess.run( File "/usr/local/lib/python3.8/subprocess.py", line 512, in run raise CalledProcessError(retcode, process.args, subprocess.CalledProcessError: Command '['gcloud', 'redis', 'instances', 'create', 'test-instance-4a0d3e0618', '--region=us-central1', '--network', 'test-network-4a0d3e0618', '--project', 'python-docs-samples-tests']' returned non-zero exit status 1.</pre></details>
- index: 1.0
memorystore.redis.cloud_run_deployment.e2e_test: test_end_to_end failed - Note: #4514 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 5ae6bbe4f18eb97de865a64d045fef48af3abecc buildURL: [Build Status](https://source.cloud.google.com/results/invocations/3027d3dc-6713-42e7-ab0e-b078f573436e), [Sponge](http://sponge2/3027d3dc-6713-42e7-ab0e-b078f573436e) status: failed <details><summary>Test output</summary><br><pre>Traceback (most recent call last): File "/workspace/memorystore/redis/cloud_run_deployment/e2e_test.py", line 70, in services subprocess.run( File "/usr/local/lib/python3.8/subprocess.py", line 512, in run raise CalledProcessError(retcode, process.args, subprocess.CalledProcessError: Command '['gcloud', 'redis', 'instances', 'create', 'test-instance-4a0d3e0618', '--region=us-central1', '--network', 'test-network-4a0d3e0618', '--project', 'python-docs-samples-tests']' returned non-zero exit status 1.</pre></details>
- label: non_test
memorystore redis cloud run deployment test test end to end failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output traceback most recent call last file workspace memorystore redis cloud run deployment test py line in services subprocess run file usr local lib subprocess py line in run raise calledprocesserror retcode process args subprocess calledprocesserror command returned non zero exit status
- binary_label: 0
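With the records labeled `test` or `non_test`, a typical inspection step is checking class balance and filtering one class. A sketch on a toy frame shaped like the three records above (repo names and labels copied from them; the full dataset's balance is not shown in this dump):

```python
import pandas as pd

# Toy frame mirroring example records 1-3.
df = pd.DataFrame(
    {
        "repo": [
            "helidon-io/helidon",
            "kubernetes/ingress-nginx",
            "GoogleCloudPlatform/python-docs-samples",
        ],
        "label": ["test", "non_test", "non_test"],
        "binary_label": [1, 0, 0],
    }
)

counts = df["label"].value_counts()      # per-class record counts
test_issues = df[df["binary_label"] == 1]  # rows labeled as test-related
print(counts.to_dict())
print(test_issues["repo"].tolist())
```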
Example record 4:
- Unnamed: 0: 88,715
- id: 8,175,240,576
- type: IssuesEvent
- created_at: 2018-08-28 00:54:36
- repo: cockroachdb/cockroach
- repo_url: https://api.github.com/repos/cockroachdb/cockroach
- action: closed
- title: server: TestNodeStatusResponse failed under stress
- labels: A-core-kv C-test-failure O-robot
- body:
SHA: https://github.com/cockroachdb/cockroach/commits/9ee43b32c83db6418070f92d9d698c1890d80b9e Parameters: ``` TAGS= GOFLAGS= ``` To repro, try: ``` # Don't forget to check out a clean suitable branch and experiment with the # stress invocation until the desired results present themselves. For example, # using stressrace instead of stress and passing the '-p' stressflag which # controls concurrency. ./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh cd ~/go/src/github.com/cockroachdb/cockroach && \ make stress TESTS=TestNodeStatusResponse PKG=github.com/cockroachdb/cockroach/pkg/server TESTTIMEOUT=5m STRESSFLAGS='-stderr=false -maxtime 20m -timeout 10m' ``` Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=861348&tab=buildLog ``` === RUN TestNodeStatusResponse W180826 06:00:57.241641 40136 server/status/runtime.go:294 [n?] Could not parse build timestamp: parsing time "" as "2006/01/02 15:04:05": cannot parse "" as "2006" I180826 06:00:57.253833 40136 server/server.go:830 [n?] monitoring forward clock jumps based on server.clock.forward_jump_check_enabled I180826 06:00:57.254595 40136 base/addr_validation.go:260 [n?] server certificate addresses: IP=127.0.0.1,::1; DNS=localhost,*.local; CN=node I180826 06:00:57.254651 40136 base/addr_validation.go:300 [n?] web UI certificate addresses: IP=127.0.0.1,::1; DNS=localhost,*.local; CN=node I180826 06:00:57.272264 40136 server/config.go:496 [n?] 3 storage engines initialized I180826 06:00:57.272318 40136 server/config.go:499 [n?] RocksDB cache size: 128 MiB I180826 06:00:57.272335 40136 server/config.go:499 [n?] store 0: in-memory, size 0 B I180826 06:00:57.272350 40136 server/config.go:499 [n?] store 1: in-memory, size 0 B I180826 06:00:57.272364 40136 server/config.go:499 [n?] store 2: in-memory, size 0 B I180826 06:00:57.277074 40136 server/node.go:373 [n?] **** cluster 21e63dc4-b41a-4623-ba8d-16c23bcaa065 has been created I180826 06:00:57.277103 40136 server/server.go:1401 [n?] 
**** add additional nodes by specifying --join=127.0.0.1:36435 I180826 06:00:57.277314 40136 gossip/gossip.go:382 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:36435" > attrs:<> locality:<> ServerVersion:<major_val:2 minor_val:0 patch:0 unstable:12 > build_tag:"v2.1.0-alpha.20180702-1991-g9ee43b3" started_at:1535263257277259329 I180826 06:00:57.278686 40136 storage/store.go:1541 [n1,s1] [n1,s1]: failed initial metrics computation: [n1,s1]: system config not yet available I180826 06:00:57.278769 40136 server/node.go:476 [n1] initialized store [n1,s1]: disk (capacity=512 MiB, available=512 MiB, used=0 B, logicalBytes=7.0 KiB), ranges=1, leases=0, queries=0.00, writes=0.00, bytesPerReplica={p10=7139.00 p25=7139.00 p50=7139.00 p75=7139.00 p90=7139.00 pMax=7139.00}, writesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00} I180826 06:00:57.278922 40136 storage/store.go:1541 [n1,s2] [n1,s2]: failed initial metrics computation: [n1,s2]: system config not yet available I180826 06:00:57.278967 40136 server/node.go:476 [n1] initialized store [n1,s2]: disk (capacity=512 MiB, available=512 MiB, used=0 B, logicalBytes=0 B), ranges=0, leases=0, queries=0.00, writes=0.00, bytesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00}, writesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00} I180826 06:00:57.279112 40136 storage/store.go:1541 [n1,s3] [n1,s3]: failed initial metrics computation: [n1,s3]: system config not yet available I180826 06:00:57.279150 40136 server/node.go:476 [n1] initialized store [n1,s3]: disk (capacity=512 MiB, available=512 MiB, used=0 B, logicalBytes=0 B), ranges=0, leases=0, queries=0.00, writes=0.00, bytesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00}, writesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00} I180826 06:00:57.279199 40136 storage/stores.go:242 [n1] read 0 node addresses from persistent storage I180826 
06:00:57.279289 40136 server/node.go:697 [n1] connecting to gossip network to verify cluster ID... I180826 06:00:57.280071 40136 server/node.go:722 [n1] node connected via gossip and verified as part of cluster "21e63dc4-b41a-4623-ba8d-16c23bcaa065" I180826 06:00:57.280112 40136 server/node.go:546 [n1] node=1: started with [<no-attributes>=<in-mem> <no-attributes>=<in-mem> <no-attributes>=<in-mem>] engine(s) and attributes [] I180826 06:00:57.280321 40136 server/status/recorder.go:652 [n1] available memory from cgroups (8.0 EiB) exceeds system memory 16 GiB, using system memory I180826 06:00:57.280347 40136 server/server.go:1807 [n1] Could not start heap profiler worker due to: directory to store profiles could not be determined I180826 06:00:57.280411 40136 server/server.go:1538 [n1] starting https server at 127.0.0.1:41543 (use: 127.0.0.1:41543) I180826 06:00:57.280443 40136 server/server.go:1540 [n1] starting grpc/postgres server at 127.0.0.1:36435 I180826 06:00:57.280456 40136 server/server.go:1541 [n1] advertising CockroachDB node at 127.0.0.1:36435 I180826 06:00:57.281665 40680 server/status/recorder.go:652 [n1,summaries] available memory from cgroups (8.0 EiB) exceeds system memory 16 GiB, using system memory I180826 06:00:57.284490 40719 storage/replica_command.go:289 [split,n1,s1,r1/1:/M{in-ax}] initiating a split of this range at key /System/"" [r2] I180826 06:00:57.320900 40189 storage/replica_command.go:289 [split,n1,s1,r2/1:/{System/-Max}] initiating a split of this range at key /System/NodeLiveness [r3] I180826 06:00:57.326793 40739 storage/replica_command.go:289 [split,n1,s1,r3/1:/{System/NodeL…-Max}] initiating a split of this range at key /System/NodeLivenessMax [r4] W180826 06:00:57.333648 40746 storage/intent_resolver.go:668 [n1,s1] failed to push during intent resolution: failed to push "split" id=dd8ac196 key=/Local/Range/System/NodeLiveness/RangeDescriptor rw=true pri=0.02265368 iso=SERIALIZABLE stat=PENDING epo=0 ts=1535263257.327385511,0 
orig=1535263257.327385511,0 max=1535263257.327385511,0 wto=false rop=false seq=1 I180826 06:00:57.336015 40703 storage/replica_command.go:289 [split,n1,s1,r4/1:/{System/NodeL…-Max}] initiating a split of this range at key /System/tsd [r5] E180826 06:00:57.336538 40748 storage/queue.go:788 [replicate,n1,s1,r3/1:/System/NodeLiveness{-Max}] [n1,s1,r3/1:/System/NodeLiveness{-Max}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.342205 40647 storage/replica_command.go:289 [split,n1,s1,r5/1:/{System/tsd-Max}] initiating a split of this range at key /System/"tse" [r6] E180826 06:00:57.343182 40150 storage/queue.go:788 [replicate,n1,s1,r4/1:/System/{NodeLive…-tsd}] [n1,s1,r4/1:/System/{NodeLive…-tsd}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.348644 40757 sql/event_log.go:126 [n1,intExec=optInToDiagnosticsStatReporting] Event: "set_cluster_setting", target: 0, info: {SettingName:diagnostics.reporting.enabled Value:true User:root} I180826 06:00:57.359991 40157 sql/event_log.go:126 [n1,intExec=set-setting] Event: "set_cluster_setting", target: 0, info: {SettingName:version Value:2.0-12 User:root} I180826 06:00:57.373114 40778 storage/replica_command.go:289 [split,n1,s1,r6/1:/{System/tse-Max}] initiating a split of this range at key /Table/SystemConfigSpan/Start [r7] E180826 06:00:57.374965 40653 storage/queue.go:788 [replicate,n1,s1,r5/1:/System/ts{d-e}] [n1,s1,r5/1:/System/ts{d-e}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.376253 40761 sql/event_log.go:126 [n1,intExec=disableNetTrace] Event: "set_cluster_setting", target: 0, info: {SettingName:trace.debug.enable Value:false User:root} I180826 06:00:57.383641 40768 storage/replica_command.go:289 [split,n1,s1,r7/1:/{Table/System…-Max}] initiating a split of this range at key /Table/11 [r8] E180826 06:00:57.384038 40783 storage/queue.go:788 [replicate,n1,s1,r6/1:/{System/tse-Table/System…}] 
[n1,s1,r6/1:/{System/tse-Table/System…}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.391374 40809 storage/replica_command.go:289 [split,n1,s1,r8/1:/{Table/11-Max}] initiating a split of this range at key /Table/12 [r9] E180826 06:00:57.394163 40817 storage/queue.go:788 [replicate,n1,s1,r7/1:/Table/{SystemCon…-11}] [n1,s1,r7/1:/Table/{SystemCon…-11}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.397410 40798 storage/replica_command.go:289 [split,n1,s1,r9/1:/{Table/12-Max}] initiating a split of this range at key /Table/13 [r10] E180826 06:00:57.397703 40065 storage/queue.go:788 [replicate,n1,s1,r8/1:/Table/1{1-2}] [n1,s1,r8/1:/Table/1{1-2}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.403386 40855 storage/replica_command.go:289 [split,n1,s1,r10/1:/{Table/13-Max}] initiating a split of this range at key /Table/14 [r11] E180826 06:00:57.405624 40835 storage/queue.go:788 [replicate,n1,s1,r9/1:/Table/1{2-3}] [n1,s1,r9/1:/Table/1{2-3}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.411307 40793 sql/event_log.go:126 [n1,intExec=initializeClusterSecret] Event: "set_cluster_setting", target: 0, info: {SettingName:cluster.secret Value:4476f0d6-f112-40ff-8471-254e32ed6583 User:root} E180826 06:00:57.431666 40655 storage/queue.go:788 [replicate,n1,s1,r10/1:/Table/1{3-4}] [n1,s1,r10/1:/Table/1{3-4}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.432931 40842 sql/event_log.go:126 [n1,intExec=create-default-db] Event: "create_database", target: 50, info: {DatabaseName:defaultdb Statement:CREATE DATABASE IF NOT EXISTS defaultdb User:root} I180826 06:00:57.437904 40862 storage/replica_command.go:289 [split,n1,s1,r11/1:/{Table/14-Max}] initiating a split of this range at key /Table/15 [r12] I180826 06:00:57.438357 40729 sql/event_log.go:126 [n1,intExec=create-default-db] Event: "create_database", target: 51, info: 
{DatabaseName:postgres Statement:CREATE DATABASE IF NOT EXISTS postgres User:root} I180826 06:00:57.449891 40136 server/server.go:1594 [n1] done ensuring all necessary migrations have run I180826 06:00:57.449925 40136 server/server.go:1597 [n1] serving sql connections E180826 06:00:57.452312 40828 storage/queue.go:788 [replicate,n1,s1,r11/1:/Table/1{4-5}] [n1,s1,r11/1:/Table/1{4-5}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.454760 40822 server/server_update.go:67 [n1] no need to upgrade, cluster already at the newest version I180826 06:00:57.455354 40824 sql/event_log.go:126 [n1] Event: "node_join", target: 1, info: {Descriptor:{NodeID:1 Address:{NetworkField:tcp AddressField:127.0.0.1:36435} Attrs: Locality: ServerVersion:2.0-12 BuildTag:v2.1.0-alpha.20180702-1991-g9ee43b3 StartedAt:1535263257277259329 LocalityAddress:[]} ClusterID:21e63dc4-b41a-4623-ba8d-16c23bcaa065 StartedAt:1535263257277259329 LastUp:1535263257277259329} E180826 06:00:58.282101 40833 storage/queue.go:788 [replicate,n1,s1,r1/1:/{Min-System/}] [n1,s1,r1/1:/{Min-System/}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:58.295916 40879 rpc/nodedialer/nodedialer.go:92 [consistencyChecker,n1,s1,r1/1:/{Min-System/}] connection to n1 established E180826 06:00:59.282710 40949 storage/queue.go:788 [replicate,n1,s1,r4/1:/System/{NodeLive…-tsd}] [n1,s1,r4/1:/System/{NodeLive…-tsd}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:00.282655 40968 storage/queue.go:788 [replicate,n1,s1,r3/1:/System/NodeLiveness{-Max}] [n1,s1,r3/1:/System/NodeLiveness{-Max}]: unable to add replica (n1,s2):?; node already has a replica E180826 06:01:01.285832 40969 storage/queue.go:788 [replicate,n1,s1,r5/1:/System/ts{d-e}] [n1,s1,r5/1:/System/ts{d-e}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:02.283758 40953 storage/queue.go:788 [replicate,n1,s1,r8/1:/Table/1{1-2}] [n1,s1,r8/1:/Table/1{1-2}]: unable to add 
replica (n1,s3):?; node already has a replica E180826 06:01:03.285839 40989 storage/queue.go:788 [replicate,n1,s1,r6/1:/{System/tse-Table/System…}] [n1,s1,r6/1:/{System/tse-Table/System…}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:04.289085 40997 storage/queue.go:788 [replicate,n1,s1,r9/1:/Table/1{2-3}] [n1,s1,r9/1:/Table/1{2-3}]: unable to add replica (n1,s2):?; node already has a replica E180826 06:01:05.294381 40971 storage/queue.go:788 [replicate,n1,s1,r2/1:/System/{-NodeLive…}] [n1,s1,r2/1:/System/{-NodeLive…}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:06.293716 40907 storage/queue.go:788 [replicate,n1,s1,r1/1:/{Min-System/}] [n1,s1,r1/1:/{Min-System/}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:01:07.281804 40678 server/status/runtime.go:433 [n1] runtime stats: 244 MiB RSS, 456 goroutines, 17 MiB/94 MiB/130 MiB GO alloc/idle/total, 16 MiB/50 MiB CGO alloc/total, 0.00cgo/sec, 0.00/0.00 %(u/s)time, 0.00 %gc (319x) I180826 06:01:07.294363 40466 storage/replica_proposal.go:214 [n1,s1,r7/1:/Table/{SystemCon…-11}] new range lease repl=(n1,s1):1 seq=3 start=1535263257.278406743,0 epo=1 pro=1535263267.294018498,0 following repl=(n1,s1):1 seq=2 start=1535263257.278406743,0 exp=1535263266.279738737,0 pro=1535263257.279756134,0 E180826 06:01:07.295583 40999 storage/queue.go:788 [replicate,n1,s1,r7/1:/Table/{SystemCon…-11}] [n1,s1,r7/1:/Table/{SystemCon…-11}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:01:07.309968 40680 server/status/recorder.go:652 [n1,summaries] available memory from cgroups (8.0 EiB) exceeds system memory 16 GiB, using system memory I180826 06:01:07.319062 40468 storage/replica_proposal.go:214 [n1,s1,r5/1:/System/ts{d-e}] new range lease repl=(n1,s1):1 seq=3 start=1535263257.278406743,0 epo=1 pro=1535263267.318714189,0 following repl=(n1,s1):1 seq=2 start=1535263257.278406743,0 exp=1535263266.279738737,0 
pro=1535263257.279756134,0 I180826 06:01:07.325781 40471 storage/replica_proposal.go:214 [n1,s1,r4/1:/System/{NodeLive…-tsd}] new range lease repl=(n1,s1):1 seq=3 start=1535263257.278406743,0 epo=1 pro=1535263267.325084997,0 following repl=(n1,s1):1 seq=2 start=1535263257.278406743,0 exp=1535263266.279738737,0 pro=1535263257.279756134,0 --- FAIL: TestNodeStatusResponse (10.32s) test_server_shim.go:176: had 12 ranges at startup, expected 22 ```
- index: 1.0
server: TestNodeStatusResponse failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/9ee43b32c83db6418070f92d9d698c1890d80b9e Parameters: ``` TAGS= GOFLAGS= ``` To repro, try: ``` # Don't forget to check out a clean suitable branch and experiment with the # stress invocation until the desired results present themselves. For example, # using stressrace instead of stress and passing the '-p' stressflag which # controls concurrency. ./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh cd ~/go/src/github.com/cockroachdb/cockroach && \ make stress TESTS=TestNodeStatusResponse PKG=github.com/cockroachdb/cockroach/pkg/server TESTTIMEOUT=5m STRESSFLAGS='-stderr=false -maxtime 20m -timeout 10m' ``` Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=861348&tab=buildLog ``` === RUN TestNodeStatusResponse W180826 06:00:57.241641 40136 server/status/runtime.go:294 [n?] Could not parse build timestamp: parsing time "" as "2006/01/02 15:04:05": cannot parse "" as "2006" I180826 06:00:57.253833 40136 server/server.go:830 [n?] monitoring forward clock jumps based on server.clock.forward_jump_check_enabled I180826 06:00:57.254595 40136 base/addr_validation.go:260 [n?] server certificate addresses: IP=127.0.0.1,::1; DNS=localhost,*.local; CN=node I180826 06:00:57.254651 40136 base/addr_validation.go:300 [n?] web UI certificate addresses: IP=127.0.0.1,::1; DNS=localhost,*.local; CN=node I180826 06:00:57.272264 40136 server/config.go:496 [n?] 3 storage engines initialized I180826 06:00:57.272318 40136 server/config.go:499 [n?] RocksDB cache size: 128 MiB I180826 06:00:57.272335 40136 server/config.go:499 [n?] store 0: in-memory, size 0 B I180826 06:00:57.272350 40136 server/config.go:499 [n?] store 1: in-memory, size 0 B I180826 06:00:57.272364 40136 server/config.go:499 [n?] store 2: in-memory, size 0 B I180826 06:00:57.277074 40136 server/node.go:373 [n?] 
**** cluster 21e63dc4-b41a-4623-ba8d-16c23bcaa065 has been created I180826 06:00:57.277103 40136 server/server.go:1401 [n?] **** add additional nodes by specifying --join=127.0.0.1:36435 I180826 06:00:57.277314 40136 gossip/gossip.go:382 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:36435" > attrs:<> locality:<> ServerVersion:<major_val:2 minor_val:0 patch:0 unstable:12 > build_tag:"v2.1.0-alpha.20180702-1991-g9ee43b3" started_at:1535263257277259329 I180826 06:00:57.278686 40136 storage/store.go:1541 [n1,s1] [n1,s1]: failed initial metrics computation: [n1,s1]: system config not yet available I180826 06:00:57.278769 40136 server/node.go:476 [n1] initialized store [n1,s1]: disk (capacity=512 MiB, available=512 MiB, used=0 B, logicalBytes=7.0 KiB), ranges=1, leases=0, queries=0.00, writes=0.00, bytesPerReplica={p10=7139.00 p25=7139.00 p50=7139.00 p75=7139.00 p90=7139.00 pMax=7139.00}, writesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00} I180826 06:00:57.278922 40136 storage/store.go:1541 [n1,s2] [n1,s2]: failed initial metrics computation: [n1,s2]: system config not yet available I180826 06:00:57.278967 40136 server/node.go:476 [n1] initialized store [n1,s2]: disk (capacity=512 MiB, available=512 MiB, used=0 B, logicalBytes=0 B), ranges=0, leases=0, queries=0.00, writes=0.00, bytesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00}, writesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00} I180826 06:00:57.279112 40136 storage/store.go:1541 [n1,s3] [n1,s3]: failed initial metrics computation: [n1,s3]: system config not yet available I180826 06:00:57.279150 40136 server/node.go:476 [n1] initialized store [n1,s3]: disk (capacity=512 MiB, available=512 MiB, used=0 B, logicalBytes=0 B), ranges=0, leases=0, queries=0.00, writes=0.00, bytesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 pMax=0.00}, writesPerReplica={p10=0.00 p25=0.00 p50=0.00 p75=0.00 p90=0.00 
pMax=0.00} I180826 06:00:57.279199 40136 storage/stores.go:242 [n1] read 0 node addresses from persistent storage I180826 06:00:57.279289 40136 server/node.go:697 [n1] connecting to gossip network to verify cluster ID... I180826 06:00:57.280071 40136 server/node.go:722 [n1] node connected via gossip and verified as part of cluster "21e63dc4-b41a-4623-ba8d-16c23bcaa065" I180826 06:00:57.280112 40136 server/node.go:546 [n1] node=1: started with [<no-attributes>=<in-mem> <no-attributes>=<in-mem> <no-attributes>=<in-mem>] engine(s) and attributes [] I180826 06:00:57.280321 40136 server/status/recorder.go:652 [n1] available memory from cgroups (8.0 EiB) exceeds system memory 16 GiB, using system memory I180826 06:00:57.280347 40136 server/server.go:1807 [n1] Could not start heap profiler worker due to: directory to store profiles could not be determined I180826 06:00:57.280411 40136 server/server.go:1538 [n1] starting https server at 127.0.0.1:41543 (use: 127.0.0.1:41543) I180826 06:00:57.280443 40136 server/server.go:1540 [n1] starting grpc/postgres server at 127.0.0.1:36435 I180826 06:00:57.280456 40136 server/server.go:1541 [n1] advertising CockroachDB node at 127.0.0.1:36435 I180826 06:00:57.281665 40680 server/status/recorder.go:652 [n1,summaries] available memory from cgroups (8.0 EiB) exceeds system memory 16 GiB, using system memory I180826 06:00:57.284490 40719 storage/replica_command.go:289 [split,n1,s1,r1/1:/M{in-ax}] initiating a split of this range at key /System/"" [r2] I180826 06:00:57.320900 40189 storage/replica_command.go:289 [split,n1,s1,r2/1:/{System/-Max}] initiating a split of this range at key /System/NodeLiveness [r3] I180826 06:00:57.326793 40739 storage/replica_command.go:289 [split,n1,s1,r3/1:/{System/NodeL…-Max}] initiating a split of this range at key /System/NodeLivenessMax [r4] W180826 06:00:57.333648 40746 storage/intent_resolver.go:668 [n1,s1] failed to push during intent resolution: failed to push "split" id=dd8ac196 
key=/Local/Range/System/NodeLiveness/RangeDescriptor rw=true pri=0.02265368 iso=SERIALIZABLE stat=PENDING epo=0 ts=1535263257.327385511,0 orig=1535263257.327385511,0 max=1535263257.327385511,0 wto=false rop=false seq=1 I180826 06:00:57.336015 40703 storage/replica_command.go:289 [split,n1,s1,r4/1:/{System/NodeL…-Max}] initiating a split of this range at key /System/tsd [r5] E180826 06:00:57.336538 40748 storage/queue.go:788 [replicate,n1,s1,r3/1:/System/NodeLiveness{-Max}] [n1,s1,r3/1:/System/NodeLiveness{-Max}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.342205 40647 storage/replica_command.go:289 [split,n1,s1,r5/1:/{System/tsd-Max}] initiating a split of this range at key /System/"tse" [r6] E180826 06:00:57.343182 40150 storage/queue.go:788 [replicate,n1,s1,r4/1:/System/{NodeLive…-tsd}] [n1,s1,r4/1:/System/{NodeLive…-tsd}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.348644 40757 sql/event_log.go:126 [n1,intExec=optInToDiagnosticsStatReporting] Event: "set_cluster_setting", target: 0, info: {SettingName:diagnostics.reporting.enabled Value:true User:root} I180826 06:00:57.359991 40157 sql/event_log.go:126 [n1,intExec=set-setting] Event: "set_cluster_setting", target: 0, info: {SettingName:version Value:2.0-12 User:root} I180826 06:00:57.373114 40778 storage/replica_command.go:289 [split,n1,s1,r6/1:/{System/tse-Max}] initiating a split of this range at key /Table/SystemConfigSpan/Start [r7] E180826 06:00:57.374965 40653 storage/queue.go:788 [replicate,n1,s1,r5/1:/System/ts{d-e}] [n1,s1,r5/1:/System/ts{d-e}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.376253 40761 sql/event_log.go:126 [n1,intExec=disableNetTrace] Event: "set_cluster_setting", target: 0, info: {SettingName:trace.debug.enable Value:false User:root} I180826 06:00:57.383641 40768 storage/replica_command.go:289 [split,n1,s1,r7/1:/{Table/System…-Max}] initiating a split of this range at key /Table/11 [r8] 
E180826 06:00:57.384038 40783 storage/queue.go:788 [replicate,n1,s1,r6/1:/{System/tse-Table/System…}] [n1,s1,r6/1:/{System/tse-Table/System…}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.391374 40809 storage/replica_command.go:289 [split,n1,s1,r8/1:/{Table/11-Max}] initiating a split of this range at key /Table/12 [r9] E180826 06:00:57.394163 40817 storage/queue.go:788 [replicate,n1,s1,r7/1:/Table/{SystemCon…-11}] [n1,s1,r7/1:/Table/{SystemCon…-11}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.397410 40798 storage/replica_command.go:289 [split,n1,s1,r9/1:/{Table/12-Max}] initiating a split of this range at key /Table/13 [r10] E180826 06:00:57.397703 40065 storage/queue.go:788 [replicate,n1,s1,r8/1:/Table/1{1-2}] [n1,s1,r8/1:/Table/1{1-2}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.403386 40855 storage/replica_command.go:289 [split,n1,s1,r10/1:/{Table/13-Max}] initiating a split of this range at key /Table/14 [r11] E180826 06:00:57.405624 40835 storage/queue.go:788 [replicate,n1,s1,r9/1:/Table/1{2-3}] [n1,s1,r9/1:/Table/1{2-3}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.411307 40793 sql/event_log.go:126 [n1,intExec=initializeClusterSecret] Event: "set_cluster_setting", target: 0, info: {SettingName:cluster.secret Value:4476f0d6-f112-40ff-8471-254e32ed6583 User:root} E180826 06:00:57.431666 40655 storage/queue.go:788 [replicate,n1,s1,r10/1:/Table/1{3-4}] [n1,s1,r10/1:/Table/1{3-4}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:00:57.432931 40842 sql/event_log.go:126 [n1,intExec=create-default-db] Event: "create_database", target: 50, info: {DatabaseName:defaultdb Statement:CREATE DATABASE IF NOT EXISTS defaultdb User:root} I180826 06:00:57.437904 40862 storage/replica_command.go:289 [split,n1,s1,r11/1:/{Table/14-Max}] initiating a split of this range at key /Table/15 [r12] I180826 06:00:57.438357 40729 
sql/event_log.go:126 [n1,intExec=create-default-db] Event: "create_database", target: 51, info: {DatabaseName:postgres Statement:CREATE DATABASE IF NOT EXISTS postgres User:root} I180826 06:00:57.449891 40136 server/server.go:1594 [n1] done ensuring all necessary migrations have run I180826 06:00:57.449925 40136 server/server.go:1597 [n1] serving sql connections E180826 06:00:57.452312 40828 storage/queue.go:788 [replicate,n1,s1,r11/1:/Table/1{4-5}] [n1,s1,r11/1:/Table/1{4-5}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:57.454760 40822 server/server_update.go:67 [n1] no need to upgrade, cluster already at the newest version I180826 06:00:57.455354 40824 sql/event_log.go:126 [n1] Event: "node_join", target: 1, info: {Descriptor:{NodeID:1 Address:{NetworkField:tcp AddressField:127.0.0.1:36435} Attrs: Locality: ServerVersion:2.0-12 BuildTag:v2.1.0-alpha.20180702-1991-g9ee43b3 StartedAt:1535263257277259329 LocalityAddress:[]} ClusterID:21e63dc4-b41a-4623-ba8d-16c23bcaa065 StartedAt:1535263257277259329 LastUp:1535263257277259329} E180826 06:00:58.282101 40833 storage/queue.go:788 [replicate,n1,s1,r1/1:/{Min-System/}] [n1,s1,r1/1:/{Min-System/}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:00:58.295916 40879 rpc/nodedialer/nodedialer.go:92 [consistencyChecker,n1,s1,r1/1:/{Min-System/}] connection to n1 established E180826 06:00:59.282710 40949 storage/queue.go:788 [replicate,n1,s1,r4/1:/System/{NodeLive…-tsd}] [n1,s1,r4/1:/System/{NodeLive…-tsd}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:00.282655 40968 storage/queue.go:788 [replicate,n1,s1,r3/1:/System/NodeLiveness{-Max}] [n1,s1,r3/1:/System/NodeLiveness{-Max}]: unable to add replica (n1,s2):?; node already has a replica E180826 06:01:01.285832 40969 storage/queue.go:788 [replicate,n1,s1,r5/1:/System/ts{d-e}] [n1,s1,r5/1:/System/ts{d-e}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:02.283758 40953 
storage/queue.go:788 [replicate,n1,s1,r8/1:/Table/1{1-2}] [n1,s1,r8/1:/Table/1{1-2}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:03.285839 40989 storage/queue.go:788 [replicate,n1,s1,r6/1:/{System/tse-Table/System…}] [n1,s1,r6/1:/{System/tse-Table/System…}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:04.289085 40997 storage/queue.go:788 [replicate,n1,s1,r9/1:/Table/1{2-3}] [n1,s1,r9/1:/Table/1{2-3}]: unable to add replica (n1,s2):?; node already has a replica E180826 06:01:05.294381 40971 storage/queue.go:788 [replicate,n1,s1,r2/1:/System/{-NodeLive…}] [n1,s1,r2/1:/System/{-NodeLive…}]: unable to add replica (n1,s3):?; node already has a replica E180826 06:01:06.293716 40907 storage/queue.go:788 [replicate,n1,s1,r1/1:/{Min-System/}] [n1,s1,r1/1:/{Min-System/}]: unable to add replica (n1,s3):?; node already has a replica I180826 06:01:07.281804 40678 server/status/runtime.go:433 [n1] runtime stats: 244 MiB RSS, 456 goroutines, 17 MiB/94 MiB/130 MiB GO alloc/idle/total, 16 MiB/50 MiB CGO alloc/total, 0.00cgo/sec, 0.00/0.00 %(u/s)time, 0.00 %gc (319x) I180826 06:01:07.294363 40466 storage/replica_proposal.go:214 [n1,s1,r7/1:/Table/{SystemCon…-11}] new range lease repl=(n1,s1):1 seq=3 start=1535263257.278406743,0 epo=1 pro=1535263267.294018498,0 following repl=(n1,s1):1 seq=2 start=1535263257.278406743,0 exp=1535263266.279738737,0 pro=1535263257.279756134,0 E180826 06:01:07.295583 40999 storage/queue.go:788 [replicate,n1,s1,r7/1:/Table/{SystemCon…-11}] [n1,s1,r7/1:/Table/{SystemCon…-11}]: unable to add replica (n1,s2):?; node already has a replica I180826 06:01:07.309968 40680 server/status/recorder.go:652 [n1,summaries] available memory from cgroups (8.0 EiB) exceeds system memory 16 GiB, using system memory I180826 06:01:07.319062 40468 storage/replica_proposal.go:214 [n1,s1,r5/1:/System/ts{d-e}] new range lease repl=(n1,s1):1 seq=3 start=1535263257.278406743,0 epo=1 pro=1535263267.318714189,0 following 
repl=(n1,s1):1 seq=2 start=1535263257.278406743,0 exp=1535263266.279738737,0 pro=1535263257.279756134,0 I180826 06:01:07.325781 40471 storage/replica_proposal.go:214 [n1,s1,r4/1:/System/{NodeLive…-tsd}] new range lease repl=(n1,s1):1 seq=3 start=1535263257.278406743,0 epo=1 pro=1535263267.325084997,0 following repl=(n1,s1):1 seq=2 start=1535263257.278406743,0 exp=1535263266.279738737,0 pro=1535263257.279756134,0 --- FAIL: TestNodeStatusResponse (10.32s) test_server_shim.go:176: had 12 ranges at startup, expected 22 ```
test
server testnodestatusresponse failed under stress sha parameters tags goflags to repro try don t forget to check out a clean suitable branch and experiment with the stress invocation until the desired results present themselves for example using stressrace instead of stress and passing the p stressflag which controls concurrency scripts gceworker sh start scripts gceworker sh mosh cd go src github com cockroachdb cockroach make stress tests testnodestatusresponse pkg github com cockroachdb cockroach pkg server testtimeout stressflags stderr false maxtime timeout failed test run testnodestatusresponse server status runtime go could not parse build timestamp parsing time as cannot parse as server server go monitoring forward clock jumps based on server clock forward jump check enabled base addr validation go server certificate addresses ip dns localhost local cn node base addr validation go web ui certificate addresses ip dns localhost local cn node server config go storage engines initialized server config go rocksdb cache size mib server config go store in memory size b server config go store in memory size b server config go store in memory size b server node go cluster has been created server server go add additional nodes by specifying join gossip gossip go nodedescriptor set to node id address attrs locality serverversion build tag alpha started at storage store go failed initial metrics computation system config not yet available server node go initialized store disk capacity mib available mib used b logicalbytes kib ranges leases queries writes bytesperreplica pmax writesperreplica pmax storage store go failed initial metrics computation system config not yet available server node go initialized store disk capacity mib available mib used b logicalbytes b ranges leases queries writes bytesperreplica pmax writesperreplica pmax storage store go failed initial metrics computation system config not yet available server node go initialized store disk capacity mib 
available mib used b logicalbytes b ranges leases queries writes bytesperreplica pmax writesperreplica pmax storage stores go read node addresses from persistent storage server node go connecting to gossip network to verify cluster id server node go node connected via gossip and verified as part of cluster server node go node started with engine s and attributes server status recorder go available memory from cgroups eib exceeds system memory gib using system memory server server go could not start heap profiler worker due to directory to store profiles could not be determined server server go starting https server at use server server go starting grpc postgres server at server server go advertising cockroachdb node at server status recorder go available memory from cgroups eib exceeds system memory gib using system memory storage replica command go initiating a split of this range at key system storage replica command go initiating a split of this range at key system nodeliveness storage replica command go initiating a split of this range at key system nodelivenessmax storage intent resolver go failed to push during intent resolution failed to push split id key local range system nodeliveness rangedescriptor rw true pri iso serializable stat pending epo ts orig max wto false rop false seq storage replica command go initiating a split of this range at key system tsd storage queue go unable to add replica node already has a replica storage replica command go initiating a split of this range at key system tse storage queue go unable to add replica node already has a replica sql event log go event set cluster setting target info settingname diagnostics reporting enabled value true user root sql event log go event set cluster setting target info settingname version value user root storage replica command go initiating a split of this range at key table systemconfigspan start storage queue go unable to add replica node already has a replica sql event log go event set 
cluster setting target info settingname trace debug enable value false user root storage replica command go initiating a split of this range at key table storage queue go unable to add replica node already has a replica storage replica command go initiating a split of this range at key table storage queue go unable to add replica node already has a replica storage replica command go initiating a split of this range at key table storage queue go unable to add replica node already has a replica storage replica command go initiating a split of this range at key table storage queue go unable to add replica node already has a replica sql event log go event set cluster setting target info settingname cluster secret value user root storage queue go unable to add replica node already has a replica sql event log go event create database target info databasename defaultdb statement create database if not exists defaultdb user root storage replica command go initiating a split of this range at key table sql event log go event create database target info databasename postgres statement create database if not exists postgres user root server server go done ensuring all necessary migrations have run server server go serving sql connections storage queue go unable to add replica node already has a replica server server update go no need to upgrade cluster already at the newest version sql event log go event node join target info descriptor nodeid address networkfield tcp addressfield attrs locality serverversion buildtag alpha startedat localityaddress clusterid startedat lastup storage queue go unable to add replica node already has a replica rpc nodedialer nodedialer go connection to established storage queue go unable to add replica node already has a replica storage queue go unable to add replica node already has a replica storage queue go unable to add replica node already has a replica storage queue go unable to add replica node already has a replica storage queue go unable 
to add replica node already has a replica storage queue go unable to add replica node already has a replica storage queue go unable to add replica node already has a replica storage queue go unable to add replica node already has a replica server status runtime go runtime stats mib rss goroutines mib mib mib go alloc idle total mib mib cgo alloc total sec u s time gc storage replica proposal go new range lease repl seq start epo pro following repl seq start exp pro storage queue go unable to add replica node already has a replica server status recorder go available memory from cgroups eib exceeds system memory gib using system memory storage replica proposal go new range lease repl seq start epo pro following repl seq start exp pro storage replica proposal go new range lease repl seq start epo pro following repl seq start exp pro fail testnodestatusresponse test server shim go had ranges at startup expected
1
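Across the records in this dump, the string label column maps directly onto the trailing binary column (`test` → 1, `non_test` → 0). A minimal sketch of that mapping, inferred from the rows above rather than taken from the dataset's actual build code:

```python
def binarize_label(label: str) -> int:
    """Map the dataset's string label to its binary column.

    Inferred from the records in this dump: rows labeled 'test' carry a
    trailing binary value of 1, rows labeled 'non_test' carry 0.
    """
    return 1 if label == "test" else 0


# Examples matching the surrounding records:
print(binarize_label("test"))      # 1
print(binarize_label("non_test"))  # 0
```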
8,007
3,126,772,027
IssuesEvent
2015-09-08 11:20:11
arduino/Arduino
https://api.github.com/repos/arduino/Arduino
closed
Documentation error for SoftwareSerial - inverse_logic parameter not mentioned
Component: Documentation Waiting for feedback
On the documentation for the SoftwareSerial constructor: https://www.arduino.cc/en/Reference/SoftwareSerialConstructor > SoftwareSerial(rxPin, txPin) There is no mention of the **inverse_logic** argument. See the source: ```C++ public: // public methods SoftwareSerial(uint8_t receivePin, uint8_t transmitPin, bool inverse_logic = false); ``` This is confusing for people who read code which uses that third argument. It should be mentioned on the page as an optional argument with an explanation along the lines of: > **inverse_logic** is used to invert the sense of incoming bits (the default is normal logic). > > If set, SoftwareSerial treats a LOW (0 volts on the pin, normally) on the Rx pin as a **1**-bit (the idle state) and a HIGH (5 volts on the pin, normally) as a **0**-bit. It also affects the way that it writes to the Tx pin. > **Warning**: You should not connect devices which output serial data outside the range that the Arduino can handle, normally 0V to 5V, for a board running at 5V, and 0V to 3.3V for a board running at 3.3V.
1.0
Documentation error for SoftwareSerial - inverse_logic parameter not mentioned - On the documentation for the SoftwareSerial constructor: https://www.arduino.cc/en/Reference/SoftwareSerialConstructor > SoftwareSerial(rxPin, txPin) There is no mention of the **inverse_logic** argument. See the source: ```C++ public: // public methods SoftwareSerial(uint8_t receivePin, uint8_t transmitPin, bool inverse_logic = false); ``` This is confusing for people who read code which uses that third argument. It should be mentioned on the page as an optional argument with an explanation along the lines of: > **inverse_logic** is used to invert the sense of incoming bits (the default is normal logic). > > If set, SoftwareSerial treats a LOW (0 volts on the pin, normally) on the Rx pin as a **1**-bit (the idle state) and a HIGH (5 volts on the pin, normally) as a **0**-bit. It also affects the way that it writes to the Tx pin. > **Warning**: You should not connect devices which output serial data outside the range that the Arduino can handle, normally 0V to 5V, for a board running at 5V, and 0V to 3.3V for a board running at 3.3V.
non_test
documentation error for softwareserial inverse logic parameter not mentioned on the documentation for the softwareserial constructor softwareserial rxpin txpin there is no mention of the inverse logic argument see the source c public public methods softwareserial t receivepin t transmitpin bool inverse logic false this is confusing for people who read code which uses that third argument it should be mentioned on the page as an optional argument with an explanation along the lines of inverse logic is used to invert the sense of incoming bits the default is normal logic if set softwareserial treats a low volts on the pin normally on the rx pin as a bit the idle state and a high volts on the pin normally as a bit it also affects the way that it writes to the tx pin warning you should not connect devices which output serial data outside the range that the arduino can handle normally to for a board running at and to for a board running at
0
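The lowercased text column in each record appears to be derived from the combined title/body by a normalization pass: lowercasing, dropping digits and most punctuation, and collapsing whitespace (e.g. `SoftwareSerial(rxPin, txPin)` becomes `softwareserial rxpin txpin`). A minimal Python sketch of such a cleaner — a guess at the pipeline, not the dataset's actual preprocessing code:

```python
import re


def clean_text(raw: str) -> str:
    """Normalize an issue title/body roughly the way the dataset's
    lowercased text column appears to have been produced.

    Assumed rules (not the dataset's real code): lowercase, strip digits,
    replace punctuation with spaces, collapse runs of whitespace.
    """
    text = raw.lower()
    text = re.sub(r"\d+", " ", text)        # drop numbers (IDs, timestamps)
    text = re.sub(r"[^a-z\s]", " ", text)   # drop punctuation and symbols
    return re.sub(r"\s+", " ", text).strip()


print(clean_text("SoftwareSerial(rxPin, txPin)"))  # softwareserial rxpin txpin
```

Note the real column preserves some non-ASCII symbols (e.g. `⚠` survives in a later record), so the actual pipeline is likely slightly more permissive than this sketch.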
4,081
2,702,741,973
IssuesEvent
2015-04-06 11:56:04
jvalanen/diomber
https://api.github.com/repos/jvalanen/diomber
closed
Thumbnail attribute to diory
3 - Under testing / done
<!--- @huboard:{"order":4.75,"milestone_order":24,"custom_state":""} -->
1.0
Thumbnail attribute to diory - <!--- @huboard:{"order":4.75,"milestone_order":24,"custom_state":""} -->
test
thumbnail attribute to diory huboard order milestone order custom state
1
281,666
24,411,689,541
IssuesEvent
2022-10-05 12:55:11
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Failing test: Jest Tests.x-pack/plugins/synthetics/public/legacy_uptime/components/overview/alerts/monitor_status_alert - alert monitor status component AlertMonitorStatus passes default props to children
blocker failed-test Team:uptime skipped-test v8.3.0
A test failed on a tracked branch ``` Error: Autocomplete was not set. at get (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/src/plugins/kibana_utils/common/create_getter_setter.ts:20:13) at QueryStringInputUI.getSuggestions (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/src/plugins/unified_search/public/query_string_input/query_string_input.tsx:203:33) at QueryStringInputUI.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/src/plugins/unified_search/public/query_string_input/query_string_input.tsx:263:37) at invokeFunc (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/lodash/lodash.js:10401:23) at trailingEdge (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/lodash/lodash.js:10450:18) at timerExpired (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/lodash/lodash.js:10438:18) at Timeout.task [as _onTimeout] (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/jest-environment-jsdom/node_modules/jsdom/lib/jsdom/browser/Window.js:391:19) at listOnTimeout (node:internal/timers:559:17) at processTimers (node:internal/timers:502:7) ``` First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/16619#01811a70-ad66-4e45-8c83-a9da61c4dcf6) <!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.x-pack/plugins/synthetics/public/legacy_uptime/components/overview/alerts/monitor_status_alert","test.name":"alert monitor status component AlertMonitorStatus passes default props to children","test.failCount":1}} -->
2.0
Failing test: Jest Tests.x-pack/plugins/synthetics/public/legacy_uptime/components/overview/alerts/monitor_status_alert - alert monitor status component AlertMonitorStatus passes default props to children - A test failed on a tracked branch ``` Error: Autocomplete was not set. at get (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/src/plugins/kibana_utils/common/create_getter_setter.ts:20:13) at QueryStringInputUI.getSuggestions (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/src/plugins/unified_search/public/query_string_input/query_string_input.tsx:203:33) at QueryStringInputUI.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/src/plugins/unified_search/public/query_string_input/query_string_input.tsx:263:37) at invokeFunc (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/lodash/lodash.js:10401:23) at trailingEdge (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/lodash/lodash.js:10450:18) at timerExpired (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/lodash/lodash.js:10438:18) at Timeout.task [as _onTimeout] (/var/lib/buildkite-agent/builds/kb-n2-4-spot-d815f963871e913c/elastic/kibana-on-merge/kibana/node_modules/jest-environment-jsdom/node_modules/jsdom/lib/jsdom/browser/Window.js:391:19) at listOnTimeout (node:internal/timers:559:17) at processTimers (node:internal/timers:502:7) ``` First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/16619#01811a70-ad66-4e45-8c83-a9da61c4dcf6) <!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.x-pack/plugins/synthetics/public/legacy_uptime/components/overview/alerts/monitor_status_alert","test.name":"alert monitor status component AlertMonitorStatus passes default 
props to children","test.failCount":1}} -->
test
failing test jest tests x pack plugins synthetics public legacy uptime components overview alerts monitor status alert alert monitor status component alertmonitorstatus passes default props to children a test failed on a tracked branch error autocomplete was not set at get var lib buildkite agent builds kb spot elastic kibana on merge kibana src plugins kibana utils common create getter setter ts at querystringinputui getsuggestions var lib buildkite agent builds kb spot elastic kibana on merge kibana src plugins unified search public query string input query string input tsx at querystringinputui var lib buildkite agent builds kb spot elastic kibana on merge kibana src plugins unified search public query string input query string input tsx at invokefunc var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules lodash lodash js at trailingedge var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules lodash lodash js at timerexpired var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules lodash lodash js at timeout task var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest environment jsdom node modules jsdom lib jsdom browser window js at listontimeout node internal timers at processtimers node internal timers first failure
1
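The Kibana record above embeds a machine-readable HTML comment (`kibanaCiData = {...}`) carrying the failing test's class, name, and fail count. A hypothetical helper for pulling that JSON out of an issue body — the function name and regex are assumptions for illustration, not Kibana tooling:

```python
import json
import re


def extract_ci_data(body: str):
    """Extract the embedded kibanaCiData JSON from a Kibana failed-test
    issue body, as seen in the record above. Hypothetical helper; returns
    None when no kibanaCiData comment is present."""
    match = re.search(r"kibanaCiData\s*=\s*(\{.*\})", body, re.DOTALL)
    return json.loads(match.group(1)) if match else None


body = '<!-- kibanaCiData = {"failed-test":{"test.failCount":1}} -->'
print(extract_ci_data(body)["failed-test"]["test.failCount"])  # 1
```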
114,709
9,747,305,253
IssuesEvent
2019-06-03 14:08:28
MicrosoftDocs/visualstudio-docs
https://api.github.com/repos/MicrosoftDocs/visualstudio-docs
closed
So how does one make a webtest?
P2 area - test doc-bug support-request visual-studio-dev15/prod
Trying to add a recording on WebTest1.webtest of opening a page, logging in, and clicking a menu item never adds anything to webtest1.webtest. I dont see a link here that explains what is supposed to happen. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: d0780169-4b7b-97ce-5ccd-9f962472a2c8 * Version Independent ID: f0c68169-baeb-cba0-ccf0-a149fa96bb67 * Content: [Create a web performance and load test project in Visual Studio - Visual Studio](https://docs.microsoft.com/en-us/visualstudio/test/quickstart-create-a-load-test-project?view=vs-2017) * Content Source: [docs/test/quickstart-create-a-load-test-project.md](https://github.com/MicrosoftDocs/visualstudio-docs/blob/master/docs/test/quickstart-create-a-load-test-project.md) * Product: **visual-studio-dev15** * GitHub Login: @gewarren * Microsoft Alias: **gewarren**
1.0
So how does one make a webtest? - Trying to add a recording on WebTest1.webtest of opening a page, logging in, and clicking a menu item never adds anything to webtest1.webtest. I dont see a link here that explains what is supposed to happen. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: d0780169-4b7b-97ce-5ccd-9f962472a2c8 * Version Independent ID: f0c68169-baeb-cba0-ccf0-a149fa96bb67 * Content: [Create a web performance and load test project in Visual Studio - Visual Studio](https://docs.microsoft.com/en-us/visualstudio/test/quickstart-create-a-load-test-project?view=vs-2017) * Content Source: [docs/test/quickstart-create-a-load-test-project.md](https://github.com/MicrosoftDocs/visualstudio-docs/blob/master/docs/test/quickstart-create-a-load-test-project.md) * Product: **visual-studio-dev15** * GitHub Login: @gewarren * Microsoft Alias: **gewarren**
test
so how does one make a webtest trying to add a recording on webtest of opening a page logging in and clicking a menu item never adds anything to webtest i dont see a link here that explains what is supposed to happen document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id baeb content content source product visual studio github login gewarren microsoft alias gewarren
1
78,829
22,441,550,177
IssuesEvent
2022-06-21 01:54:58
dotnet/arcade
https://api.github.com/repos/dotnet/arcade
opened
Build failed: dotnet-arcade-validation-official/main #20220620.4
Build Failed
Build [#20220620.4](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_build/results?buildId=1835467) partiallySucceeded ## :warning: : internal / dotnet-arcade-validation-official partiallySucceeded ### Summary **Finished** - Tue, 21 Jun 2022 01:54:45 GMT **Duration** - 107 minutes **Requested for** - Microsoft.VisualStudio.Services.TFS **Reason** - schedule ### Details #### Promote Arcade to '.NET Eng - Latest' channel - :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/1835467/logs/381) - The latest build on 'main' branch for the 'installer' repository was not successful. ### Changes - [35bdcba3](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/017fb734-e4b4-4cc1-a90f-98a09ac25cd5/commit/35bdcba385f71dc9db7b44317e2a59bd7149f98c) - dotnet-maestro[bot] - [main] Update dependencies from dotnet/arcade (#3179) - [1c36e92a](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/017fb734-e4b4-4cc1-a90f-98a09ac25cd5/commit/1c36e92aea67e83a6b64de5c8cfc3fe606b63188) - Matt Galbraith - Move from impendingly-removed Debian 9 queue to a dockerized Debian 10 (#3180) - [63711fa0](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/017fb734-e4b4-4cc1-a90f-98a09ac25cd5/commit/63711fa0b64d3aa3dc130147b4d641295fbf4d87) - dotnet-maestro[bot] - Update dependencies from https://github.com/dotnet/arcade build 20220617.1 (#3178)
1.0
Build failed: dotnet-arcade-validation-official/main #20220620.4 - Build [#20220620.4](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_build/results?buildId=1835467) partiallySucceeded ## :warning: : internal / dotnet-arcade-validation-official partiallySucceeded ### Summary **Finished** - Tue, 21 Jun 2022 01:54:45 GMT **Duration** - 107 minutes **Requested for** - Microsoft.VisualStudio.Services.TFS **Reason** - schedule ### Details #### Promote Arcade to '.NET Eng - Latest' channel - :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/1835467/logs/381) - The latest build on 'main' branch for the 'installer' repository was not successful. ### Changes - [35bdcba3](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/017fb734-e4b4-4cc1-a90f-98a09ac25cd5/commit/35bdcba385f71dc9db7b44317e2a59bd7149f98c) - dotnet-maestro[bot] - [main] Update dependencies from dotnet/arcade (#3179) - [1c36e92a](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/017fb734-e4b4-4cc1-a90f-98a09ac25cd5/commit/1c36e92aea67e83a6b64de5c8cfc3fe606b63188) - Matt Galbraith - Move from impendingly-removed Debian 9 queue to a dockerized Debian 10 (#3180) - [63711fa0](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/017fb734-e4b4-4cc1-a90f-98a09ac25cd5/commit/63711fa0b64d3aa3dc130147b4d641295fbf4d87) - dotnet-maestro[bot] - Update dependencies from https://github.com/dotnet/arcade build 20220617.1 (#3178)
non_test
build failed dotnet arcade validation official main build partiallysucceeded warning internal dotnet arcade validation official partiallysucceeded summary finished tue jun gmt duration minutes requested for microsoft visualstudio services tfs reason schedule details promote arcade to net eng latest channel warning the latest build on main branch for the installer repository was not successful changes dotnet maestro update dependencies from dotnet arcade matt galbraith move from impendingly removed debian queue to a dockerized debian dotnet maestro update dependencies from build
0
159,394
24,987,206,438
IssuesEvent
2022-11-02 15:53:02
GSA-TTS/FAC
https://api.github.com/repos/GSA-TTS/FAC
opened
Update Notes to SEFA page to support data entry
design copy
This issue covers the design of the Notes to SEFA page within the SF-SAC form. When our deadline was originally October, we looked for ways to simplify the MVP. One of the suggestions was to remove the Notes to SEFA page. With our deadline now stretching an extra year, we have time to implement this page in its entirety, and Census suggests we do so. See #326 and #406 for more history. ### Acceptance criteria (We'll know we're done when...) - [ ] The Notes to SEFA page allows data entry ### Tasks - [ ] Design - [ ] Peer review - [ ] Copy review - [ ] Write implementation notes ### Definition of Done - [ ] Screens in Figma are ready to be handed off to front end - [x] ~All text has a contrast ratio of 4.5:1 with the background~ (Built into USWDS) - [ ] Interaction are documented if they deviate from USWDS components - [ ] The needs of someone navigating the site with the use of a screenreader or keyboard has been considered and documented - [x] ~There is an option for multilingual support~ (We've established multilingual support in the header) - [ ] There is supporting documentation for alt text, labels for links (anchor + target), error text in forms, tab order, focus order, and what assistive technologies should announce (if applicable)
1.0
Update Notes to SEFA page to support data entry - This issue covers the design of the Notes to SEFA page within the SF-SAC form. When our deadline was originally October, we looked for ways to simplify the MVP. One of the suggestions was to remove the Notes to SEFA page. With our deadline now stretching an extra year, we have time to implement this page in its entirety, and Census suggests we do so. See #326 and #406 for more history. ### Acceptance criteria (We'll know we're done when...) - [ ] The Notes to SEFA page allows data entry ### Tasks - [ ] Design - [ ] Peer review - [ ] Copy review - [ ] Write implementation notes ### Definition of Done - [ ] Screens in Figma are ready to be handed off to front end - [x] ~All text has a contrast ratio of 4.5:1 with the background~ (Built into USWDS) - [ ] Interaction are documented if they deviate from USWDS components - [ ] The needs of someone navigating the site with the use of a screenreader or keyboard has been considered and documented - [x] ~There is an option for multilingual support~ (We've established multilingual support in the header) - [ ] There is supporting documentation for alt text, labels for links (anchor + target), error text in forms, tab order, focus order, and what assistive technologies should announce (if applicable)
non_test
update notes to sefa page to support data entry this issue covers the design of the notes to sefa page within the sf sac form when our deadline was originally october we looked for ways to simplify the mvp one of the suggestions was to remove the notes to sefa page with our deadline now stretching an extra year we have time to implement this page in its entirety and census suggests we do so see and for more history acceptance criteria we ll know we re done when the notes to sefa page allows data entry tasks design peer review copy review write implementation notes definition of done screens in figma are ready to be handed off to front end all text has a contrast ratio of with the background built into uswds interaction are documented if they deviate from uswds components the needs of someone navigating the site with the use of a screenreader or keyboard has been considered and documented there is an option for multilingual support we ve established multilingual support in the header there is supporting documentation for alt text labels for links anchor target error text in forms tab order focus order and what assistive technologies should announce if applicable
0
254,786
21,876,595,902
IssuesEvent
2022-05-19 10:43:30
zkSNACKs/WalletWasabi
https://api.github.com/repos/zkSNACKs/WalletWasabi
closed
Search bar not working on MacBook Air (M1)
debug ww2 testing
### General Description - Can't use any of the suggestions of the `Search bar`, if there is any. - If there are suggestions, like `Logs` , `Data Folder`, after clicking on it, nothing happens, but the navbar and the sidebar colour changes, like it's out of focus (see Screenshot). - Clicking into the Search bar for the first time pops up suggestions. After clicking on any suggestion, a single click does nothing (can't write in it, no suggestions). Double clicking the Search bar allows me to write init, but no suggestions are popping up. - Clicking on the 3x3 dots do nothing. Nothing in the Logs about these, unfortunately. One more thing: Yesterday when I tried to open the `Data Folder`, somehow I managed to crash the app and the OS at the same time. See `Logs` for further details. Couldn't repro this today, so I hope it's a very rare case. ### How To Reproduce? 1. Try to use the search bar on a M1 MacBook. ### Screenshots Clicking on a suggestion makes the navbar and the sidebar change colour, like it's not in focus. <img width="1440" alt="Screenshot 2022-05-18 at 13 42 01" src="https://user-images.githubusercontent.com/45069029/169031255-33d325f1-d038-4685-af73-6c3df22a992e.png"> ### Operating System macOS Monterey 12.01 ### Logs Yesterday when I tried to open the `Data Folder`, somehow I managed to crash the app and the OS at the same time. 
``` 2022-05-17 17:15:25.190 WalletWasabi.Fluent.Desktop[87023:58822432] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[NSToolbarFullScreenWindow isDialog]: unrecognized selector sent to instance 0x142923c40' *** First throw call stack: ( 0 CoreFoundation 0x0000000189cf812c __exceptionPreprocess + 240 1 libobjc.A.dylib 0x0000000189a49808 objc_exception_throw + 60 2 CoreFoundation 0x0000000189d8b100 -[NSObject(NSObject) __retain_OA] + 0 3 CoreFoundation 0x0000000189c582c0 ___forwarding___ + 1728 4 CoreFoundation 0x0000000189c57b40 _CF_forwarding_prep_0 + 96 5 libAvaloniaNative.dylib 0x00000001257b4794 -[AvnWindow canBecomeKeyWindow] + 196 6 AppKit 0x000000018ca38aec -[NSWindow _orderOutAndCalcKeyWithCounter:stillVisible:docWindow:] + 944 7 AppKit 0x000000018c81ce6c NSPerformVisuallyAtomicChange + 140 8 AppKit 0x000000018ca38628 -[NSWindow _doWindowOrderOutWithWithKeyCalc:forCounter:orderingDone:docWindow:] + 108 9 AppKit 0x000000018ca37f84 -[NSWindow _reallyDoOrderWindowOutRelativeTo:findKey:forCounter:force:isModal:] + 468 10 AppKit 0x000000018c903058 -[NSWindow _reallyDoOrderWindow:relativeTo:findKey:forCounter:force:isModal:] + 172 11 AppKit 0x000000018c902020 -[NSWindow _doOrderWindow:relativeTo:findKey:forCounter:force:isModal:] + 324 12 libAvaloniaNative.dylib 0x00000001257abcfc _ZN14WindowBaseImpl4HideEv + 104 13 libAvaloniaNative.dylib 0x00000001257abdc4 _ZTv0_n56_N14WindowBaseImpl4HideEv + 24 14 ??? 0x0000000287b1aa40 0x0 + 10866502208 15 ??? 0x0000000287b1a904 0x0 + 10866501892 16 ??? 0x0000000287b1a3c8 0x0 + 10866500552 17 ??? 0x0000000286d9c378 0x0 + 10852352888 18 ??? 0x0000000287b0de54 0x0 + 10866450004 19 ??? 0x0000000286003e3c 0x0 + 10838097468 20 ??? 0x0000000287adba78 0x0 + 10866244216 21 ??? 0x0000000285f11ab0 0x0 + 10837105328 22 ??? 0x0000000285f119d8 0x0 + 10837105112 23 ??? 0x0000000285f11904 0x0 + 10837104900 24 ??? 
0x00000002850ba6dc 0x0 + 10822067932 25 libAvaloniaNative.dylib 0x00000001257b4cc0 -[AvnWindow resignKeyWindow] + 52 26 AppKit 0x000000018c900fbc -[NSWindow _changeKeyAndMainLimitedOK:] + 1020 27 AppKit 0x000000018c9c9438 -[NSWindow(NSEventRouting) _handleMouseDownEvent:isDelayedEvent:] + 3596 28 AppKit 0x000000018c93cc78 -[NSWindow(NSEventRouting) _reallySendEvent:isDelayedEvent:] + 2444 29 AppKit 0x000000018c93c080 -[NSWindow(NSEventRouting) sendEvent:] + 348 30 libAvaloniaNative.dylib 0x00000001257b4e58 -[AvnWindow sendEvent:] + 76 31 AppKit 0x000000018c93afe4 -[NSApplication(NSEvent) sendEvent:] + 2776 32 libAvaloniaNative.dylib 0x000000012579ce5c -[AvnApplication sendEvent:] + 96 33 AppKit 0x000000018cbf3588 -[NSApplication _handleEvent:] + 76 34 AppKit 0x000000018c7bc3d8 -[NSApplication run] + 636 35 libAvaloniaNative.dylib 0x00000001257a1930 _ZN26PlatformThreadingInterface7RunLoopEP20IAvnLoopCancellation + 204 36 ??? 0x000000028559477c 0x0 + 10827155324 37 ??? 0x0000000285593644 0x0 + 10827150916 38 ??? 0x0000000285592f24 0x0 + 10827149092 39 ??? 0x0000000285592b9c 0x0 + 10827148188 40 ??? 0x00000002839aa904 0x0 + 10797885700 41 ??? 
0x0000000280dad738 0x0 + 10751760184 42 libcoreclr.dylib 0x00000001019f7c88 CallDescrWorkerInternal + 132 43 libcoreclr.dylib 0x0000000101868718 _ZN18MethodDescCallSite16CallTargetWorkerEPKmPmi + 868 44 libcoreclr.dylib 0x0000000101760294 _Z7RunMainP10MethodDescsPiPP8PtrArray + 652 45 libcoreclr.dylib 0x0000000101760580 _ZN8Assembly17ExecuteMainMethodEPP8PtrArrayi + 376 46 libcoreclr.dylib 0x000000010178e8d0 _ZN8CorHost215ExecuteAssemblyEjPKDsiPS1_Pj + 476 47 libcoreclr.dylib 0x000000010174b008 coreclr_execute_assembly + 208 48 libhostpolicy.dylib 0x000000010102de44 _Z19run_app_for_contextRK20hostpolicy_context_tiPPKc + 1056 49 libhostpolicy.dylib 0x000000010102ebb4 corehost_main + 240 50 libhostfxr.dylib 0x0000000100fb9e88 _ZN10fx_muxer_t24handle_exec_host_commandERKNSt3__112basic_stringIcNS0_11char_traitsIcEENS0_9allocatorIcEEEERK19host_startup_info_tS8_RKNS0_13unordered_mapI13known_optionsNS0_6vectorIS6_NS4_IS6_EEEE18known_options_hashNS0_8equal_toISD_EENS4_INS0_4pairIKSD_SG_EEEEEEiPPKci11host_mode_tbPciPi + 1320 51 libhostfxr.dylib 0x0000000100fb8f74 _ZN10fx_muxer_t7executeENSt3__112basic_stringIcNS0_11char_traitsIcEENS0_9allocatorIcEEEEiPPKcRK19host_startup_info_tPciPi + 856 52 libhostfxr.dylib 0x0000000100fb5ba0 hostfxr_main_startupinfo + 152 53 WalletWasabi.Fluent.Desktop 0x0000000100d25484 _Z9exe_startiPPKc + 1484 54 WalletWasabi.Fluent.Desktop 0x0000000100d256c8 main + 160 55 dyld 0x0000000100e8d0f4 start + 520 ) libc++abi: terminating with uncaught exception of type NSException ``` ### Wasabi Version Master https://github.com/zkSNACKs/WalletWasabi/commit/300c1aa9f8b92a9c344666560a1209a9dcc1b487
1.0
Search bar not working on MacBook Air (M1) - ### General Description - Can't use any of the suggestions of the `Search bar`, if there is any. - If there are suggestions, like `Logs` , `Data Folder`, after clicking on it, nothing happens, but the navbar and the sidebar colour changes, like it's out of focus (see Screenshot). - Clicking into the Search bar for the first time pops up suggestions. After clicking on any suggestion, a single click does nothing (can't write in it, no suggestions). Double clicking the Search bar allows me to write init, but no suggestions are popping up. - Clicking on the 3x3 dots do nothing. Nothing in the Logs about these, unfortunately. One more thing: Yesterday when I tried to open the `Data Folder`, somehow I managed to crash the app and the OS at the same time. See `Logs` for further details. Couldn't repro this today, so I hope it's a very rare case. ### How To Reproduce? 1. Try to use the search bar on a M1 MacBook. ### Screenshots Clicking on a suggestion makes the navbar and the sidebar change colour, like it's not in focus. <img width="1440" alt="Screenshot 2022-05-18 at 13 42 01" src="https://user-images.githubusercontent.com/45069029/169031255-33d325f1-d038-4685-af73-6c3df22a992e.png"> ### Operating System macOS Monterey 12.01 ### Logs Yesterday when I tried to open the `Data Folder`, somehow I managed to crash the app and the OS at the same time. 
``` 2022-05-17 17:15:25.190 WalletWasabi.Fluent.Desktop[87023:58822432] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[NSToolbarFullScreenWindow isDialog]: unrecognized selector sent to instance 0x142923c40' *** First throw call stack: ( 0 CoreFoundation 0x0000000189cf812c __exceptionPreprocess + 240 1 libobjc.A.dylib 0x0000000189a49808 objc_exception_throw + 60 2 CoreFoundation 0x0000000189d8b100 -[NSObject(NSObject) __retain_OA] + 0 3 CoreFoundation 0x0000000189c582c0 ___forwarding___ + 1728 4 CoreFoundation 0x0000000189c57b40 _CF_forwarding_prep_0 + 96 5 libAvaloniaNative.dylib 0x00000001257b4794 -[AvnWindow canBecomeKeyWindow] + 196 6 AppKit 0x000000018ca38aec -[NSWindow _orderOutAndCalcKeyWithCounter:stillVisible:docWindow:] + 944 7 AppKit 0x000000018c81ce6c NSPerformVisuallyAtomicChange + 140 8 AppKit 0x000000018ca38628 -[NSWindow _doWindowOrderOutWithWithKeyCalc:forCounter:orderingDone:docWindow:] + 108 9 AppKit 0x000000018ca37f84 -[NSWindow _reallyDoOrderWindowOutRelativeTo:findKey:forCounter:force:isModal:] + 468 10 AppKit 0x000000018c903058 -[NSWindow _reallyDoOrderWindow:relativeTo:findKey:forCounter:force:isModal:] + 172 11 AppKit 0x000000018c902020 -[NSWindow _doOrderWindow:relativeTo:findKey:forCounter:force:isModal:] + 324 12 libAvaloniaNative.dylib 0x00000001257abcfc _ZN14WindowBaseImpl4HideEv + 104 13 libAvaloniaNative.dylib 0x00000001257abdc4 _ZTv0_n56_N14WindowBaseImpl4HideEv + 24 14 ??? 0x0000000287b1aa40 0x0 + 10866502208 15 ??? 0x0000000287b1a904 0x0 + 10866501892 16 ??? 0x0000000287b1a3c8 0x0 + 10866500552 17 ??? 0x0000000286d9c378 0x0 + 10852352888 18 ??? 0x0000000287b0de54 0x0 + 10866450004 19 ??? 0x0000000286003e3c 0x0 + 10838097468 20 ??? 0x0000000287adba78 0x0 + 10866244216 21 ??? 0x0000000285f11ab0 0x0 + 10837105328 22 ??? 0x0000000285f119d8 0x0 + 10837105112 23 ??? 0x0000000285f11904 0x0 + 10837104900 24 ??? 
0x00000002850ba6dc 0x0 + 10822067932 25 libAvaloniaNative.dylib 0x00000001257b4cc0 -[AvnWindow resignKeyWindow] + 52 26 AppKit 0x000000018c900fbc -[NSWindow _changeKeyAndMainLimitedOK:] + 1020 27 AppKit 0x000000018c9c9438 -[NSWindow(NSEventRouting) _handleMouseDownEvent:isDelayedEvent:] + 3596 28 AppKit 0x000000018c93cc78 -[NSWindow(NSEventRouting) _reallySendEvent:isDelayedEvent:] + 2444 29 AppKit 0x000000018c93c080 -[NSWindow(NSEventRouting) sendEvent:] + 348 30 libAvaloniaNative.dylib 0x00000001257b4e58 -[AvnWindow sendEvent:] + 76 31 AppKit 0x000000018c93afe4 -[NSApplication(NSEvent) sendEvent:] + 2776 32 libAvaloniaNative.dylib 0x000000012579ce5c -[AvnApplication sendEvent:] + 96 33 AppKit 0x000000018cbf3588 -[NSApplication _handleEvent:] + 76 34 AppKit 0x000000018c7bc3d8 -[NSApplication run] + 636 35 libAvaloniaNative.dylib 0x00000001257a1930 _ZN26PlatformThreadingInterface7RunLoopEP20IAvnLoopCancellation + 204 36 ??? 0x000000028559477c 0x0 + 10827155324 37 ??? 0x0000000285593644 0x0 + 10827150916 38 ??? 0x0000000285592f24 0x0 + 10827149092 39 ??? 0x0000000285592b9c 0x0 + 10827148188 40 ??? 0x00000002839aa904 0x0 + 10797885700 41 ??? 
0x0000000280dad738 0x0 + 10751760184 42 libcoreclr.dylib 0x00000001019f7c88 CallDescrWorkerInternal + 132 43 libcoreclr.dylib 0x0000000101868718 _ZN18MethodDescCallSite16CallTargetWorkerEPKmPmi + 868 44 libcoreclr.dylib 0x0000000101760294 _Z7RunMainP10MethodDescsPiPP8PtrArray + 652 45 libcoreclr.dylib 0x0000000101760580 _ZN8Assembly17ExecuteMainMethodEPP8PtrArrayi + 376 46 libcoreclr.dylib 0x000000010178e8d0 _ZN8CorHost215ExecuteAssemblyEjPKDsiPS1_Pj + 476 47 libcoreclr.dylib 0x000000010174b008 coreclr_execute_assembly + 208 48 libhostpolicy.dylib 0x000000010102de44 _Z19run_app_for_contextRK20hostpolicy_context_tiPPKc + 1056 49 libhostpolicy.dylib 0x000000010102ebb4 corehost_main + 240 50 libhostfxr.dylib 0x0000000100fb9e88 _ZN10fx_muxer_t24handle_exec_host_commandERKNSt3__112basic_stringIcNS0_11char_traitsIcEENS0_9allocatorIcEEEERK19host_startup_info_tS8_RKNS0_13unordered_mapI13known_optionsNS0_6vectorIS6_NS4_IS6_EEEE18known_options_hashNS0_8equal_toISD_EENS4_INS0_4pairIKSD_SG_EEEEEEiPPKci11host_mode_tbPciPi + 1320 51 libhostfxr.dylib 0x0000000100fb8f74 _ZN10fx_muxer_t7executeENSt3__112basic_stringIcNS0_11char_traitsIcEENS0_9allocatorIcEEEEiPPKcRK19host_startup_info_tPciPi + 856 52 libhostfxr.dylib 0x0000000100fb5ba0 hostfxr_main_startupinfo + 152 53 WalletWasabi.Fluent.Desktop 0x0000000100d25484 _Z9exe_startiPPKc + 1484 54 WalletWasabi.Fluent.Desktop 0x0000000100d256c8 main + 160 55 dyld 0x0000000100e8d0f4 start + 520 ) libc++abi: terminating with uncaught exception of type NSException ``` ### Wasabi Version Master https://github.com/zkSNACKs/WalletWasabi/commit/300c1aa9f8b92a9c344666560a1209a9dcc1b487
test
search bar not working on macbook air general description can t use any of the suggestions of the search bar if there is any if there are suggestions like logs data folder after clicking on it nothing happens but the navbar and the sidebar colour changes like it s out of focus see screenshot clicking into the search bar for the first time pops up suggestions after clicking on any suggestion a single click does nothing can t write in it no suggestions double clicking the search bar allows me to write init but no suggestions are popping up clicking on the dots do nothing nothing in the logs about these unfortunately one more thing yesterday when i tried to open the data folder somehow i managed to crash the app and the os at the same time see logs for further details couldn t repro this today so i hope it s a very rare case how to reproduce try to use the search bar on a macbook screenshots clicking on a suggestion makes the navbar and the sidebar change colour like it s not in focus img width alt screenshot at src operating system macos monterey logs yesterday when i tried to open the data folder somehow i managed to crash the app and the os at the same time walletwasabi fluent desktop terminating app due to uncaught exception nsinvalidargumentexception reason unrecognized selector sent to instance first throw call stack corefoundation exceptionpreprocess libobjc a dylib objc exception throw corefoundation corefoundation forwarding corefoundation cf forwarding prep libavalonianative dylib appkit appkit nsperformvisuallyatomicchange appkit appkit appkit appkit libavalonianative dylib libavalonianative dylib libavalonianative dylib appkit appkit appkit appkit libavalonianative dylib appkit libavalonianative dylib appkit appkit libavalonianative dylib libcoreclr dylib calldescrworkerinternal libcoreclr dylib libcoreclr dylib libcoreclr dylib libcoreclr dylib pj libcoreclr dylib coreclr execute assembly libhostpolicy dylib app for context tippkc libhostpolicy dylib 
corehost main libhostfxr dylib muxer exec host startup info options toisd sg mode tbpcipi libhostfxr dylib muxer startup info tpcipi libhostfxr dylib hostfxr main startupinfo walletwasabi fluent desktop startippkc walletwasabi fluent desktop main dyld start libc abi terminating with uncaught exception of type nsexception wasabi version master
1
3,987
2,799,896,645
IssuesEvent
2015-05-13 05:56:52
ufal/lindat-dspace
https://api.github.com/repos/ufal/lindat-dspace
closed
Typos and comments to some documentation pages
documentation enhancement lindat-specific
Some quite minor things: In page/deposit "How to Deposit" - "Only registered users can deposit items. If you are not a registered user please contact our Help Desk.” Why not give them first a pointer to the AAI help, rather than immediately referring them to the help desk? In page/citate "About Citations" - "citate" is not a verb, "cite" is :) - you don't give an example of CMDI In page/item-lifecycle "Deposited Item Lifecycle": - Resctricted <- Restricted (typo) - "LINDAT/CLARIN" <- LINDAT (in order to make the page portable) In page
1.0
Typos and comments to some documentation pages - Some quite minor things: In page/deposit "How to Deposit" - "Only registered users can deposit items. If you are not a registered user please contact our Help Desk.” Why not give them first a pointer to the AAI help, rather than immediately referring them to the help desk? In page/citate "About Citations" - "citate" is not a verb, "cite" is :) - you don't give an example of CMDI In page/item-lifecycle "Deposited Item Lifecycle": - Resctricted <- Restricted (typo) - "LINDAT/CLARIN" <- LINDAT (in order to make the page portable) In page
non_test
typos and comments to some documentation pages some quite minor things in page deposit how to deposit only registered users can deposit items if you are not a registered user please contact our help desk ” why not give them first a pointer to the aai help rather than immediately referring them to the help desk in page citate about citations citate is not a verb cite is you don t give an example of cmdi in page item lifecycle deposited item lifecycle resctricted restricted typo lindat clarin lindat in order to make the page portable in page
0
222,871
7,440,336,454
IssuesEvent
2018-03-27 09:45:58
wso2/product-is
https://api.github.com/repos/wso2/product-is
closed
No tooltips for any of the buttons in the Dashboard
Affected/5.5.0-Alpha2 Priority/High Severity/Minor Type/Improvement
The following were noted in the Consent Management UI ![36](https://user-images.githubusercontent.com/1845370/36667421-c73f2d68-1b13-11e8-852b-c8d47885d970.png) ![37](https://user-images.githubusercontent.com/1845370/36667448-e2d37c14-1b13-11e8-931d-1aab267d2387.png)
1.0
No tooltips for any of the buttons in the Dashboard - The following were noted in the Consent Management UI ![36](https://user-images.githubusercontent.com/1845370/36667421-c73f2d68-1b13-11e8-852b-c8d47885d970.png) ![37](https://user-images.githubusercontent.com/1845370/36667448-e2d37c14-1b13-11e8-931d-1aab267d2387.png)
non_test
no tooltips for any of the buttons in the dashboard the following were noted in the consent management ui
0
194,656
14,684,624,698
IssuesEvent
2021-01-01 04:04:04
github-vet/rangeloop-pointer-findings
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
closed
itsivareddy/terrafrom-Oci: oci/waas_waas_policy_test.go; 16 LoC
fresh small test
Found a possible issue in [itsivareddy/terrafrom-Oci](https://www.github.com/itsivareddy/terrafrom-Oci) at [oci/waas_waas_policy_test.go](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_waas_policy_test.go#L1070-L1085) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > reference to waasPolicyId is reassigned at line 1074 [Click here to see the code in its original context.](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_waas_policy_test.go#L1070-L1085) <details> <summary>Click here to show the 16 line(s) of Go which triggered the analyzer.</summary> ```go for _, waasPolicyId := range waasPolicyIds { if ok := SweeperDefaultResourceId[waasPolicyId]; !ok { deleteWaasPolicyRequest := oci_waas.DeleteWaasPolicyRequest{} deleteWaasPolicyRequest.WaasPolicyId = &waasPolicyId deleteWaasPolicyRequest.RequestMetadata.RetryPolicy = getRetryPolicy(true, "waas") _, error := waasClient.DeleteWaasPolicy(context.Background(), deleteWaasPolicyRequest) if error != nil { fmt.Printf("Error deleting WaasPolicy %s %s, It is possible that the resource is already deleted. Please verify manually \n", waasPolicyId, error) continue } waitTillCondition(testAccProvider, &waasPolicyId, waasPolicySweepWaitCondition, time.Duration(3*time.Minute), waasPolicySweepResponseFetchOperation, "waas", true) } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 075608a9e201ee0e32484da68d5ba5370dfde1be
1.0
itsivareddy/terrafrom-Oci: oci/waas_waas_policy_test.go; 16 LoC - Found a possible issue in [itsivareddy/terrafrom-Oci](https://www.github.com/itsivareddy/terrafrom-Oci) at [oci/waas_waas_policy_test.go](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_waas_policy_test.go#L1070-L1085) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > reference to waasPolicyId is reassigned at line 1074 [Click here to see the code in its original context.](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_waas_policy_test.go#L1070-L1085) <details> <summary>Click here to show the 16 line(s) of Go which triggered the analyzer.</summary> ```go for _, waasPolicyId := range waasPolicyIds { if ok := SweeperDefaultResourceId[waasPolicyId]; !ok { deleteWaasPolicyRequest := oci_waas.DeleteWaasPolicyRequest{} deleteWaasPolicyRequest.WaasPolicyId = &waasPolicyId deleteWaasPolicyRequest.RequestMetadata.RetryPolicy = getRetryPolicy(true, "waas") _, error := waasClient.DeleteWaasPolicy(context.Background(), deleteWaasPolicyRequest) if error != nil { fmt.Printf("Error deleting WaasPolicy %s %s, It is possible that the resource is already deleted. Please verify manually \n", waasPolicyId, error) continue } waitTillCondition(testAccProvider, &waasPolicyId, waasPolicySweepWaitCondition, time.Duration(3*time.Minute), waasPolicySweepResponseFetchOperation, "waas", true) } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 075608a9e201ee0e32484da68d5ba5370dfde1be
test
itsivareddy terrafrom oci oci waas waas policy test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message reference to waaspolicyid is reassigned at line click here to show the line s of go which triggered the analyzer go for waaspolicyid range waaspolicyids if ok sweeperdefaultresourceid ok deletewaaspolicyrequest oci waas deletewaaspolicyrequest deletewaaspolicyrequest waaspolicyid waaspolicyid deletewaaspolicyrequest requestmetadata retrypolicy getretrypolicy true waas error waasclient deletewaaspolicy context background deletewaaspolicyrequest if error nil fmt printf error deleting waaspolicy s s it is possible that the resource is already deleted please verify manually n waaspolicyid error continue waittillcondition testaccprovider waaspolicyid waaspolicysweepwaitcondition time duration time minute waaspolicysweepresponsefetchoperation waas true leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
5,808
2,796,404,713
IssuesEvent
2015-05-12 07:14:31
osakagamba/7GIQSKRNE5P3AVTZCEFXE3ON
https://api.github.com/repos/osakagamba/7GIQSKRNE5P3AVTZCEFXE3ON
closed
CSFLhlQ5cAYHhz9FdZPy3ZHJjODDtZOMtwHIOsUuS1GgK2Mll/YqcVuN5Al19sGQ4lb3pjkRqxcISSqlSNIZpi/rozVs08WEViC6dHyYngGAL42ajSW2I43lTvg27Lja72WgkaGOQgL4sBDQ1E9X9VuCShC3dHkCkf8/xjANGfE=
design
6iOZinOAqokX0WMYqoL9QBs2QfDElfYLVZJ6S7I4mqx74VsgyXiaZhU7fZQGvWgIgeooVKcobhGDd6gHGS+29xEeYh5UYUPgPSrcSOxYrIJAH/a1P4w5S6uoEbDh1k9BPiAP2J/ZXm+1McAFFo8nlTcJ0gjHghsQrz/CWnXEH/Eg9qmOX/OJX0Zdu0A7z8fq9T2f2t8ALm4budSp+06e1vyJs5OXiRVwi+w/0L+tReZ1AnraxzxskFu1XrjrFNw0HXulRqZrLkF98jxl2jicwiMLT+OcxoFWYVGGOLqkij+nLUnj9jSApKZcuvzHn692K27GB8enjihO41tOIoRhDX+fZbZYM4pYWY9Bs6tHZeQMt6eeEHBWX2DL8TNxUpy0ZtUOUDpK0IkRORyZ34ryLjXhkIxfofB8tD4r4bb0w7jT1nJeyypRdL31OpmfH1eT1IUEd254a4/cqQHpf9t9R8uU8/BVQSvUmyynLa+z//KF5extNzj2gOzVZBzwyWV9cWpTVcwP20pk5QJlZNlibcp5LJhg7kXyjNsD+bABJhlomYRmqmdDWI/2lvNHh4pzUyHMODfBEEnu5holyQ/3dZj/e1QvcsT4tPJ3nr9tlqa4fDjQQi7fG5OCKwbgdj3gexw69X01jE1x1YZYwTbpyzBr+E1rUMBkmoyMgbTmFu2S1srFDFaetk5gCLrer6RR
1.0
CSFLhlQ5cAYHhz9FdZPy3ZHJjODDtZOMtwHIOsUuS1GgK2Mll/YqcVuN5Al19sGQ4lb3pjkRqxcISSqlSNIZpi/rozVs08WEViC6dHyYngGAL42ajSW2I43lTvg27Lja72WgkaGOQgL4sBDQ1E9X9VuCShC3dHkCkf8/xjANGfE= - 6iOZinOAqokX0WMYqoL9QBs2QfDElfYLVZJ6S7I4mqx74VsgyXiaZhU7fZQGvWgIgeooVKcobhGDd6gHGS+29xEeYh5UYUPgPSrcSOxYrIJAH/a1P4w5S6uoEbDh1k9BPiAP2J/ZXm+1McAFFo8nlTcJ0gjHghsQrz/CWnXEH/Eg9qmOX/OJX0Zdu0A7z8fq9T2f2t8ALm4budSp+06e1vyJs5OXiRVwi+w/0L+tReZ1AnraxzxskFu1XrjrFNw0HXulRqZrLkF98jxl2jicwiMLT+OcxoFWYVGGOLqkij+nLUnj9jSApKZcuvzHn692K27GB8enjihO41tOIoRhDX+fZbZYM4pYWY9Bs6tHZeQMt6eeEHBWX2DL8TNxUpy0ZtUOUDpK0IkRORyZ34ryLjXhkIxfofB8tD4r4bb0w7jT1nJeyypRdL31OpmfH1eT1IUEd254a4/cqQHpf9t9R8uU8/BVQSvUmyynLa+z//KF5extNzj2gOzVZBzwyWV9cWpTVcwP20pk5QJlZNlibcp5LJhg7kXyjNsD+bABJhlomYRmqmdDWI/2lvNHh4pzUyHMODfBEEnu5holyQ/3dZj/e1QvcsT4tPJ3nr9tlqa4fDjQQi7fG5OCKwbgdj3gexw69X01jE1x1YZYwTbpyzBr+E1rUMBkmoyMgbTmFu2S1srFDFaetk5gCLrer6RR
non_test
xjangfe zxm cwnxeh w ocxofwyvggolqkij bvqsvumyynla z babjhlomyrmqmddwi
0
36,431
8,109,790,767
IssuesEvent
2018-08-14 08:48:45
TypeCobolTeam/TypeCobol
https://api.github.com/repos/TypeCobolTeam/TypeCobol
closed
Pointers redefindes are not generated in case of function parameter
Bug Codegen
If a pointer is incremented inside a function, the generated propram does not contain the pointer redefines. ```cobol declare procedure BugPointer PRIVATE input ptr pointer . data division. working-storage section. procedure division. set ptr up by 1 goback . end-declare. ``` will generate ```cobol declare procedure BugPointer PRIVATE input ptr pointer . IDENTIFICATION DIVISION. PROGRAM-ID. b233702fBugPointer. data division. working-storage section. LINKAGE SECTION. 01 ptr pointer. PROCEDURE DIVISION USINGBY REFERENCE ptr . * set ptr up by 1 ADD 1 to ptrdbef1f3d goback . END PROGRAM b233702fBugPointer. ```
1.0
Pointers redefindes are not generated in case of function parameter - If a pointer is incremented inside a function, the generated propram does not contain the pointer redefines. ```cobol declare procedure BugPointer PRIVATE input ptr pointer . data division. working-storage section. procedure division. set ptr up by 1 goback . end-declare. ``` will generate ```cobol declare procedure BugPointer PRIVATE input ptr pointer . IDENTIFICATION DIVISION. PROGRAM-ID. b233702fBugPointer. data division. working-storage section. LINKAGE SECTION. 01 ptr pointer. PROCEDURE DIVISION USINGBY REFERENCE ptr . * set ptr up by 1 ADD 1 to ptrdbef1f3d goback . END PROGRAM b233702fBugPointer. ```
non_test
pointers redefindes are not generated in case of function parameter if a pointer is incremented inside a function the generated propram does not contain the pointer redefines cobol declare procedure bugpointer private input ptr pointer data division working storage section procedure division set ptr up by goback end declare will generate cobol declare procedure bugpointer private input ptr pointer identification division program id data division working storage section linkage section ptr pointer procedure division usingby reference ptr set ptr up by add to goback end program
0
323,877
23,971,395,178
IssuesEvent
2022-09-13 08:04:08
NatLibFi/Skosmos
https://api.github.com/repos/NatLibFi/Skosmos
closed
Need for contributing guidelines document
needs documentation
## At which URL did you encounter the problem? Skosmos wiki does not have a dedicated section for contibution guidelines. ## What is the expected output? What do you see instead? Having a document describing reasonable conventions for handling issues and pull requests would help standardizing contributions (Mostly those that come outside sprints). This document could be linked in the PR template or in Skomos wiki
1.0
Need for contributing guidelines document - ## At which URL did you encounter the problem? Skosmos wiki does not have a dedicated section for contibution guidelines. ## What is the expected output? What do you see instead? Having a document describing reasonable conventions for handling issues and pull requests would help standardizing contributions (Mostly those that come outside sprints). This document could be linked in the PR template or in Skomos wiki
non_test
need for contributing guidelines document at which url did you encounter the problem skosmos wiki does not have a dedicated section for contibution guidelines what is the expected output what do you see instead having a document describing reasonable conventions for handling issues and pull requests would help standardizing contributions mostly those that come outside sprints this document could be linked in the pr template or in skomos wiki
0
276,065
23,963,721,212
IssuesEvent
2022-09-12 21:47:06
chamilo/chamilo-lms
https://api.github.com/repos/chamilo/chamilo-lms
closed
Convert user_api_key.api_service to 'default'
Requires testing/validation
### Current behavior By default, it is set to 'dokeos'. ### Expected behavior Instances of that keyword have been almost completely removed and we don't want to introduce another dependency on a name. Better use 'default'. The API keys are still used in some connections to other platforms (like the Drupal module). ### Chamilo Version 1.11.6, but we need to remember to migrate this in the process to upgrade to 2.0 Something like `UPDATE user_api_key SET api_service = 'default' WHERE api_service = 'dokeos'`. And also update the code to use 'default'. See usermanager.lib.php::get_api_keys() and main/auth/profile.php, at least.
1.0
Convert user_api_key.api_service to 'default' - ### Current behavior By default, it is set to 'dokeos'. ### Expected behavior Instances of that keyword have been almost completely removed and we don't want to introduce another dependency on a name. Better use 'default'. The API keys are still used in some connections to other platforms (like the Drupal module). ### Chamilo Version 1.11.6, but we need to remember to migrate this in the process to upgrade to 2.0 Something like `UPDATE user_api_key SET api_service = 'default' WHERE api_service = 'dokeos'`. And also update the code to use 'default'. See usermanager.lib.php::get_api_keys() and main/auth/profile.php, at least.
test
convert user api key api service to default current behavior by default it is set to dokeos expected behavior instances of that keyword have been almost completely removed and we don t want to introduce another dependency on a name better use default the api keys are still used in some connections to other platforms like the drupal module chamilo version but we need to remember to migrate this in the process to upgrade to something like update user api key set api service default where api service dokeos and also update the code to use default see usermanager lib php get api keys and main auth profile php at least
1
308,585
26,615,234,253
IssuesEvent
2023-01-24 06:12:30
elastic/e2e-testing
https://api.github.com/repos/elastic/e2e-testing
opened
Flaky Test [Initializing / End-To-End Tests / fleet_debian_10_arm64_system_integration / Adding core system/metrics Integration to a Policy – System Integration]
flaky-test ci-reported
## Flaky Test * **Test Name:** `Initializing / End-To-End Tests / fleet_debian_10_arm64_system_integration / Adding core system/metrics Integration to a Policy – System Integration` * **Artifact Link:** https://beats-ci.elastic.co/blue/organizations/jenkins/e2e-tests%2Fe2e-testing-mbp%2F7.17/detail/7.17/1602/ * **PR:** None * **Commit:** 18abb282332700aabdf58c8b2afd2ff838bf7bb6 ### Error details ``` Step "system/metrics" with "core" metrics are present in the datastreams ```
1.0
Flaky Test [Initializing / End-To-End Tests / fleet_debian_10_arm64_system_integration / Adding core system/metrics Integration to a Policy – System Integration] - ## Flaky Test * **Test Name:** `Initializing / End-To-End Tests / fleet_debian_10_arm64_system_integration / Adding core system/metrics Integration to a Policy – System Integration` * **Artifact Link:** https://beats-ci.elastic.co/blue/organizations/jenkins/e2e-tests%2Fe2e-testing-mbp%2F7.17/detail/7.17/1602/ * **PR:** None * **Commit:** 18abb282332700aabdf58c8b2afd2ff838bf7bb6 ### Error details ``` Step "system/metrics" with "core" metrics are present in the datastreams ```
test
flaky test flaky test test name initializing end to end tests fleet debian system integration adding core system metrics integration to a policy – system integration artifact link pr none commit error details step system metrics with core metrics are present in the datastreams
1
151,791
12,058,753,167
IssuesEvent
2020-04-15 18:01:15
NixOS/nixpkgs
https://api.github.com/repos/NixOS/nixpkgs
closed
nixos/tests/flannel fails
0.kind: bug 6.topic: testing
**Describe the bug** `nixos/tests/flannel` does not succeed. **To Reproduce** `nix-build nixpkgs/nixos/tests/flannel.nix` **Expected behavior** The test should pass. **Additional context** Was attempting to port the test to python, couldn't find a useful test success case with the current setup or with several variations. Not sure if flannel is just broken or the original test was bad. **Metadata** ``` - system: `"x86_64-linux"` - host os: `Linux 5.3.14, NixOS, 19.09.1471.d387c2dd552 (Loris)` - multi-user?: `yes` - sandbox: `yes` - version: `nix-env (Nix) 2.3` - channels(root): `"nixos-19.09.1471.d387c2dd552, nixos-hardware, unstable-20.03pre203904.bb1013511e1"` - nixpkgs: `/nix/var/nix/profiles/per-user/root/channels/nixos` ``` Maintainer information: ```yaml # a list of nixpkgs attributes affected by the problem attribute: # a list of nixos modules affected by the problem module: ```
1.0
nixos/tests/flannel fails - **Describe the bug** `nixos/tests/flannel` does not succeed. **To Reproduce** `nix-build nixpkgs/nixos/tests/flannel.nix` **Expected behavior** The test should pass. **Additional context** Was attempting to port the test to python, couldn't find a useful test success case with the current setup or with several variations. Not sure if flannel is just broken or the original test was bad. **Metadata** ``` - system: `"x86_64-linux"` - host os: `Linux 5.3.14, NixOS, 19.09.1471.d387c2dd552 (Loris)` - multi-user?: `yes` - sandbox: `yes` - version: `nix-env (Nix) 2.3` - channels(root): `"nixos-19.09.1471.d387c2dd552, nixos-hardware, unstable-20.03pre203904.bb1013511e1"` - nixpkgs: `/nix/var/nix/profiles/per-user/root/channels/nixos` ``` Maintainer information: ```yaml # a list of nixpkgs attributes affected by the problem attribute: # a list of nixos modules affected by the problem module: ```
test
nixos tests flannel fails describe the bug nixos tests flannel does not succeed to reproduce nix build nixpkgs nixos tests flannel nix expected behavior the test should pass additional context was attempting to port the test to python couldn t find a useful test success case with the current setup or with several variations not sure if flannel is just broken or the original test was bad metadata system linux host os linux nixos loris multi user yes sandbox yes version nix env nix channels root nixos nixos hardware unstable nixpkgs nix var nix profiles per user root channels nixos maintainer information yaml a list of nixpkgs attributes affected by the problem attribute a list of nixos modules affected by the problem module
1
83,779
15,720,724,341
IssuesEvent
2021-03-29 01:00:18
billmcchesney1/foxtrot
https://api.github.com/repos/billmcchesney1/foxtrot
opened
CVE-2021-21295 (Medium) detected in netty-codec-http-4.1.13.Final.jar
security vulnerability
## CVE-2021-21295 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.13.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p> <p>Path to dependency file: foxtrot/foxtrot-server/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.13.Final/netty-codec-http-4.1.13.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.13.Final/netty-codec-http-4.1.13.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.13.Final/netty-codec-http-4.1.13.Final.jar</p> <p> Dependency Hierarchy: - transport-6.0.1.jar (Root Library) - transport-netty4-client-6.0.1.jar - :x: **netty-codec-http-4.1.13.Final.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. In Netty (io.netty:netty-codec-http2) before version 4.1.60.Final there is a vulnerability that enables request smuggling. If a Content-Length header is present in the original HTTP/2 request, the field is not validated by `Http2MultiplexHandler` as it is propagated up. This is fine as long as the request is not proxied through as HTTP/1.1. If the request comes in as an HTTP/2 stream, gets converted into the HTTP/1.1 domain objects (`HttpRequest`, `HttpContent`, etc.) 
via `Http2StreamFrameToHttpObjectCodec `and then sent up to the child channel's pipeline and proxied through a remote peer as HTTP/1.1 this may result in request smuggling. In a proxy case, users may assume the content-length is validated somehow, which is not the case. If the request is forwarded to a backend channel that is a HTTP/1.1 connection, the Content-Length now has meaning and needs to be checked. An attacker can smuggle requests inside the body as it gets downgraded from HTTP/2 to HTTP/1.1. For an example attack refer to the linked GitHub Advisory. Users are only affected if all of this is true: `HTTP2MultiplexCodec` or `Http2FrameCodec` is used, `Http2StreamFrameToHttpObjectCodec` is used to convert to HTTP/1.1 objects, and these HTTP/1.1 objects are forwarded to another remote peer. This has been patched in 4.1.60.Final As a workaround, the user can do the validation by themselves by implementing a custom `ChannelInboundHandler` that is put in the `ChannelPipeline` behind `Http2StreamFrameToHttpObjectCodec`. <p>Publish Date: 2021-03-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21295>CVE-2021-21295</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-wm47-8v5p-wjpj">https://github.com/advisories/GHSA-wm47-8v5p-wjpj</a></p> <p>Release Date: 2021-03-09</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.60</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.13.Final","packageFilePaths":["/foxtrot-server/pom.xml","/foxtrot-sql/pom.xml","/foxtrot-core/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.client:transport:6.0.1;org.elasticsearch.plugin:transport-netty4-client:6.0.1;io.netty:netty-codec-http:4.1.13.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.60"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-21295","vulnerabilityDetails":"Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers \u0026 clients. In Netty (io.netty:netty-codec-http2) before version 4.1.60.Final there is a vulnerability that enables request smuggling. If a Content-Length header is present in the original HTTP/2 request, the field is not validated by `Http2MultiplexHandler` as it is propagated up. This is fine as long as the request is not proxied through as HTTP/1.1. If the request comes in as an HTTP/2 stream, gets converted into the HTTP/1.1 domain objects (`HttpRequest`, `HttpContent`, etc.) via `Http2StreamFrameToHttpObjectCodec `and then sent up to the child channel\u0027s pipeline and proxied through a remote peer as HTTP/1.1 this may result in request smuggling. 
In a proxy case, users may assume the content-length is validated somehow, which is not the case. If the request is forwarded to a backend channel that is a HTTP/1.1 connection, the Content-Length now has meaning and needs to be checked. An attacker can smuggle requests inside the body as it gets downgraded from HTTP/2 to HTTP/1.1. For an example attack refer to the linked GitHub Advisory. Users are only affected if all of this is true: `HTTP2MultiplexCodec` or `Http2FrameCodec` is used, `Http2StreamFrameToHttpObjectCodec` is used to convert to HTTP/1.1 objects, and these HTTP/1.1 objects are forwarded to another remote peer. This has been patched in 4.1.60.Final As a workaround, the user can do the validation by themselves by implementing a custom `ChannelInboundHandler` that is put in the `ChannelPipeline` behind `Http2StreamFrameToHttpObjectCodec`.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21295","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-21295 (Medium) detected in netty-codec-http-4.1.13.Final.jar - ## CVE-2021-21295 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.13.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p> <p>Path to dependency file: foxtrot/foxtrot-server/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.13.Final/netty-codec-http-4.1.13.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.13.Final/netty-codec-http-4.1.13.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.13.Final/netty-codec-http-4.1.13.Final.jar</p> <p> Dependency Hierarchy: - transport-6.0.1.jar (Root Library) - transport-netty4-client-6.0.1.jar - :x: **netty-codec-http-4.1.13.Final.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. In Netty (io.netty:netty-codec-http2) before version 4.1.60.Final there is a vulnerability that enables request smuggling. If a Content-Length header is present in the original HTTP/2 request, the field is not validated by `Http2MultiplexHandler` as it is propagated up. This is fine as long as the request is not proxied through as HTTP/1.1. If the request comes in as an HTTP/2 stream, gets converted into the HTTP/1.1 domain objects (`HttpRequest`, `HttpContent`, etc.) 
via `Http2StreamFrameToHttpObjectCodec `and then sent up to the child channel's pipeline and proxied through a remote peer as HTTP/1.1 this may result in request smuggling. In a proxy case, users may assume the content-length is validated somehow, which is not the case. If the request is forwarded to a backend channel that is a HTTP/1.1 connection, the Content-Length now has meaning and needs to be checked. An attacker can smuggle requests inside the body as it gets downgraded from HTTP/2 to HTTP/1.1. For an example attack refer to the linked GitHub Advisory. Users are only affected if all of this is true: `HTTP2MultiplexCodec` or `Http2FrameCodec` is used, `Http2StreamFrameToHttpObjectCodec` is used to convert to HTTP/1.1 objects, and these HTTP/1.1 objects are forwarded to another remote peer. This has been patched in 4.1.60.Final As a workaround, the user can do the validation by themselves by implementing a custom `ChannelInboundHandler` that is put in the `ChannelPipeline` behind `Http2StreamFrameToHttpObjectCodec`. <p>Publish Date: 2021-03-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21295>CVE-2021-21295</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-wm47-8v5p-wjpj">https://github.com/advisories/GHSA-wm47-8v5p-wjpj</a></p> <p>Release Date: 2021-03-09</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.60</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.13.Final","packageFilePaths":["/foxtrot-server/pom.xml","/foxtrot-sql/pom.xml","/foxtrot-core/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.client:transport:6.0.1;org.elasticsearch.plugin:transport-netty4-client:6.0.1;io.netty:netty-codec-http:4.1.13.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.60"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-21295","vulnerabilityDetails":"Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers \u0026 clients. In Netty (io.netty:netty-codec-http2) before version 4.1.60.Final there is a vulnerability that enables request smuggling. If a Content-Length header is present in the original HTTP/2 request, the field is not validated by `Http2MultiplexHandler` as it is propagated up. This is fine as long as the request is not proxied through as HTTP/1.1. If the request comes in as an HTTP/2 stream, gets converted into the HTTP/1.1 domain objects (`HttpRequest`, `HttpContent`, etc.) via `Http2StreamFrameToHttpObjectCodec `and then sent up to the child channel\u0027s pipeline and proxied through a remote peer as HTTP/1.1 this may result in request smuggling. 
In a proxy case, users may assume the content-length is validated somehow, which is not the case. If the request is forwarded to a backend channel that is a HTTP/1.1 connection, the Content-Length now has meaning and needs to be checked. An attacker can smuggle requests inside the body as it gets downgraded from HTTP/2 to HTTP/1.1. For an example attack refer to the linked GitHub Advisory. Users are only affected if all of this is true: `HTTP2MultiplexCodec` or `Http2FrameCodec` is used, `Http2StreamFrameToHttpObjectCodec` is used to convert to HTTP/1.1 objects, and these HTTP/1.1 objects are forwarded to another remote peer. This has been patched in 4.1.60.Final As a workaround, the user can do the validation by themselves by implementing a custom `ChannelInboundHandler` that is put in the `ChannelPipeline` behind `Http2StreamFrameToHttpObjectCodec`.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21295","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_test
cve medium detected in netty codec http final jar cve medium severity vulnerability vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file foxtrot foxtrot server pom xml path to vulnerable library home wss scanner repository io netty netty codec http final netty codec http final jar home wss scanner repository io netty netty codec http final netty codec http final jar home wss scanner repository io netty netty codec http final netty codec http final jar dependency hierarchy transport jar root library transport client jar x netty codec http final jar vulnerable library found in base branch master vulnerability details netty is an open source asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers clients in netty io netty netty codec before version final there is a vulnerability that enables request smuggling if a content length header is present in the original http request the field is not validated by as it is propagated up this is fine as long as the request is not proxied through as http if the request comes in as an http stream gets converted into the http domain objects httprequest httpcontent etc via and then sent up to the child channel s pipeline and proxied through a remote peer as http this may result in request smuggling in a proxy case users may assume the content length is validated somehow which is not the case if the request is forwarded to a backend channel that is a http connection the content length now has meaning and needs to be checked an attacker can smuggle requests inside the body as it gets downgraded from http to http for an example attack refer to the linked github advisory users are only affected if all of this is true or is used is used to convert to http objects and these 
http objects are forwarded to another remote peer this has been patched in final as a workaround the user can do the validation by themselves by implementing a custom channelinboundhandler that is put in the channelpipeline behind publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec http isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org elasticsearch client transport org elasticsearch plugin transport client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty codec http basebranches vulnerabilityidentifier cve vulnerabilitydetails netty is an open source asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers clients in netty io netty netty codec before version final there is a vulnerability that enables request smuggling if a content length header is present in the original http request the field is not validated by as it is propagated up this is fine as long as the request is not proxied through as http if the request comes in as an http stream gets converted into the http domain objects httprequest httpcontent etc via and then sent up to the child channel pipeline and proxied through a remote peer as http this may result in request smuggling in a proxy case users may assume the content length is validated somehow which is not the case if the request is forwarded to a backend channel that is a http connection the content length now has meaning and needs to be checked an attacker can smuggle requests inside 
the body as it gets downgraded from http to http for an example attack refer to the linked github advisory users are only affected if all of this is true or is used is used to convert to http objects and these http objects are forwarded to another remote peer this has been patched in final as a workaround the user can do the validation by themselves by implementing a custom channelinboundhandler that is put in the channelpipeline behind vulnerabilityurl
0
161,946
6,143,644,517
IssuesEvent
2017-06-27 06:33:08
molgenis/molgenis
https://api.github.com/repos/molgenis/molgenis
closed
Error editing entity with attribute of type FILE
1.21.4 2.0 3.0 4.0 bug mod:core-ui priority-next
#### Reproduce - Import [file_datatype-test.xlsx](https://github.com/molgenis/molgenis/files/591337/file_datatype-test.xlsx) - Select entity 'File' in 'Data explorer' plugin - Select '+' button and create a new row with Description 'test' and Attachment 'file_datatype-test.xlsx' - Select 'Save changes' - Select 'Edit' button - Update Description to 'test2' #### Expected - Value is updated to 'test2' #### Observed - Value is updated to 'test2' and the following error occurs: ```Error! An error occurred. Please contact the administrator. Uncaught InvalidStateError: Failed to set the 'value' property on 'HTMLInputElement': This input element accepts a filename, which may only be programmatically set to the empty string.```
1.0
Error editing entity with attribute of type FILE - #### Reproduce - Import [file_datatype-test.xlsx](https://github.com/molgenis/molgenis/files/591337/file_datatype-test.xlsx) - Select entity 'File' in 'Data explorer' plugin - Select '+' button and create a new row with Description 'test' and Attachment 'file_datatype-test.xlsx' - Select 'Save changes' - Select 'Edit' button - Update Description to 'test2' #### Expected - Value is updated to 'test2' #### Observed - Value is updated to 'test2' and the following error occurs: ```Error! An error occurred. Please contact the administrator. Uncaught InvalidStateError: Failed to set the 'value' property on 'HTMLInputElement': This input element accepts a filename, which may only be programmatically set to the empty string.```
non_test
error editing entity with attribute of type file reproduce import select entity file in data explorer plugin select button and create a new row with description test and attachment file datatype test xlsx select save changes select edit button update description to expected value is updated to observed value is updated to and the following error occurs error an error occurred please contact the administrator uncaught invalidstateerror failed to set the value property on htmlinputelement this input element accepts a filename which may only be programmatically set to the empty string
0
2,933
2,649,057,835
IssuesEvent
2015-03-14 15:10:13
aj-r/CarpoolPlanner
https://api.github.com/repos/aj-r/CarpoolPlanner
closed
Switch to Twilio for SMS messages
enhancement to-test
TextNow is just too unreliable. Use Twilio for SMS messages instead. Unfortunately there will be "Send from Twilio trial account" at the beginning of each message, unless you pay $0.0075 per text. Which may be worth considering.
1.0
Switch to Twilio for SMS messages - TextNow is just too unreliable. Use Twilio for SMS messages instead. Unfortunately there will be "Send from Twilio trial account" at the beginning of each message, unless you pay $0.0075 per text. Which may be worth considering.
test
switch to twilio for sms messages textnow is just too unreliable use twilio for sms messages instead unfortunately there will be send from twilio trial account at the beginning of each message unless you pay per text which may be worth considering
1
149,304
19,574,447,740
IssuesEvent
2022-01-04 13:57:35
barranquerox/poei-01-2022-luis
https://api.github.com/repos/barranquerox/poei-01-2022-luis
opened
CVE-2021-43797 (Medium) detected in netty-codec-http-4.1.67.Final.jar
security vulnerability
## CVE-2021-43797 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.67.Final.jar</b></p></summary> <p></p> <p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.67.Final/e282137917c67332fa9a414df89f89a93487aede/netty-codec-http-4.1.67.Final.jar</p> <p> Dependency Hierarchy: - selenium-java-4.0.0.jar (Root Library) - selenium-remote-driver-4.0.0.jar - async-http-client-2.12.3.jar - netty-handler-proxy-4.1.60.Final.jar - :x: **netty-codec-http-4.1.67.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/barranquerox/poei-01-2022-luis/commit/cc31fc254ee4fdbc689b97e05ecb6143add88c4a">cc31fc254ee4fdbc689b97e05ecb6143add88c4a</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. Netty prior to version 4.1.7.1.Final skips control chars when they are present at the beginning / end of the header name. It should instead fail fast as these are not allowed by the spec and could lead to HTTP request smuggling. Failing to do the validation might cause netty to "sanitize" header names before it forward these to another remote system when used as proxy. This remote system can't see the invalid usage anymore, and therefore does not do the validation itself. Users should upgrade to version 4.1.7.1.Final to receive a patch. 
<p>Publish Date: 2021-12-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43797>CVE-2021-43797</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="CVE-2021-43797">CVE-2021-43797</a></p> <p>Release Date: 2021-12-09</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.71.Final,io.netty:netty-all:4.1.71.Final</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-43797 (Medium) detected in netty-codec-http-4.1.67.Final.jar - ## CVE-2021-43797 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.67.Final.jar</b></p></summary> <p></p> <p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.67.Final/e282137917c67332fa9a414df89f89a93487aede/netty-codec-http-4.1.67.Final.jar</p> <p> Dependency Hierarchy: - selenium-java-4.0.0.jar (Root Library) - selenium-remote-driver-4.0.0.jar - async-http-client-2.12.3.jar - netty-handler-proxy-4.1.60.Final.jar - :x: **netty-codec-http-4.1.67.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/barranquerox/poei-01-2022-luis/commit/cc31fc254ee4fdbc689b97e05ecb6143add88c4a">cc31fc254ee4fdbc689b97e05ecb6143add88c4a</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. Netty prior to version 4.1.7.1.Final skips control chars when they are present at the beginning / end of the header name. It should instead fail fast as these are not allowed by the spec and could lead to HTTP request smuggling. Failing to do the validation might cause netty to "sanitize" header names before it forward these to another remote system when used as proxy. This remote system can't see the invalid usage anymore, and therefore does not do the validation itself. Users should upgrade to version 4.1.7.1.Final to receive a patch. 
<p>Publish Date: 2021-12-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43797>CVE-2021-43797</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="CVE-2021-43797">CVE-2021-43797</a></p> <p>Release Date: 2021-12-09</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.71.Final,io.netty:netty-all:4.1.71.Final</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in netty codec http final jar cve medium severity vulnerability vulnerable library netty codec http final jar library home page a href path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy selenium java jar root library selenium remote driver jar async http client jar netty handler proxy final jar x netty codec http final jar vulnerable library found in head commit a href found in base branch master vulnerability details netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers clients netty prior to version final skips control chars when they are present at the beginning end of the header name it should instead fail fast as these are not allowed by the spec and could lead to http request smuggling failing to do the validation might cause netty to sanitize header names before it forward these to another remote system when used as proxy this remote system can t see the invalid usage anymore and therefore does not do the validation itself users should upgrade to version final to receive a patch publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin cve release date fix resolution io netty netty codec http final io netty netty all final step up your open source security game with whitesource
0
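The CVE record above turns on Netty accepting control characters at the start or end of an HTTP header field name instead of failing fast. As a minimal sketch of the fix the description calls for (this is an illustration, not Netty's actual implementation), a strict validator restricted to RFC 7230's `tchar` grammar rejects such names outright:

```python
import re

# RFC 7230 "token" characters permitted in an HTTP header field name.
# Any other character -- including the control chars at the beginning or
# end of the name described in CVE-2021-43797 -- must cause a fail-fast
# rejection rather than silent "sanitizing".
_TOKEN = re.compile(r"[!#$%&'*+\-.^_`|~0-9A-Za-z]+")

def is_valid_header_name(name: str) -> bool:
    """Return True only if `name` is a legal HTTP header field name."""
    return _TOKEN.fullmatch(name) is not None

if __name__ == "__main__":
    print(is_valid_header_name("Content-Type"))            # legal token
    print(is_valid_header_name("\x01Transfer-Encoding"))   # leading control char
```

A proxy that validates this way forwards the request unmodified or not at all, so the downstream system never receives a "sanitized" name whose original, invalid form it cannot see.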
328,879
10,001,033,205
IssuesEvent
2019-07-12 14:43:35
wazuh/wazuh-kibana-app
https://api.github.com/repos/wazuh/wazuh-kibana-app
closed
Wazuh app doesn't open under custom spaces
bug community delayed priority/medium
As you know, 6.x introduced spaces as a way to manage collections of visualization, dashboards, etc. Selecting a custom space, and trying to open the wazuh app yields a blank screen. See screenshot below. Wazuh 3.7.2 + Kibana 6.5.4 ![image](https://user-images.githubusercontent.com/2720787/52646261-5c6e6e00-2eda-11e9-8a0c-73ff106a4987.png)
1.0
Wazuh app doesn't open under custom spaces - As you know, 6.x introduced spaces as a way to manage collections of visualization, dashboards, etc. Selecting a custom space, and trying to open the wazuh app yields a blank screen. See screenshot below. Wazuh 3.7.2 + Kibana 6.5.4 ![image](https://user-images.githubusercontent.com/2720787/52646261-5c6e6e00-2eda-11e9-8a0c-73ff106a4987.png)
non_test
wazuh app doesn t open under custom spaces as you know x introduced spaces as a way to manage collections of visualization dashboards etc selecting a custom space and trying to open the wazuh app yields a blank screen see screenshot below wazuh kibana
0
284,773
24,623,334,367
IssuesEvent
2022-10-16 07:24:07
roeszler/reabook
https://api.github.com/repos/roeszler/reabook
closed
User Story: Create User Access
feature test chore
As a **user**, I can **access the user sections of the site** so that **I can make and recall booking requests**.
1.0
User Story: Create User Access - As a **user**, I can **access the user sections of the site** so that **I can make and recall booking requests**.
test
user story create user access as a user i can access the user sections of the site so that i can make and recall booking requests
1
11,118
28,074,131,948
IssuesEvent
2023-03-29 21:33:58
jared-hughes/polygolf
https://api.github.com/repos/jared-hughes/polygolf
opened
Introduce the concept of "target types"
enhancement architecture
This is especially needed for int vs bigint distinction but can be useful for other cases like char vs length-1 string.
1.0
Introduce the concept of "target types" - This is especially needed for int vs bigint distinction but can be useful for other cases like char vs length-1 string.
non_test
introduce the concept of target types this is especially needed for int vs bigint distinction but can be useful for other cases like char vs length string
0
282,535
24,484,453,283
IssuesEvent
2022-10-09 08:38:32
infinitest/infinitest
https://api.github.com/repos/infinitest/infinitest
closed
[infinitest-intellij] Exception when adding Infinitest facet to IntelliJ 2018.1.3
type: bug comp:infinitest-intellij
Same was reported in 2017 as #228 ``` Assertion failed: Registering post-startup activity that will never be run: disposed=false; open=true; passed=true java.lang.Throwable: Assertion failed: Registering post-startup activity that will never be run: disposed=false; open=true; passed=true at com.intellij.openapi.diagnostic.Logger.assertTrue(Logger.java:163) at com.intellij.ide.startup.impl.StartupManagerImpl.registerPostStartupActivity(StartupManagerImpl.java:100) at org.infinitest.intellij.idea.window.InfinitestToolWindow.startInfinitestAfterStartup(InfinitestToolWindow.java:69) at org.infinitest.intellij.idea.window.InfinitestToolWindow.facetInitialized(InfinitestToolWindow.java:89) at org.infinitest.intellij.idea.facet.InfinitestFacet.initFacet(InfinitestFacet.java:47) at com.intellij.facet.FacetManagerImpl.commit(FacetManagerImpl.java:441) at com.intellij.facet.FacetManagerImpl.commit(FacetManagerImpl.java:381) at com.intellij.facet.impl.FacetModelImpl.commit(FacetModelImpl.java:96) at com.intellij.facet.impl.ProjectFacetsConfigurator.commitFacets(ProjectFacetsConfigurator.java:230) at com.intellij.openapi.roots.ui.configuration.ModulesConfigurator.a(ModulesConfigurator.java:319) at com.intellij.openapi.application.impl.ApplicationImpl.runWriteAction(ApplicationImpl.java:1010) at com.intellij.openapi.roots.ui.configuration.ModulesConfigurator.apply(ModulesConfigurator.java:287) at com.intellij.openapi.roots.ui.configuration.projectRoot.ModuleStructureConfigurable.apply(ModuleStructureConfigurable.java:372) at com.intellij.openapi.roots.ui.configuration.ProjectStructureConfigurable.apply(ProjectStructureConfigurable.java:332) at com.intellij.openapi.options.newEditor.ConfigurableEditor.apply(ConfigurableEditor.java:323) at com.intellij.openapi.options.newEditor.ConfigurableEditor.apply(ConfigurableEditor.java:144) at com.intellij.openapi.options.newEditor.SettingsDialog.doOKAction(SettingsDialog.java:159) at 
com.intellij.openapi.ui.DialogWrapper$OkAction.doAction(DialogWrapper.java:1868) at com.intellij.openapi.ui.DialogWrapper$DialogWrapperAction.actionPerformed(DialogWrapper.java:1828) at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022) at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2348) at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402) at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259) at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252) at java.awt.Component.processMouseEvent(Component.java:6548) at javax.swing.JComponent.processMouseEvent(JComponent.java:3325) at java.awt.Component.processEvent(Component.java:6313) at java.awt.Container.processEvent(Container.java:2237) at java.awt.Component.dispatchEventImpl(Component.java:4903) at java.awt.Container.dispatchEventImpl(Container.java:2295) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4889) at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4526) at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4467) at java.awt.Container.dispatchEventImpl(Container.java:2281) at java.awt.Window.dispatchEventImpl(Window.java:2746) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:764) at java.awt.EventQueue.access$500(EventQueue.java:98) at java.awt.EventQueue$3.run(EventQueue.java:715) at java.awt.EventQueue$3.run(EventQueue.java:709) at java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:90) at java.awt.EventQueue$4.run(EventQueue.java:737) at java.awt.EventQueue$4.run(EventQueue.java:735) at 
java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.awt.EventQueue.dispatchEvent(EventQueue.java:734) at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.java:779) at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.java:716) at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:395) at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201) at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116) at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:109) at java.awt.WaitDispatchSupport$2.run(WaitDispatchSupport.java:190) at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:235) at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:233) at java.security.AccessController.doPrivileged(Native Method) at java.awt.WaitDispatchSupport.enter(WaitDispatchSupport.java:233) at java.awt.Dialog.show(Dialog.java:1077) at com.intellij.openapi.ui.impl.DialogWrapperPeerImpl$MyDialog.show(DialogWrapperPeerImpl.java:694) at com.intellij.openapi.ui.impl.DialogWrapperPeerImpl.show(DialogWrapperPeerImpl.java:426) at com.intellij.openapi.ui.DialogWrapper.invokeShow(DialogWrapper.java:1688) at com.intellij.openapi.ui.DialogWrapper.show(DialogWrapper.java:1637) at com.intellij.openapi.options.newEditor.SettingsDialog.lambda$show$0(SettingsDialog.java:69) at com.intellij.openapi.application.TransactionGuardImpl.runSyncTransaction(TransactionGuardImpl.java:88) at com.intellij.openapi.application.TransactionGuardImpl.submitTransactionAndWait(TransactionGuardImpl.java:153) at com.intellij.openapi.options.newEditor.SettingsDialog.show(SettingsDialog.java:69) at com.intellij.openapi.ui.DialogWrapper.showAndGet(DialogWrapper.java:1652) at com.intellij.ide.actions.ShowSettingsUtilImpl.editConfigurable(ShowSettingsUtilImpl.java:241) at 
com.intellij.ide.actions.ShowSettingsUtilImpl.editConfigurable(ShowSettingsUtilImpl.java:207) at com.intellij.openapi.roots.ui.configuration.ModulesConfigurator.showDialog(ModulesConfigurator.java:532) at com.intellij.openapi.roots.ui.configuration.IdeaProjectSettingsService.openModuleSettings(IdeaProjectSettingsService.java:69) at com.intellij.ide.projectView.impl.nodes.PsiDirectoryNode.navigate(PsiDirectoryNode.java:295) at com.intellij.util.OpenSourceUtil.navigate(OpenSourceUtil.java:53) at com.intellij.ide.actions.BaseNavigateToSourceAction.actionPerformed(BaseNavigateToSourceAction.java:37) at com.intellij.openapi.actionSystem.ex.ActionUtil$1.run(ActionUtil.java:220) at com.intellij.openapi.actionSystem.ex.ActionUtil.performActionDumbAware(ActionUtil.java:237) at com.intellij.openapi.actionSystem.impl.ActionMenuItem$ActionTransmitter.lambda$actionPerformed$0(ActionMenuItem.java:301) at com.intellij.openapi.wm.impl.FocusManagerImpl.runOnOwnContext(FocusManagerImpl.java:307) at com.intellij.openapi.wm.impl.IdeFocusManagerImpl.runOnOwnContext(IdeFocusManagerImpl.java:104) at com.intellij.openapi.actionSystem.impl.ActionMenuItem$ActionTransmitter.actionPerformed(ActionMenuItem.java:291) at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022) at com.intellij.openapi.actionSystem.impl.ActionMenuItem.lambda$fireActionPerformed$0(ActionMenuItem.java:111) at com.intellij.openapi.application.TransactionGuardImpl.runSyncTransaction(TransactionGuardImpl.java:88) at com.intellij.openapi.application.TransactionGuardImpl.lambda$submitTransaction$1(TransactionGuardImpl.java:111) at com.intellij.openapi.application.TransactionGuardImpl.submitTransaction(TransactionGuardImpl.java:120) at com.intellij.openapi.application.TransactionGuard.submitTransaction(TransactionGuard.java:122) at com.intellij.openapi.actionSystem.impl.ActionMenuItem.fireActionPerformed(ActionMenuItem.java:111) at com.intellij.ui.plaf.beg.BegMenuItemUI.doClick(BegMenuItemUI.java:528) at 
com.intellij.ui.plaf.beg.BegMenuItemUI.access$300(BegMenuItemUI.java:48) at com.intellij.ui.plaf.beg.BegMenuItemUI$MyMouseInputHandler.mouseReleased(BegMenuItemUI.java:548) at java.awt.Component.processMouseEvent(Component.java:6548) at javax.swing.JComponent.processMouseEvent(JComponent.java:3325) at java.awt.Component.processEvent(Component.java:6313) at java.awt.Container.processEvent(Container.java:2237) at java.awt.Component.dispatchEventImpl(Component.java:4903) at java.awt.Container.dispatchEventImpl(Container.java:2295) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4889) at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4526) at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4467) at java.awt.Container.dispatchEventImpl(Container.java:2281) at java.awt.Window.dispatchEventImpl(Window.java:2746) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:764) at java.awt.EventQueue.access$500(EventQueue.java:98) at java.awt.EventQueue$3.run(EventQueue.java:715) at java.awt.EventQueue$3.run(EventQueue.java:709) at java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:90) at java.awt.EventQueue$4.run(EventQueue.java:737) at java.awt.EventQueue$4.run(EventQueue.java:735) at java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.awt.EventQueue.dispatchEvent(EventQueue.java:734) at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.java:779) at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.java:716) at 
com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:395) at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201) at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116) at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105) at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101) at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93) at java.awt.EventDispatchThread.run(EventDispatchThread.java:82) ```
1.0
[infinitest-intellij] Exception when adding Infinitest facet to IntelliJ 2018.1.3 - Same was reported in 2017 as #228 ``` Assertion failed: Registering post-startup activity that will never be run: disposed=false; open=true; passed=true java.lang.Throwable: Assertion failed: Registering post-startup activity that will never be run: disposed=false; open=true; passed=true at com.intellij.openapi.diagnostic.Logger.assertTrue(Logger.java:163) at com.intellij.ide.startup.impl.StartupManagerImpl.registerPostStartupActivity(StartupManagerImpl.java:100) at org.infinitest.intellij.idea.window.InfinitestToolWindow.startInfinitestAfterStartup(InfinitestToolWindow.java:69) at org.infinitest.intellij.idea.window.InfinitestToolWindow.facetInitialized(InfinitestToolWindow.java:89) at org.infinitest.intellij.idea.facet.InfinitestFacet.initFacet(InfinitestFacet.java:47) at com.intellij.facet.FacetManagerImpl.commit(FacetManagerImpl.java:441) at com.intellij.facet.FacetManagerImpl.commit(FacetManagerImpl.java:381) at com.intellij.facet.impl.FacetModelImpl.commit(FacetModelImpl.java:96) at com.intellij.facet.impl.ProjectFacetsConfigurator.commitFacets(ProjectFacetsConfigurator.java:230) at com.intellij.openapi.roots.ui.configuration.ModulesConfigurator.a(ModulesConfigurator.java:319) at com.intellij.openapi.application.impl.ApplicationImpl.runWriteAction(ApplicationImpl.java:1010) at com.intellij.openapi.roots.ui.configuration.ModulesConfigurator.apply(ModulesConfigurator.java:287) at com.intellij.openapi.roots.ui.configuration.projectRoot.ModuleStructureConfigurable.apply(ModuleStructureConfigurable.java:372) at com.intellij.openapi.roots.ui.configuration.ProjectStructureConfigurable.apply(ProjectStructureConfigurable.java:332) at com.intellij.openapi.options.newEditor.ConfigurableEditor.apply(ConfigurableEditor.java:323) at com.intellij.openapi.options.newEditor.ConfigurableEditor.apply(ConfigurableEditor.java:144) at 
com.intellij.openapi.options.newEditor.SettingsDialog.doOKAction(SettingsDialog.java:159) at com.intellij.openapi.ui.DialogWrapper$OkAction.doAction(DialogWrapper.java:1868) at com.intellij.openapi.ui.DialogWrapper$DialogWrapperAction.actionPerformed(DialogWrapper.java:1828) at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022) at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2348) at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402) at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259) at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252) at java.awt.Component.processMouseEvent(Component.java:6548) at javax.swing.JComponent.processMouseEvent(JComponent.java:3325) at java.awt.Component.processEvent(Component.java:6313) at java.awt.Container.processEvent(Container.java:2237) at java.awt.Component.dispatchEventImpl(Component.java:4903) at java.awt.Container.dispatchEventImpl(Container.java:2295) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4889) at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4526) at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4467) at java.awt.Container.dispatchEventImpl(Container.java:2281) at java.awt.Window.dispatchEventImpl(Window.java:2746) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:764) at java.awt.EventQueue.access$500(EventQueue.java:98) at java.awt.EventQueue$3.run(EventQueue.java:715) at java.awt.EventQueue$3.run(EventQueue.java:709) at java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:90) at 
java.awt.EventQueue$4.run(EventQueue.java:737) at java.awt.EventQueue$4.run(EventQueue.java:735) at java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.awt.EventQueue.dispatchEvent(EventQueue.java:734) at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.java:779) at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.java:716) at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:395) at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201) at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116) at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:109) at java.awt.WaitDispatchSupport$2.run(WaitDispatchSupport.java:190) at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:235) at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:233) at java.security.AccessController.doPrivileged(Native Method) at java.awt.WaitDispatchSupport.enter(WaitDispatchSupport.java:233) at java.awt.Dialog.show(Dialog.java:1077) at com.intellij.openapi.ui.impl.DialogWrapperPeerImpl$MyDialog.show(DialogWrapperPeerImpl.java:694) at com.intellij.openapi.ui.impl.DialogWrapperPeerImpl.show(DialogWrapperPeerImpl.java:426) at com.intellij.openapi.ui.DialogWrapper.invokeShow(DialogWrapper.java:1688) at com.intellij.openapi.ui.DialogWrapper.show(DialogWrapper.java:1637) at com.intellij.openapi.options.newEditor.SettingsDialog.lambda$show$0(SettingsDialog.java:69) at com.intellij.openapi.application.TransactionGuardImpl.runSyncTransaction(TransactionGuardImpl.java:88) at com.intellij.openapi.application.TransactionGuardImpl.submitTransactionAndWait(TransactionGuardImpl.java:153) at com.intellij.openapi.options.newEditor.SettingsDialog.show(SettingsDialog.java:69) at com.intellij.openapi.ui.DialogWrapper.showAndGet(DialogWrapper.java:1652) at 
com.intellij.ide.actions.ShowSettingsUtilImpl.editConfigurable(ShowSettingsUtilImpl.java:241) at com.intellij.ide.actions.ShowSettingsUtilImpl.editConfigurable(ShowSettingsUtilImpl.java:207) at com.intellij.openapi.roots.ui.configuration.ModulesConfigurator.showDialog(ModulesConfigurator.java:532) at com.intellij.openapi.roots.ui.configuration.IdeaProjectSettingsService.openModuleSettings(IdeaProjectSettingsService.java:69) at com.intellij.ide.projectView.impl.nodes.PsiDirectoryNode.navigate(PsiDirectoryNode.java:295) at com.intellij.util.OpenSourceUtil.navigate(OpenSourceUtil.java:53) at com.intellij.ide.actions.BaseNavigateToSourceAction.actionPerformed(BaseNavigateToSourceAction.java:37) at com.intellij.openapi.actionSystem.ex.ActionUtil$1.run(ActionUtil.java:220) at com.intellij.openapi.actionSystem.ex.ActionUtil.performActionDumbAware(ActionUtil.java:237) at com.intellij.openapi.actionSystem.impl.ActionMenuItem$ActionTransmitter.lambda$actionPerformed$0(ActionMenuItem.java:301) at com.intellij.openapi.wm.impl.FocusManagerImpl.runOnOwnContext(FocusManagerImpl.java:307) at com.intellij.openapi.wm.impl.IdeFocusManagerImpl.runOnOwnContext(IdeFocusManagerImpl.java:104) at com.intellij.openapi.actionSystem.impl.ActionMenuItem$ActionTransmitter.actionPerformed(ActionMenuItem.java:291) at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022) at com.intellij.openapi.actionSystem.impl.ActionMenuItem.lambda$fireActionPerformed$0(ActionMenuItem.java:111) at com.intellij.openapi.application.TransactionGuardImpl.runSyncTransaction(TransactionGuardImpl.java:88) at com.intellij.openapi.application.TransactionGuardImpl.lambda$submitTransaction$1(TransactionGuardImpl.java:111) at com.intellij.openapi.application.TransactionGuardImpl.submitTransaction(TransactionGuardImpl.java:120) at com.intellij.openapi.application.TransactionGuard.submitTransaction(TransactionGuard.java:122) at 
com.intellij.openapi.actionSystem.impl.ActionMenuItem.fireActionPerformed(ActionMenuItem.java:111) at com.intellij.ui.plaf.beg.BegMenuItemUI.doClick(BegMenuItemUI.java:528) at com.intellij.ui.plaf.beg.BegMenuItemUI.access$300(BegMenuItemUI.java:48) at com.intellij.ui.plaf.beg.BegMenuItemUI$MyMouseInputHandler.mouseReleased(BegMenuItemUI.java:548) at java.awt.Component.processMouseEvent(Component.java:6548) at javax.swing.JComponent.processMouseEvent(JComponent.java:3325) at java.awt.Component.processEvent(Component.java:6313) at java.awt.Container.processEvent(Container.java:2237) at java.awt.Component.dispatchEventImpl(Component.java:4903) at java.awt.Container.dispatchEventImpl(Container.java:2295) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4889) at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4526) at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4467) at java.awt.Container.dispatchEventImpl(Container.java:2281) at java.awt.Window.dispatchEventImpl(Window.java:2746) at java.awt.Component.dispatchEvent(Component.java:4725) at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:764) at java.awt.EventQueue.access$500(EventQueue.java:98) at java.awt.EventQueue$3.run(EventQueue.java:715) at java.awt.EventQueue$3.run(EventQueue.java:709) at java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:90) at java.awt.EventQueue$4.run(EventQueue.java:737) at java.awt.EventQueue$4.run(EventQueue.java:735) at java.security.AccessController.doPrivileged(Native Method) at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80) at java.awt.EventQueue.dispatchEvent(EventQueue.java:734) at 
com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.java:779) at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.java:716) at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:395) at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201) at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116) at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105) at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101) at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93) at java.awt.EventDispatchThread.run(EventDispatchThread.java:82) ```
test
exception when adding infinitest facet to intellij same was reported in as assertion failed registering post startup activity that will never be run disposed false open true passed true java lang throwable assertion failed registering post startup activity that will never be run disposed false open true passed true at com intellij openapi diagnostic logger asserttrue logger java at com intellij ide startup impl startupmanagerimpl registerpoststartupactivity startupmanagerimpl java at org infinitest intellij idea window infinitesttoolwindow startinfinitestafterstartup infinitesttoolwindow java at org infinitest intellij idea window infinitesttoolwindow facetinitialized infinitesttoolwindow java at org infinitest intellij idea facet infinitestfacet initfacet infinitestfacet java at com intellij facet facetmanagerimpl commit facetmanagerimpl java at com intellij facet facetmanagerimpl commit facetmanagerimpl java at com intellij facet impl facetmodelimpl commit facetmodelimpl java at com intellij facet impl projectfacetsconfigurator commitfacets projectfacetsconfigurator java at com intellij openapi roots ui configuration modulesconfigurator a modulesconfigurator java at com intellij openapi application impl applicationimpl runwriteaction applicationimpl java at com intellij openapi roots ui configuration modulesconfigurator apply modulesconfigurator java at com intellij openapi roots ui configuration projectroot modulestructureconfigurable apply modulestructureconfigurable java at com intellij openapi roots ui configuration projectstructureconfigurable apply projectstructureconfigurable java at com intellij openapi options neweditor configurableeditor apply configurableeditor java at com intellij openapi options neweditor configurableeditor apply configurableeditor java at com intellij openapi options neweditor settingsdialog dookaction settingsdialog java at com intellij openapi ui dialogwrapper okaction doaction dialogwrapper java at com intellij openapi ui 
dialogwrapper dialogwrapperaction actionperformed dialogwrapper java at javax swing abstractbutton fireactionperformed abstractbutton java at javax swing abstractbutton handler actionperformed abstractbutton java at javax swing defaultbuttonmodel fireactionperformed defaultbuttonmodel java at javax swing defaultbuttonmodel setpressed defaultbuttonmodel java at javax swing plaf basic basicbuttonlistener mousereleased basicbuttonlistener java at java awt component processmouseevent component java at javax swing jcomponent processmouseevent jcomponent java at java awt component processevent component java at java awt container processevent container java at java awt component dispatcheventimpl component java at java awt container dispatcheventimpl container java at java awt component dispatchevent component java at java awt lightweightdispatcher retargetmouseevent container java at java awt lightweightdispatcher processmouseevent container java at java awt lightweightdispatcher dispatchevent container java at java awt container dispatcheventimpl container java at java awt window dispatcheventimpl window java at java awt component dispatchevent component java at java awt eventqueue dispatcheventimpl eventqueue java at java awt eventqueue access eventqueue java at java awt eventqueue run eventqueue java at java awt eventqueue run eventqueue java at java security accesscontroller doprivileged native method at java security protectiondomain javasecurityaccessimpl dointersectionprivilege protectiondomain java at java security protectiondomain javasecurityaccessimpl dointersectionprivilege protectiondomain java at java awt eventqueue run eventqueue java at java awt eventqueue run eventqueue java at java security accesscontroller doprivileged native method at java security protectiondomain javasecurityaccessimpl dointersectionprivilege protectiondomain java at java awt eventqueue dispatchevent eventqueue java at com intellij ide ideeventqueue defaultdispatchevent 
ideeventqueue java at com intellij ide ideeventqueue dispatchevent ideeventqueue java at com intellij ide ideeventqueue dispatchevent ideeventqueue java at java awt eventdispatchthread pumponeeventforfilters eventdispatchthread java at java awt eventdispatchthread pumpeventsforfilter eventdispatchthread java at java awt eventdispatchthread pumpeventsforfilter eventdispatchthread java at java awt waitdispatchsupport run waitdispatchsupport java at java awt waitdispatchsupport run waitdispatchsupport java at java awt waitdispatchsupport run waitdispatchsupport java at java security accesscontroller doprivileged native method at java awt waitdispatchsupport enter waitdispatchsupport java at java awt dialog show dialog java at com intellij openapi ui impl dialogwrapperpeerimpl mydialog show dialogwrapperpeerimpl java at com intellij openapi ui impl dialogwrapperpeerimpl show dialogwrapperpeerimpl java at com intellij openapi ui dialogwrapper invokeshow dialogwrapper java at com intellij openapi ui dialogwrapper show dialogwrapper java at com intellij openapi options neweditor settingsdialog lambda show settingsdialog java at com intellij openapi application transactionguardimpl runsynctransaction transactionguardimpl java at com intellij openapi application transactionguardimpl submittransactionandwait transactionguardimpl java at com intellij openapi options neweditor settingsdialog show settingsdialog java at com intellij openapi ui dialogwrapper showandget dialogwrapper java at com intellij ide actions showsettingsutilimpl editconfigurable showsettingsutilimpl java at com intellij ide actions showsettingsutilimpl editconfigurable showsettingsutilimpl java at com intellij openapi roots ui configuration modulesconfigurator showdialog modulesconfigurator java at com intellij openapi roots ui configuration ideaprojectsettingsservice openmodulesettings ideaprojectsettingsservice java at com intellij ide projectview impl nodes psidirectorynode navigate psidirectorynode 
java at com intellij util opensourceutil navigate opensourceutil java at com intellij ide actions basenavigatetosourceaction actionperformed basenavigatetosourceaction java at com intellij openapi actionsystem ex actionutil run actionutil java at com intellij openapi actionsystem ex actionutil performactiondumbaware actionutil java at com intellij openapi actionsystem impl actionmenuitem actiontransmitter lambda actionperformed actionmenuitem java at com intellij openapi wm impl focusmanagerimpl runonowncontext focusmanagerimpl java at com intellij openapi wm impl idefocusmanagerimpl runonowncontext idefocusmanagerimpl java at com intellij openapi actionsystem impl actionmenuitem actiontransmitter actionperformed actionmenuitem java at javax swing abstractbutton fireactionperformed abstractbutton java at com intellij openapi actionsystem impl actionmenuitem lambda fireactionperformed actionmenuitem java at com intellij openapi application transactionguardimpl runsynctransaction transactionguardimpl java at com intellij openapi application transactionguardimpl lambda submittransaction transactionguardimpl java at com intellij openapi application transactionguardimpl submittransaction transactionguardimpl java at com intellij openapi application transactionguard submittransaction transactionguard java at com intellij openapi actionsystem impl actionmenuitem fireactionperformed actionmenuitem java at com intellij ui plaf beg begmenuitemui doclick begmenuitemui java at com intellij ui plaf beg begmenuitemui access begmenuitemui java at com intellij ui plaf beg begmenuitemui mymouseinputhandler mousereleased begmenuitemui java at java awt component processmouseevent component java at javax swing jcomponent processmouseevent jcomponent java at java awt component processevent component java at java awt container processevent container java at java awt component dispatcheventimpl component java at java awt container dispatcheventimpl container java at java awt component 
dispatchevent component java at java awt lightweightdispatcher retargetmouseevent container java at java awt lightweightdispatcher processmouseevent container java at java awt lightweightdispatcher dispatchevent container java at java awt container dispatcheventimpl container java at java awt window dispatcheventimpl window java at java awt component dispatchevent component java at java awt eventqueue dispatcheventimpl eventqueue java at java awt eventqueue access eventqueue java at java awt eventqueue run eventqueue java at java awt eventqueue run eventqueue java at java security accesscontroller doprivileged native method at java security protectiondomain javasecurityaccessimpl dointersectionprivilege protectiondomain java at java security protectiondomain javasecurityaccessimpl dointersectionprivilege protectiondomain java at java awt eventqueue run eventqueue java at java awt eventqueue run eventqueue java at java security accesscontroller doprivileged native method at java security protectiondomain javasecurityaccessimpl dointersectionprivilege protectiondomain java at java awt eventqueue dispatchevent eventqueue java at com intellij ide ideeventqueue defaultdispatchevent ideeventqueue java at com intellij ide ideeventqueue dispatchevent ideeventqueue java at com intellij ide ideeventqueue dispatchevent ideeventqueue java at java awt eventdispatchthread pumponeeventforfilters eventdispatchthread java at java awt eventdispatchthread pumpeventsforfilter eventdispatchthread java at java awt eventdispatchthread pumpeventsforhierarchy eventdispatchthread java at java awt eventdispatchthread pumpevents eventdispatchthread java at java awt eventdispatchthread pumpevents eventdispatchthread java at java awt eventdispatchthread run eventdispatchthread java
1
615,578
19,268,666,378
IssuesEvent
2021-12-10 01:06:28
yukiHaga/regex-hunting
https://api.github.com/repos/yukiHaga/regex-hunting
closed
Create the login modal
Priority: high
## Overview Create the login modal. ## Tasks - [x] Read the React form article (https://weseek.co.jp/tech/1238/#React_Hook_Form). We will probably use the library introduced in that article. - [x] Read the Uber Eats article on how to implement the modal. - [ ] Implement the modal, then implement the form partway through. ## Acceptance criteria - A modal like the image below has been created. <img src="https://i.gyazo.com/90ccff80853ae20ab4596cdc9b806542.png" width="300"> ## On hold The following are on hold for now. - Confirm that you can actually log in with the created modal. - You can log in from the login modal. - After logging in, you are redirected to My Page. ## Reference articles - [Implementing forms smartly in React](https://weseek.co.jp/tech/1238/#React_Hook_Form) - [React Hook Form](https://react-hook-form.com/)
1.0
Create the login modal - ## Overview Create the login modal. ## Tasks - [x] Read the React form article (https://weseek.co.jp/tech/1238/#React_Hook_Form). We will probably use the library introduced in that article. - [x] Read the Uber Eats article on how to implement the modal. - [ ] Implement the modal, then implement the form partway through. ## Acceptance criteria - A modal like the image below has been created. <img src="https://i.gyazo.com/90ccff80853ae20ab4596cdc9b806542.png" width="300"> ## On hold The following are on hold for now. - Confirm that you can actually log in with the created modal. - You can log in from the login modal. - After logging in, you are redirected to My Page. ## Reference articles - [Implementing forms smartly in React](https://weseek.co.jp/tech/1238/#React_Hook_Form) - [React Hook Form](https://react-hook-form.com/)
non_test
create the login modal overview create the login modal tasks read the react form article we will probably use the library introduced in that article read the uber eats article on how to implement the modal implement the modal then implement the form partway through acceptance criteria a modal like the image below has been created on hold the following are on hold for now confirm that you can actually log in with the created modal you can log in from the login modal after logging in you are redirected to my page reference articles
0
218,898
7,332,774,213
IssuesEvent
2018-03-05 17:15:42
NCEAS/metacat
https://api.github.com/repos/NCEAS/metacat
closed
The metacat configuration showed that it was done even though the dataONE configuration hadn't been touched
Component: Bugzilla-Id Priority: Normal Status: Resolved Tracker: Bug
--- Author Name: **Jing Tao** (Jing Tao) Original Redmine Issue: 6154, https://projects.ecoinformatics.org/ecoinfo/issues/6154 Original Date: 2013-10-16 Original Assignee: Jing Tao --- It seems the configuration ignores the dataONE part.
1.0
The metacat configuration showed that it was done even though the dataONE configuration hadn't been touched - --- Author Name: **Jing Tao** (Jing Tao) Original Redmine Issue: 6154, https://projects.ecoinformatics.org/ecoinfo/issues/6154 Original Date: 2013-10-16 Original Assignee: Jing Tao --- It seems the configuration ignores the dataONE part.
non_test
the metacat configuration showed that it was done even though the dataone configuration hadn t been touched author name jing tao jing tao original redmine issue original date original assignee jing tao it seems the configuration ignores the dataone part
0
443,055
30,872,157,003
IssuesEvent
2023-08-03 12:07:44
surveyjs/survey-library
https://api.github.com/repos/surveyjs/survey-library
opened
Expressions - New Date functions are not listed in documentation
enhancement documentation
Please list the new date functions ([Expressions: Built-in support for date functions](https://surveyjs.io/stay-updated/release-notes/v1.9.101#expressions-built-in-support-for-date-functions)) within the [Built-in Functions](https://surveyjs.io/form-library/documentation/design-survey/conditional-logic#built-in-functions) section in our docs.
1.0
Expressions - New Date functions are not listed in documentation - Please list the new date functions ([Expressions: Built-in support for date functions](https://surveyjs.io/stay-updated/release-notes/v1.9.101#expressions-built-in-support-for-date-functions)) within the [Built-in Functions](https://surveyjs.io/form-library/documentation/design-survey/conditional-logic#built-in-functions) section in our docs.
non_test
expressions new date functions are not listed in documentation please list the new date functions within the section in our docs
0
237,781
19,674,833,719
IssuesEvent
2022-01-11 11:12:09
WordPress/gutenberg
https://api.github.com/repos/WordPress/gutenberg
closed
Inline global styles are now printed in the footer (Gutenberg Trunk and WP5.9 Beta4)
Needs Testing
### Description Until Gutenberg 12.1.0, `<style id="global-styles-inline-css"></style>` was printed in the header. Now in Gutenberg and WP5.9 Beta 4, the style is printed in the footer. Since https://github.com/WordPress/gutenberg/pull/37335, WordPress core handles the style enqueuing and prints it in the footer, making it harder to override the CSS. Is it on purpose? ### Step-by-step reproduction instructions 1. Use Gutenberg 12.2.0 and the Twenty Twenty-Two theme. 2. Inspect the HTML code of a front page. 3. Observe that `<style id="global-styles-inline-css"></style>` is printed at the end of the document just before the closing `</body>` tag. ### Screenshots, screen recording, code snippet _No response_ ### Environment info WordPress 5.9 Beta 4, Gutenberg 12.2.0, Twenty Twenty-Two theme. ### Please confirm that you have searched existing issues in the repo. Yes ### Please confirm that you have tested with all plugins deactivated except Gutenberg. Yes
1.0
Inline global styles are now printed in the footer (Gutenberg Trunk and WP5.9 Beta4) - ### Description Until Gutenberg 12.1.0, `<style id="global-styles-inline-css"></style>` was printed in the header. Now in Gutenberg and WP5.9 Beta 4, the style is printed in the footer. Since https://github.com/WordPress/gutenberg/pull/37335, WordPress core handles the style enqueuing and prints it in the footer, making it harder to override the CSS. Is it on purpose? ### Step-by-step reproduction instructions 1. Use Gutenberg 12.2.0 and the Twenty Twenty-Two theme. 2. Inspect the HTML code of a front page. 3. Observe that `<style id="global-styles-inline-css"></style>` is printed at the end of the document just before the closing `</body>` tag. ### Screenshots, screen recording, code snippet _No response_ ### Environment info WordPress 5.9 Beta 4, Gutenberg 12.2.0, Twenty Twenty-Two theme. ### Please confirm that you have searched existing issues in the repo. Yes ### Please confirm that you have tested with all plugins deactivated except Gutenberg. Yes
test
inline global styles are now printed in the footer gutenberg trunk and description until gutenberg was printed in the header now in gutenberg and beta the style is printed in the footer since wordpress core handles the style enqueuing and prints it in the footer making it harder to override the css is it on purpose step by step reproduction instructions use gutenberg and the twenty twenty two theme inspect the html code of a front page observe that is printed at the end of the document just before the closing tag screenshots screen recording code snippet no response environment info wordpress beta gutenberg twenty twenty two theme please confirm that you have searched existing issues in the repo yes please confirm that you have tested with all plugins deactivated except gutenberg yes
1
180,148
13,923,011,559
IssuesEvent
2020-10-21 13:57:02
WoWManiaUK/Redemption
https://api.github.com/repos/WoWManiaUK/Redemption
closed
[Boss/Dungeon] Ley-Guardian Eregos' Planar Anomalies (The Oculus)
Fix - Tester Confirmed
**Links:** https://youtu.be/TZC-NyHZ8gU?t=125 https://youtu.be/4GigIXEA4KE?t=66 https://www.wow-mania.com/armory/?npc=30879 **What is Happening:** The [Planar Anomalies](https://www.wow-mania.com/armory/?npc=30879) are currently affected by gravity and pitifully fall to the ground rather quickly, rendering them fairly useless. **What Should happen:** > Planar Shift: At 60% and 20% hp, the boss will become transparent and immune to damage for 18 seconds. Planar Anomaly sparks will appear and follow each player. Fly far away from them! They will explode before the boss reappears. This also causes an aggro wipe. If there are whelps out during this phase, it is possible to target and kill them while you're flying away, assuming your group decides to fly clockwise or counterclockwise. This seems to be on Heroic Mode only. Planar Anomaly: Casts Planar Blast if within range of a player, dealing 42,750 damage to all players within 20 yards. At 60% and 20%, the boss phases correctly and 1 [Planar Anomaly](https://www.wow-mania.com/armory/?npc=30879) spawns for each player and chases their respective targets. Players have to fly away from them to avoid their deadly explosions.
1.0
[Boss/Dungeon] Ley-Guardian Eregos' Planar Anomalies (The Oculus) - **Links:** https://youtu.be/TZC-NyHZ8gU?t=125 https://youtu.be/4GigIXEA4KE?t=66 https://www.wow-mania.com/armory/?npc=30879 **What is Happening:** The [Planar Anomalies](https://www.wow-mania.com/armory/?npc=30879) are currently affected by gravity and pitifully fall to the ground rather quickly, rendering them fairly useless. **What Should happen:** > Planar Shift: At 60% and 20% hp, the boss will become transparent and immune to damage for 18 seconds. Planar Anomaly sparks will appear and follow each player. Fly far away from them! They will explode before the boss reappears. This also causes an aggro wipe. If there are whelps out during this phase, it is possible to target and kill them while you're flying away, assuming your group decides to fly clockwise or counterclockwise. This seems to be on Heroic Mode only. Planar Anomaly: Casts Planar Blast if within range of a player, dealing 42,750 damage to all players within 20 yards. At 60% and 20%, the boss phases correctly and 1 [Planar Anomaly](https://www.wow-mania.com/armory/?npc=30879) spawns for each player and chases their respective targets. Players have to fly away from them to avoid their deadly explosions.
test
ley guardian eregos planar anomalies the oculus links what is happening the are currently affected by gravity and pitifully fall to the ground rather quickly rendering them fairly useless what should happen planar shift at and hp the boss will become transparent and immune to damage for seconds planar anomaly sparks will appear and follow each player fly far away from them they will explode before the boss reappears this also causes an aggro wipe if there are whelps out during this phase it is possible to target and kill them while you re flying away assuming your group decides to fly clockwise or counterclockwise this seems to be on heroic mode only planar anomaly casts planar blast if within range of a player dealing damage to all players within yards at and the boss phases correctly and spawns for each player and chases their respective targets players have to fly away from them to avoid their deadly explosions
1
311,137
26,770,650,911
IssuesEvent
2023-01-31 13:54:41
nrwl/nx
https://api.github.com/repos/nrwl/nx
closed
Jest tests failed after NX migration "update-jest-config-extensions"
type: bug scope: testing tools
### Current Behavior ``` Jest: Failed to parse the TypeScript config file /var/teamcity/agent/work/511c4cac333fab38/libs/***/jest.config.ts [18:05:11] TypeError: registerer.enabled is not a function ``` ### Expected Behavior Before migration tests are passed without any errors. ### GitHub Repo _No response_ ### Steps to Reproduce 1. Run migration "update-jest-config-extensions" ### Nx Report ```shell Node : 16.14.2 OS : darwin x64 npm : 8.5.0 nx : 14.0.5 @nrwl/angular : 14.0.5 @nrwl/cypress : 14.0.5 @nrwl/detox : Not Found @nrwl/devkit : 14.0.5 @nrwl/eslint-plugin-nx : 14.0.5 @nrwl/express : Not Found @nrwl/jest : 14.0.5 @nrwl/js : Not Found @nrwl/linter : 14.0.5 @nrwl/nest : Not Found @nrwl/next : Not Found @nrwl/node : Not Found @nrwl/nx-cloud : Not Found @nrwl/nx-plugin : Not Found @nrwl/react : Not Found @nrwl/react-native : Not Found @nrwl/schematics : Not Found @nrwl/storybook : 14.0.5 @nrwl/web : Not Found @nrwl/workspace : 14.0.5 typescript : 4.6.4 rxjs : 7.4.0 --------------------------------------- Community plugins: @nguniversal/builders: 13.1.1 ``` ### Failure Logs _No response_ ### Additional Information https://stackoverflow.com/questions/68960179/jest-config-ts-registerer-enabled-is-not-a-function-error-when-running-jest-f
1.0
Jest tests failed after NX migration "update-jest-config-extensions" - ### Current Behavior ``` Jest: Failed to parse the TypeScript config file /var/teamcity/agent/work/511c4cac333fab38/libs/***/jest.config.ts [18:05:11] TypeError: registerer.enabled is not a function ``` ### Expected Behavior Before migration tests are passed without any errors. ### GitHub Repo _No response_ ### Steps to Reproduce 1. Run migration "update-jest-config-extensions" ### Nx Report ```shell Node : 16.14.2 OS : darwin x64 npm : 8.5.0 nx : 14.0.5 @nrwl/angular : 14.0.5 @nrwl/cypress : 14.0.5 @nrwl/detox : Not Found @nrwl/devkit : 14.0.5 @nrwl/eslint-plugin-nx : 14.0.5 @nrwl/express : Not Found @nrwl/jest : 14.0.5 @nrwl/js : Not Found @nrwl/linter : 14.0.5 @nrwl/nest : Not Found @nrwl/next : Not Found @nrwl/node : Not Found @nrwl/nx-cloud : Not Found @nrwl/nx-plugin : Not Found @nrwl/react : Not Found @nrwl/react-native : Not Found @nrwl/schematics : Not Found @nrwl/storybook : 14.0.5 @nrwl/web : Not Found @nrwl/workspace : 14.0.5 typescript : 4.6.4 rxjs : 7.4.0 --------------------------------------- Community plugins: @nguniversal/builders: 13.1.1 ``` ### Failure Logs _No response_ ### Additional Information https://stackoverflow.com/questions/68960179/jest-config-ts-registerer-enabled-is-not-a-function-error-when-running-jest-f
test
jest tests failed after nx migration update jest config extensions current behavior jest failed to parse the typescript config file var teamcity agent work libs jest config ts typeerror registerer enabled is not a function expected behavior before migration tests are passed without any errors github repo no response steps to reproduce run migration update jest config extensions nx report shell node os darwin npm nx nrwl angular nrwl cypress nrwl detox not found nrwl devkit nrwl eslint plugin nx nrwl express not found nrwl jest nrwl js not found nrwl linter nrwl nest not found nrwl next not found nrwl node not found nrwl nx cloud not found nrwl nx plugin not found nrwl react not found nrwl react native not found nrwl schematics not found nrwl storybook nrwl web not found nrwl workspace typescript rxjs community plugins nguniversal builders failure logs no response additional information
1
273,933
20,820,967,523
IssuesEvent
2022-03-18 15:19:48
numpy/numpy
https://api.github.com/repos/numpy/numpy
opened
DOC: Misspelling "numpy.squeeze"
04 - Documentation
### Issue with current documentation: In documentation of "numpy.squeeze" (https://numpy.org/devdocs/reference/generated/numpy.squeeze.html#numpy.squeeze), paragraph 1, "axis" was misspelled as "axes". I am sure there is no "axes" in numpy? ### Idea or request for content: _No response_
1.0
DOC: Misspelling "numpy.squeeze" - ### Issue with current documentation: In documentation of "numpy.squeeze" (https://numpy.org/devdocs/reference/generated/numpy.squeeze.html#numpy.squeeze), paragraph 1, "axis" was misspelled as "axes". I am sure there is no "axes" in numpy? ### Idea or request for content: _No response_
non_test
doc misspelling numpy squeeze issue with current documentation in documentation of numpy squeeze paragraph axis was misspelled as axes i am sure there is no axes in numpy idea or request for content no response
0
193,622
15,382,616,422
IssuesEvent
2021-03-03 01:00:27
distributeaid/distributeaid.org
https://api.github.com/repos/distributeaid/distributeaid.org
opened
Setup & Document Deployment
documentation enhancement
Built & hosted by Netlify. - [ ] Use WebHooks to link it up w/ Contentful and trigger a build when new content is published. - [ ] Document this setup & other deployment / production related notes in the README.
1.0
Setup & Document Deployment - Built & hosted by Netlify. - [ ] Use WebHooks to link it up w/ Contentful and trigger a build when new content is published. - [ ] Document this setup & other deployment / production related notes in the README.
non_test
setup document deployment built hosted by netlify use webhooks to link it up w contentful and trigger a build when new content is published document this setup other deployment production related notes in the readme
0
154,951
19,765,604,901
IssuesEvent
2022-01-17 01:33:30
tuanducdesign/reactjs-mern
https://api.github.com/repos/tuanducdesign/reactjs-mern
opened
WS-2021-0153 (High) detected in ejs-2.7.4.tgz
security vulnerability
## WS-2021-0153 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ejs-2.7.4.tgz</b></p></summary> <p>Embedded JavaScript templates</p> <p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-2.7.4.tgz">https://registry.npmjs.org/ejs/-/ejs-2.7.4.tgz</a></p> <p>Path to dependency file: /client/package.json</p> <p>Path to vulnerable library: /client/node_modules/ejs/package.json</p> <p> Dependency Hierarchy: - react-scripts-4.0.3.tgz (Root Library) - workbox-webpack-plugin-5.1.4.tgz - workbox-build-5.1.4.tgz - rollup-plugin-off-main-thread-1.4.2.tgz - :x: **ejs-2.7.4.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/tuanducdesign/reactjs-mern/commit/f007551a72d4d0cf443bff46b7c08c4977b36c10">f007551a72d4d0cf443bff46b7c08c4977b36c10</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Arbitrary Code Injection vulnerability was found in ejs before 3.1.6. Caused by filename which isn't sanitized for display. <p>Publish Date: 2021-01-22 <p>URL: <a href=https://github.com/mde/ejs/commit/abaee2be937236b1b8da9a1f55096c17dda905fd>WS-2021-0153</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/mde/ejs/issues/571">https://github.com/mde/ejs/issues/571</a></p> <p>Release Date: 2021-01-22</p> <p>Fix Resolution: ejs - 3.1.6</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2021-0153 (High) detected in ejs-2.7.4.tgz - ## WS-2021-0153 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ejs-2.7.4.tgz</b></p></summary> <p>Embedded JavaScript templates</p> <p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-2.7.4.tgz">https://registry.npmjs.org/ejs/-/ejs-2.7.4.tgz</a></p> <p>Path to dependency file: /client/package.json</p> <p>Path to vulnerable library: /client/node_modules/ejs/package.json</p> <p> Dependency Hierarchy: - react-scripts-4.0.3.tgz (Root Library) - workbox-webpack-plugin-5.1.4.tgz - workbox-build-5.1.4.tgz - rollup-plugin-off-main-thread-1.4.2.tgz - :x: **ejs-2.7.4.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/tuanducdesign/reactjs-mern/commit/f007551a72d4d0cf443bff46b7c08c4977b36c10">f007551a72d4d0cf443bff46b7c08c4977b36c10</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Arbitrary Code Injection vulnerability was found in ejs before 3.1.6. Caused by filename which isn't sanitized for display. 
<p>Publish Date: 2021-01-22 <p>URL: <a href=https://github.com/mde/ejs/commit/abaee2be937236b1b8da9a1f55096c17dda905fd>WS-2021-0153</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/mde/ejs/issues/571">https://github.com/mde/ejs/issues/571</a></p> <p>Release Date: 2021-01-22</p> <p>Fix Resolution: ejs - 3.1.6</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
ws high detected in ejs tgz ws high severity vulnerability vulnerable library ejs tgz embedded javascript templates library home page a href path to dependency file client package json path to vulnerable library client node modules ejs package json dependency hierarchy react scripts tgz root library workbox webpack plugin tgz workbox build tgz rollup plugin off main thread tgz x ejs tgz vulnerable library found in head commit a href found in base branch master vulnerability details arbitrary code injection vulnerability was found in ejs before caused by filename which isn t sanitized for display publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ejs step up your open source security game with whitesource
0
124,046
10,292,489,499
IssuesEvent
2019-08-27 14:33:18
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
teamcity: failed test: _fk-skip_direct=false
C-test-failure O-robot
The following tests appear to have failed on master (testrace): _fk-skip_direct=false You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+_fk-skip_direct=false). [#1451701](https://teamcity.cockroachdb.com/viewLog.html?buildId=1451701): ``` _fk-skip_direct=false ...cksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.153789 194 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.154256 194 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.201307 22611 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.211644 105 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.212107 105 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
I190823 18:30:48.374082 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/86/1 - /Table/88 that contains live data I190823 18:30:48.374340 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/88/1 - /Table/90 that contains live data I190823 18:30:48.374557 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/94/1 - /Table/96 that contains live data I190823 18:30:48.374725 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/96/1 - /Table/98 that contains live data I190823 18:30:48.374912 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/102/1 - /Table/104 that contains live data I190823 18:30:48.375086 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/104/1 - /Table/106 that contains live data I190823 18:30:48.375224 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/106/1 - /Table/109 that contains live data I190823 18:30:48.381440 22623 storage/replica_command.go:598 [n1,merge,s1,r82/1:/Table/1{09-11}] initiating a merge of r84:/Table/11{1-3} [(n1,s1):1, next=2, gen=34] into this range (lhs+rhs has (size=0 B+0 B qps=0.00+0.00 --> 0.00qps) below threshold (size=0 B, qps=0.00)) I190823 18:30:48.398833 22327 storage/replica_command.go:284 [n1,split,s1,r110/1:/Table/13{5/1-6/1}] initiating a split of this range at key /Table/136 [r112] (zone config) I190823 18:30:48.468829 22732 storage/replica_command.go:284 [n1,split,s1,r107/1:/Table/13{3/1-5/1}] initiating a split of this range at key /Table/135 [r113] (zone config) I190823 18:30:48.533514 169 storage/store.go:2593 [n1,s1,r82/1:/Table/1{09-11}] removing replica r84/1 I190823 18:30:48.866825 769 sql/sqlbase/structured.go:1511 
[n1,client=127.0.0.1:49446,user=root] publish: descID=136 (cities) version=3 mtime=2019-08-23 18:30:48.630896705 +0000 UTC I190823 18:30:48.945511 769 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:49446,user=root] publish: descID=135 (weather) version=3 mtime=2019-08-23 18:30:48.630896705 +0000 UTC I190823 18:30:49.081018 769 sql/event_log.go:130 [n1,client=127.0.0.1:49446,user=root] Event: "drop_database", target: 134, info: {DatabaseName:d41 Statement:DROP DATABASE d41 User:root DroppedSchemaObjects:[d41.public.cities d41.public.weather]} I190823 18:30:49.186134 769 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:49446,user=root,scExec] publish: descID=136 (cities) version=4 mtime=2019-08-23 18:30:49.18480121 +0000 UTC I190823 18:30:49.402274 769 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:49446,user=root,scExec] publish: descID=135 (weather) version=4 mtime=2019-08-23 18:30:49.399720022 +0000 UTC ``` Please assign, take a look and update the issue accordingly.
1.0
teamcity: failed test: _fk-skip_direct=false - The following tests appear to have failed on master (testrace): _fk-skip_direct=false You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+_fk-skip_direct=false). [#1451701](https://teamcity.cockroachdb.com/viewLog.html?buildId=1451701): ``` _fk-skip_direct=false ...cksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.153789 194 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.154256 194 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.201307 22611 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.211644 105 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. W190823 18:30:48.212107 105 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
I190823 18:30:48.374082 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/86/1 - /Table/88 that contains live data I190823 18:30:48.374340 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/88/1 - /Table/90 that contains live data I190823 18:30:48.374557 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/94/1 - /Table/96 that contains live data I190823 18:30:48.374725 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/96/1 - /Table/98 that contains live data I190823 18:30:48.374912 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/102/1 - /Table/104 that contains live data I190823 18:30:48.375086 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/104/1 - /Table/106 that contains live data I190823 18:30:48.375224 210 storage/compactor/compactor.go:325 [n1,s1,compactor] purging suggested compaction for range /Table/106/1 - /Table/109 that contains live data I190823 18:30:48.381440 22623 storage/replica_command.go:598 [n1,merge,s1,r82/1:/Table/1{09-11}] initiating a merge of r84:/Table/11{1-3} [(n1,s1):1, next=2, gen=34] into this range (lhs+rhs has (size=0 B+0 B qps=0.00+0.00 --> 0.00qps) below threshold (size=0 B, qps=0.00)) I190823 18:30:48.398833 22327 storage/replica_command.go:284 [n1,split,s1,r110/1:/Table/13{5/1-6/1}] initiating a split of this range at key /Table/136 [r112] (zone config) I190823 18:30:48.468829 22732 storage/replica_command.go:284 [n1,split,s1,r107/1:/Table/13{3/1-5/1}] initiating a split of this range at key /Table/135 [r113] (zone config) I190823 18:30:48.533514 169 storage/store.go:2593 [n1,s1,r82/1:/Table/1{09-11}] removing replica r84/1 I190823 18:30:48.866825 769 sql/sqlbase/structured.go:1511 
[n1,client=127.0.0.1:49446,user=root] publish: descID=136 (cities) version=3 mtime=2019-08-23 18:30:48.630896705 +0000 UTC I190823 18:30:48.945511 769 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:49446,user=root] publish: descID=135 (weather) version=3 mtime=2019-08-23 18:30:48.630896705 +0000 UTC I190823 18:30:49.081018 769 sql/event_log.go:130 [n1,client=127.0.0.1:49446,user=root] Event: "drop_database", target: 134, info: {DatabaseName:d41 Statement:DROP DATABASE d41 User:root DroppedSchemaObjects:[d41.public.cities d41.public.weather]} I190823 18:30:49.186134 769 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:49446,user=root,scExec] publish: descID=136 (cities) version=4 mtime=2019-08-23 18:30:49.18480121 +0000 UTC I190823 18:30:49.402274 769 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:49446,user=root,scExec] publish: descID=135 (weather) version=4 mtime=2019-08-23 18:30:49.399720022 +0000 UTC ``` Please assign, take a look and update the issue accordingly.
test
teamcity failed test fk skip direct false the following tests appear to have failed on master testrace fk skip direct false you may want to check fk skip direct false cksdb more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage replica command go initiating a merge of table into this range lhs rhs has size b b qps below threshold size b qps storage replica command go initiating a split of this range at key table zone config storage replica command go initiating a split of this range at key table zone config storage store go removing replica sql sqlbase structured go publish descid cities version mtime utc sql 
sqlbase structured go publish descid weather version mtime utc sql event log go event drop database target info databasename statement drop database user root droppedschemaobjects sql sqlbase structured go publish descid cities version mtime utc sql sqlbase structured go publish descid weather version mtime utc please assign take a look and update the issue accordingly
1
331,560
29,042,434,101
IssuesEvent
2023-05-13 05:29:29
Cookie-AutoDelete/Cookie-AutoDelete
https://api.github.com/repos/Cookie-AutoDelete/Cookie-AutoDelete
closed
[Bug] Windows Ding on popup
browserbug/limitation untested bug/issue Firefox
### Acknowledgements - [X] I acknowledge that I have read the above items ### Describe the bug I made a mistake, inadvertently, by engaging updates on Windows 11 (cumulative 22h2 KB5026372), .and. on an extension (either CAD to 3.8.2 or containers ....), using firefox (also recently updated to 113) I am often in a browser, opening and closing windows, for work, and so the notification pops up regularly for deleted cookies and etc. But, until the updates I just did, they slid in quietly, and stacked (usually, 2 per tab/domain). Silence was a welcome treat, for the CAD notices, but I still want to get dinged on the irregular notifications. But, now .... no more silence. Am I missing something? I remember last week not noticing the CAD notices (other than the visual out of the corner of my eye). Is there a way to stop the ding, for CAD, but not for all notifications ? Another hazy memory ... i seem to remember that the CAD notice popped up on the monitor where I had FF open, or even within the browser window. Has all this changed? From FF ? CAD ? Win11 ? ### To Reproduce . ### Expected Behavior Silent notifications ### Screenshots _No response_ ### System Info - Operating System (OS) OS: x86-64 Windows (22621.1702) ### System Info - Browser Info Firefox 113 (20230504192738) ### System Info - CookieAutoDelete Version 3.8.2 ### Additional Context Acknowledgements I acknowledge that I have read the above items Describe the bug I made a mistake, inadvertently, by engaging updates on Windows 11 (cumulative 22h2 KB5026372), .and. on an extension (either CAD to 3.8.2 or containers ....), using firefox (also recently updated to 113) I am often in a browser, opening and closing windows, for work, and so the notification pops up regularly for deleted cookies and etc. But, until the updates I just did, they slid in quietly, and stacked (usually, 2 per tab/domain). Silence was a welcome treat, for the CAD notices, but I still want to get dinged on the irregular notifications. 
But, now .... no more silence. Am I missing something? I remember last week not noticing the CAD notices (other than the visual out of the corner of my eye). Is there a way to stop the ding, for CAD, but not for all notifications ? Another hazy memory ... i seem to remember that the CAD notice popped up on the monitor where I had FF open, or even within the browser window. Has all this changed? From FF ? CAD ? Win11 ? To Reproduce . Expected Behavior . Screenshots No response System Info - Operating System (OS) OS: x86-64 Windows (22621.1702) System Info - Browser Info Firefox 113 (20230504192738) System Info - CookieAutoDelete Version 3.8.2 Additional Context activeMode: true delayBeforeClean: 2 discardedCleanup: true domainChangeCleanup: false enableGreyListCleanup: true cleanCookiesFromOpenTabsOnStartup: false cleanExpiredCookies: true siteDataEmptyOnEnable: true cacheCleanup: true indexedDBCleanup: true localStorageCleanup: true pluginDataCleanup: false serviceWorkersCleanup: false contextualIdentities: true contextualIdentitiesAutoRemove: true statLogging: true showNumOfCookiesInIcon: true keepDefaultIcon: false showNotificationAfterCleanup: true manualNotifications: true notificationOnScreen: 3 enableNewVersionPopup: true sizePopup: 14 sizeSetting: 14 contextMenus: true debugMode: false
1.0
[Bug] Windows Ding on popup - ### Acknowledgements - [X] I acknowledge that I have read the above items ### Describe the bug I made a mistake, inadvertently, by engaging updates on Windows 11 (cumulative 22h2 KB5026372), .and. on an extension (either CAD to 3.8.2 or containers ....), using firefox (also recently updated to 113) I am often in a browser, opening and closing windows, for work, and so the notification pops up regularly for deleted cookies and etc. But, until the updates I just did, they slid in quietly, and stacked (usually, 2 per tab/domain). Silence was a welcome treat, for the CAD notices, but I still want to get dinged on the irregular notifications. But, now .... no more silence. Am I missing something? I remember last week not noticing the CAD notices (other than the visual out of the corner of my eye). Is there a way to stop the ding, for CAD, but not for all notifications ? Another hazy memory ... i seem to remember that the CAD notice popped up on the monitor where I had FF open, or even within the browser window. Has all this changed? From FF ? CAD ? Win11 ? ### To Reproduce . ### Expected Behavior Silent notifications ### Screenshots _No response_ ### System Info - Operating System (OS) OS: x86-64 Windows (22621.1702) ### System Info - Browser Info Firefox 113 (20230504192738) ### System Info - CookieAutoDelete Version 3.8.2 ### Additional Context Acknowledgements I acknowledge that I have read the above items Describe the bug I made a mistake, inadvertently, by engaging updates on Windows 11 (cumulative 22h2 KB5026372), .and. on an extension (either CAD to 3.8.2 or containers ....), using firefox (also recently updated to 113) I am often in a browser, opening and closing windows, for work, and so the notification pops up regularly for deleted cookies and etc. But, until the updates I just did, they slid in quietly, and stacked (usually, 2 per tab/domain). 
Silence was a welcome treat, for the CAD notices, but I still want to get dinged on the irregular notifications. But, now .... no more silence. Am I missing something? I remember last week not noticing the CAD notices (other than the visual out of the corner of my eye). Is there a way to stop the ding, for CAD, but not for all notifications ? Another hazy memory ... i seem to remember that the CAD notice popped up on the monitor where I had FF open, or even within the browser window. Has all this changed? From FF ? CAD ? Win11 ? To Reproduce . Expected Behavior . Screenshots No response System Info - Operating System (OS) OS: x86-64 Windows (22621.1702) System Info - Browser Info Firefox 113 (20230504192738) System Info - CookieAutoDelete Version 3.8.2 Additional Context activeMode: true delayBeforeClean: 2 discardedCleanup: true domainChangeCleanup: false enableGreyListCleanup: true cleanCookiesFromOpenTabsOnStartup: false cleanExpiredCookies: true siteDataEmptyOnEnable: true cacheCleanup: true indexedDBCleanup: true localStorageCleanup: true pluginDataCleanup: false serviceWorkersCleanup: false contextualIdentities: true contextualIdentitiesAutoRemove: true statLogging: true showNumOfCookiesInIcon: true keepDefaultIcon: false showNotificationAfterCleanup: true manualNotifications: true notificationOnScreen: 3 enableNewVersionPopup: true sizePopup: 14 sizeSetting: 14 contextMenus: true debugMode: false
test
windows ding on popup acknowledgements i acknowledge that i have read the above items describe the bug i made a mistake inadvertently by engaging updates on windows cumulative and on an extension either cad to or containers using firefox also recently updated to i am often in a browser opening and closing windows for work and so the notification pops up regularly for deleted cookies and etc but until the updates i just did they slid in quietly and stacked usually per tab domain silence was a welcome treat for the cad notices but i still want to get dinged on the irregular notifications but now no more silence am i missing something i remember last week not noticing the cad notices other than the visual out of the corner of my eye is there a way to stop the ding for cad but not for all notifications another hazy memory i seem to remember that the cad notice popped up on the monitor where i had ff open or even within the browser window has all this changed from ff cad to reproduce expected behavior silent notifications screenshots no response system info operating system os os windows system info browser info firefox system info cookieautodelete version additional context acknowledgements i acknowledge that i have read the above items describe the bug i made a mistake inadvertently by engaging updates on windows cumulative and on an extension either cad to or containers using firefox also recently updated to i am often in a browser opening and closing windows for work and so the notification pops up regularly for deleted cookies and etc but until the updates i just did they slid in quietly and stacked usually per tab domain silence was a welcome treat for the cad notices but i still want to get dinged on the irregular notifications but now no more silence am i missing something i remember last week not noticing the cad notices other than the visual out of the corner of my eye is there a way to stop the ding for cad but not for all notifications another hazy memory i 
seem to remember that the cad notice popped up on the monitor where i had ff open or even within the browser window has all this changed from ff cad to reproduce expected behavior screenshots no response system info operating system os os windows system info browser info firefox system info cookieautodelete version additional context activemode true delaybeforeclean discardedcleanup true domainchangecleanup false enablegreylistcleanup true cleancookiesfromopentabsonstartup false cleanexpiredcookies true sitedataemptyonenable true cachecleanup true indexeddbcleanup true localstoragecleanup true plugindatacleanup false serviceworkerscleanup false contextualidentities true contextualidentitiesautoremove true statlogging true shownumofcookiesinicon true keepdefaulticon false shownotificationaftercleanup true manualnotifications true notificationonscreen enablenewversionpopup true sizepopup sizesetting contextmenus true debugmode false
1
122,980
10,242,820,305
IssuesEvent
2019-08-20 06:30:06
ballerina-platform/ballerina-lang
https://api.github.com/repos/ballerina-platform/ballerina-lang
closed
No way to provide scopes and claims in jwt outbound
Area/StandardLibs BetaTesting Component/JWT Type/Bug
**Description:** There is no way of providing scopes or claims in jwt outbound configs **Steps to reproduce:** **Affected Versions:** **OS, DB, other environment details and versions:** **Related Issues (optional):** <!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> **Suggested Labels (optional):** <!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels--> **Suggested Assignees (optional):** <!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
1.0
No way to provide scopes and claims in jwt outbound - **Description:** There is no way of providing scopes or claims in jwt outbound configs **Steps to reproduce:** **Affected Versions:** **OS, DB, other environment details and versions:** **Related Issues (optional):** <!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> **Suggested Labels (optional):** <!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels--> **Suggested Assignees (optional):** <!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
test
no way to provide scopes and claims in jwt outbound description there is no way of providing scopes or claims in jwt outbound configs steps to reproduce affected versions os db other environment details and versions related issues optional suggested labels optional suggested assignees optional
1
214,472
24,077,713,090
IssuesEvent
2022-09-19 01:03:09
DavidSpek/pipelines
https://api.github.com/repos/DavidSpek/pipelines
opened
CVE-2022-35968 (Medium) detected in tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl
security vulnerability
## CVE-2022-35968 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary> <p>TensorFlow is an open source machine learning framework for everyone.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/ec/98/f968caf5f65759e78873b900cbf0ae20b1699fb11268ecc0f892186419a7/tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ec/98/f968caf5f65759e78873b900cbf0ae20b1699fb11268ecc0f892186419a7/tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl</a></p> <p>Path to dependency file: /contrib/components/openvino/ovms-deployer/containers/requirements.txt</p> <p>Path to vulnerable library: /contrib/components/openvino/ovms-deployer/containers/requirements.txt,/samples/core/ai_platform/training</p> <p> Dependency Hierarchy: - :x: **tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/DavidSpek/pipelines/commit/6f7433f006e282c4f25441e7502b80d73751e38f">6f7433f006e282c4f25441e7502b80d73751e38f</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> TensorFlow is an open source platform for machine learning. The implementation of `AvgPoolGrad` does not fully validate the input `orig_input_shape`. This results in a `CHECK` failure which can be used to trigger a denial of service attack. We have patched the issue in GitHub commit 3a6ac52664c6c095aa2b114e742b0aa17fdce78f. The fix will be included in TensorFlow 2.10.0. We will also cherrypick this commit on TensorFlow 2.9.1, TensorFlow 2.8.1, and TensorFlow 2.7.2, as these are also affected and still in supported range. 
There are no known workarounds for this issue. <p>Publish Date: 2022-09-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35968>CVE-2022-35968</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-2475-53vw-vp25">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-2475-53vw-vp25</a></p> <p>Release Date: 2022-09-16</p> <p>Fix Resolution: tensorflow - 2.7.2,2.8.1,2.9.1,2.10.0, tensorflow-cpu - 2.7.2,2.8.1,2.9.1,2.10.0, tensorflow-gpu - 2.7.2,2.8.1,2.9.1,2.10.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-35968 (Medium) detected in tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl - ## CVE-2022-35968 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary> <p>TensorFlow is an open source machine learning framework for everyone.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/ec/98/f968caf5f65759e78873b900cbf0ae20b1699fb11268ecc0f892186419a7/tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ec/98/f968caf5f65759e78873b900cbf0ae20b1699fb11268ecc0f892186419a7/tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl</a></p> <p>Path to dependency file: /contrib/components/openvino/ovms-deployer/containers/requirements.txt</p> <p>Path to vulnerable library: /contrib/components/openvino/ovms-deployer/containers/requirements.txt,/samples/core/ai_platform/training</p> <p> Dependency Hierarchy: - :x: **tensorflow-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/DavidSpek/pipelines/commit/6f7433f006e282c4f25441e7502b80d73751e38f">6f7433f006e282c4f25441e7502b80d73751e38f</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> TensorFlow is an open source platform for machine learning. The implementation of `AvgPoolGrad` does not fully validate the input `orig_input_shape`. This results in a `CHECK` failure which can be used to trigger a denial of service attack. We have patched the issue in GitHub commit 3a6ac52664c6c095aa2b114e742b0aa17fdce78f. The fix will be included in TensorFlow 2.10.0. 
We will also cherrypick this commit on TensorFlow 2.9.1, TensorFlow 2.8.1, and TensorFlow 2.7.2, as these are also affected and still in supported range. There are no known workarounds for this issue. <p>Publish Date: 2022-09-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35968>CVE-2022-35968</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-2475-53vw-vp25">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-2475-53vw-vp25</a></p> <p>Release Date: 2022-09-16</p> <p>Fix Resolution: tensorflow - 2.7.2,2.8.1,2.9.1,2.10.0, tensorflow-cpu - 2.7.2,2.8.1,2.9.1,2.10.0, tensorflow-gpu - 2.7.2,2.8.1,2.9.1,2.10.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in tensorflow whl cve medium severity vulnerability vulnerable library tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file contrib components openvino ovms deployer containers requirements txt path to vulnerable library contrib components openvino ovms deployer containers requirements txt samples core ai platform training dependency hierarchy x tensorflow whl vulnerable library found in head commit a href found in base branch master vulnerability details tensorflow is an open source platform for machine learning the implementation of avgpoolgrad does not fully validate the input orig input shape this results in a check failure which can be used to trigger a denial of service attack we have patched the issue in github commit the fix will be included in tensorflow we will also cherrypick this commit on tensorflow tensorflow and tensorflow as these are also affected and still in supported range there are no known workarounds for this issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tensorflow tensorflow cpu tensorflow gpu step up your open source security game with mend
0
5,956
8,780,741,483
IssuesEvent
2018-12-19 18:14:48
knative/serving
https://api.github.com/repos/knative/serving
closed
Have a backup repository for our released artifacts
area/test-and-release kind/process
<!-- Pro-tip: You can leave this block commented, and it still works! Select the appropriate areas for your issue: /area test-and-release Classify what kind of issue this is: /kind process /assign @jonjohnsonjr --> ## Expected Behavior We have a back up location for our Knative Serving artifacts. ## Actual Behavior We don't.
1.0
Have a backup repository for our released artifacts - <!-- Pro-tip: You can leave this block commented, and it still works! Select the appropriate areas for your issue: /area test-and-release Classify what kind of issue this is: /kind process /assign @jonjohnsonjr --> ## Expected Behavior We have a back up location for our Knative Serving artifacts. ## Actual Behavior We don't.
non_test
have a backup repository for our released artifacts pro tip you can leave this block commented and it still works select the appropriate areas for your issue area test and release classify what kind of issue this is kind process assign jonjohnsonjr expected behavior we have a back up location for our knative serving artifacts actual behavior we don t
0
187,685
14,429,064,166
IssuesEvent
2020-12-06 12:43:16
kalexmills/github-vet-tests-dec2020
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
closed
fibercrypto/fibercryptowallet: vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go; 15 LoC
fresh small test vendored
Found a possible issue in [fibercrypto/fibercryptowallet](https://www.github.com/fibercrypto/fibercryptowallet) at [vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go](https://github.com/fibercrypto/fibercryptowallet/blob/fb9e9d3455a254b9202d24ab8af689fbb6db083d/vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go#L1540-L1554) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to b at line 1541 may start a goroutine [Click here to see the code in its original context.](https://github.com/fibercrypto/fibercryptowallet/blob/fb9e9d3455a254b9202d24ab8af689fbb6db083d/vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go#L1540-L1554) <details> <summary>Click here to show the 15 line(s) of Go which triggered the analyzer.</summary> ```go for idx, b := range blocks.Blocks { assertVerboseBlockFee(t, &b) if prevBlock != nil { require.Equal(t, prevBlock.Head.Hash, b.Head.PreviousHash) } bHash, err := c.BlockByHashVerbose(b.Head.Hash) require.Equal(t, uint64(idx)+start, b.Head.BkSeq) require.NoError(t, err) require.NotNil(t, bHash) require.Equal(t, b, *bHash) prevBlock = &blocks.Blocks[idx] } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: fb9e9d3455a254b9202d24ab8af689fbb6db083d
1.0
fibercrypto/fibercryptowallet: vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go; 15 LoC - Found a possible issue in [fibercrypto/fibercryptowallet](https://www.github.com/fibercrypto/fibercryptowallet) at [vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go](https://github.com/fibercrypto/fibercryptowallet/blob/fb9e9d3455a254b9202d24ab8af689fbb6db083d/vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go#L1540-L1554) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to b at line 1541 may start a goroutine [Click here to see the code in its original context.](https://github.com/fibercrypto/fibercryptowallet/blob/fb9e9d3455a254b9202d24ab8af689fbb6db083d/vendor/github.com/SkycoinProject/skycoin/src/api/integration/integration_test.go#L1540-L1554) <details> <summary>Click here to show the 15 line(s) of Go which triggered the analyzer.</summary> ```go for idx, b := range blocks.Blocks { assertVerboseBlockFee(t, &b) if prevBlock != nil { require.Equal(t, prevBlock.Head.Hash, b.Head.PreviousHash) } bHash, err := c.BlockByHashVerbose(b.Head.Hash) require.Equal(t, uint64(idx)+start, b.Head.BkSeq) require.NoError(t, err) require.NotNil(t, bHash) require.Equal(t, b, *bHash) prevBlock = &blocks.Blocks[idx] } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: fb9e9d3455a254b9202d24ab8af689fbb6db083d
test
fibercrypto fibercryptowallet vendor github com skycoinproject skycoin src api integration integration test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to b at line may start a goroutine click here to show the line s of go which triggered the analyzer go for idx b range blocks blocks assertverboseblockfee t b if prevblock nil require equal t prevblock head hash b head previoushash bhash err c blockbyhashverbose b head hash require equal t idx start b head bkseq require noerror t err require notnil t bhash require equal t b bhash prevblock blocks blocks leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
285,436
8,758,943,820
IssuesEvent
2018-12-15 10:38:21
Qiskit/qiskit-terra
https://api.github.com/repos/Qiskit/qiskit-terra
closed
ctrl-c (or equiv Interrupt in a Jupyter notebook) does not stop the local_qasm_simulator
priority: low type: bug
<!-- ⚠️ If you do not respect this template, your issue will be closed --> <!-- ⚠️ Make sure to browse the opened and closed issues --> ### Information - **Qiskit Terra version**: latest master - **Python version**: 3.7 - **Operating system**: OSX ### What is the current behavior? Doing ctrl-c, or equivalently kernel->interrupt in a Jupyter notebook does not stop the `local_qasm_simulator`. Instead, you must restart the whole Python interpreter to kill the process. ### Steps to reproduce the problem ### What is the expected behavior? It should be possible to ctrl-c, or equiv kernel->interrupt, a `local_qasm_simulator`, and have the underlying process terminate. ### Suggested solutions
1.0
ctrl-c (or equiv Interrupt in a Jupyter notebook) does not stop the local_qasm_simulator - <!-- ⚠️ If you do not respect this template, your issue will be closed --> <!-- ⚠️ Make sure to browse the opened and closed issues --> ### Information - **Qiskit Terra version**: latest master - **Python version**: 3.7 - **Operating system**: OSX ### What is the current behavior? Doing ctrl-c, or equivalently kernel->interrupt in a Jupyter notebook does not stop the `local_qasm_simulator`. Instead, you must restart the whole Python interpreter to kill the process. ### Steps to reproduce the problem ### What is the expected behavior? It should be possible to ctrl-c, or equiv kernel->interrupt, a `local_qasm_simulator`, and have the underlying process terminate. ### Suggested solutions
non_test
ctrl c or equiv interrupt in a jupyter notebook does not stop the local qasm simulator information qiskit terra version latest master python version operating system osx what is the current behavior doing ctrl c or equivalently kernel interrupt in a jupyter notebook does not stop the local qasm simulator instead you must restart the whole python interpreter to kill the process steps to reproduce the problem what is the expected behavior it should be possible to ctrl c or equiv kernel interrupt a local qasm simulator and have the underlying process terminate suggested solutions
0
7,455
6,038,834,237
IssuesEvent
2017-06-09 22:48:03
coala/coala
https://api.github.com/repos/coala/coala
opened
Nextgen Core: Port function to class methods
area/core difficulty/medium type/performance
This could be a performance gain as we don't have to pass always the same variables around. Instead we can access them from `self`.
True
Nextgen Core: Port function to class methods - This could be a performance gain as we don't have to pass always the same variables around. Instead we can access them from `self`.
non_test
nextgen core port function to class methods this could be a performance gain as we don t have to pass always the same variables around instead we can access them from self
0
161,993
12,603,163,183
IssuesEvent
2020-06-11 13:02:47
Chiniga/pa-queue-ph
https://api.github.com/repos/Chiniga/pa-queue-ph
closed
Sorting should be alphabetical and case insensitive.
P2 enhancement ready for testing
This should apply to Queue, Players & Court list.
1.0
Sorting should be alphabetical and case insensitive. - This should apply to Queue, Players & Court list.
test
sorting should be alphabetical and case insensitive this should apply to queue players court list
1
136,172
11,042,479,409
IssuesEvent
2019-12-09 09:15:15
input-output-hk/chain-libs
https://api.github.com/repos/input-output-hk/chain-libs
opened
[Rewards] max limit parameter has no effect on fixed tax
test
When using **max_limit** parameter along with **fixed tax**, reward mechanism does not take **max_limit** parameter value into account. Example: Given stake pool registration: ``` StakePool { ..., rewards: TaxType { fixed: Value(100), ratio: Ratio { numerator: 0, denominator: 1 }, max_limit: Some(30) }, .... } ``` Actual reward: **100** Expected reward: **30**
1.0
[Rewards] max limit parameter has no effect on fixed tax - When using **max_limit** parameter along with **fixed tax**, reward mechanism does not take **max_limit** parameter value into account. Example: Given stake pool registration: ``` StakePool { ..., rewards: TaxType { fixed: Value(100), ratio: Ratio { numerator: 0, denominator: 1 }, max_limit: Some(30) }, .... } ``` Actual reward: **100** Expected reward: **30**
test
max limit parameter has no effect on fixed tax when using max limit parameter along with fixed tax reward mechanism does not take max limit parameter value into account example given stake pool registration stakepool rewards taxtype fixed value ratio ratio numerator denominator max limit some actual reward expected reward
1
338,090
30,279,697,593
IssuesEvent
2023-07-08 00:46:43
microsoft/AzureStorageExplorer
https://api.github.com/repos/microsoft/AzureStorageExplorer
closed
There is no error on 'Clone and Rehydrate' dialog when typing the source blob container name to 'Destination blob container' if typing a valid blob container name previously
:heavy_check_mark: merged 🧪 testing :gear: blobs :beetle: regression
**Storage Explorer Version**: 1.31.0-dev **Build Number**: 20230625.1 **Branch**: main **Platform/OS**: Windows 10/Linux Ubuntu 20.04/MacOS Ventura 13.4 (Apple M1 Pro) **Architecture**: x64/arm64 **How Found**: ad-hoc testing **Regression From**: Previous release (1.29.2) ## Steps to Reproduce ## 1. Expand one storage account -> Blob Containers. 2. Create two blob containers named 'bc01, bc02'-> Upload one blob to blob container 'bc01'. 3. Change the access of the blob to 'Archive'. 4. Right click the blob -> Click 'Clone and Rehydrate...'. 5. Type 'bc02' to 'Destination blob container' field -> The error for 'Destination blob name' field the disappears. 6. Delete 'bc02' -> Type 'bc01' to 'Destination blob container' field. 7. Check whether the error for 'Destination blob name' field appears. ## Expected Experience ## The error for 'Destination blob name' field appears. ![image](https://github.com/microsoft/AzureStorageExplorer/assets/41351993/d2e393d1-fcd1-43be-801c-0a9d554976a6) ## Actual Experience ## The error for 'Destination blob name' field doesn't appear. ![image](https://github.com/microsoft/AzureStorageExplorer/assets/41351993/01b46249-2ec2-4b89-b6fe-7a1b6bc2f582) ## Additional Context ## 1. This issue also reproduces when copying one archived blob. 2. This issue can't be reproduced consistently.
1.0
There is no error on 'Clone and Rehydrate' dialog when typing the source blob container name to 'Destination blob container' if typing a valid blob container name previously - **Storage Explorer Version**: 1.31.0-dev **Build Number**: 20230625.1 **Branch**: main **Platform/OS**: Windows 10/Linux Ubuntu 20.04/MacOS Ventura 13.4 (Apple M1 Pro) **Architecture**: x64/arm64 **How Found**: ad-hoc testing **Regression From**: Previous release (1.29.2) ## Steps to Reproduce ## 1. Expand one storage account -> Blob Containers. 2. Create two blob containers named 'bc01, bc02'-> Upload one blob to blob container 'bc01'. 3. Change the access of the blob to 'Archive'. 4. Right click the blob -> Click 'Clone and Rehydrate...'. 5. Type 'bc02' to 'Destination blob container' field -> The error for 'Destination blob name' field the disappears. 6. Delete 'bc02' -> Type 'bc01' to 'Destination blob container' field. 7. Check whether the error for 'Destination blob name' field appears. ## Expected Experience ## The error for 'Destination blob name' field appears. ![image](https://github.com/microsoft/AzureStorageExplorer/assets/41351993/d2e393d1-fcd1-43be-801c-0a9d554976a6) ## Actual Experience ## The error for 'Destination blob name' field doesn't appear. ![image](https://github.com/microsoft/AzureStorageExplorer/assets/41351993/01b46249-2ec2-4b89-b6fe-7a1b6bc2f582) ## Additional Context ## 1. This issue also reproduces when copying one archived blob. 2. This issue can't be reproduced consistently.
test
there is no error on clone and rehydrate dialog when typing the source blob container name to destination blob container if typing a valid blob container name previously storage explorer version dev build number branch main platform os windows linux ubuntu macos ventura apple pro architecture how found ad hoc testing regression from previous release steps to reproduce expand one storage account blob containers create two blob containers named upload one blob to blob container change the access of the blob to archive right click the blob click clone and rehydrate type to destination blob container field the error for destination blob name field the disappears delete type to destination blob container field check whether the error for destination blob name field appears expected experience the error for destination blob name field appears actual experience the error for destination blob name field doesn t appear additional context this issue also reproduces when copying one archived blob this issue can t be reproduced consistently
1
295,806
25,506,993,428
IssuesEvent
2022-11-28 10:11:36
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
pkg/sql/logictest/tests/cockroach-go-testserver-22.1-22.2/cockroach-go-testserver-22_1-22_2_test: TestLogic_testserver_test_22_1_22_2 failed
C-test-failure O-robot branch-master T-sql-queries
pkg/sql/logictest/tests/cockroach-go-testserver-22.1-22.2/cockroach-go-testserver-22_1-22_2_test.TestLogic_testserver_test_22_1_22_2 [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SqlLogicTestHighVModuleNightlyBazel/7720729?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SqlLogicTestHighVModuleNightlyBazel/7720729?buildTab=artifacts#/) on master @ [1a6e9f885baa124d5ff2996adb966ea15a1a9b2b](https://github.com/cockroachdb/cockroach/commits/1a6e9f885baa124d5ff2996adb966ea15a1a9b2b): ``` === RUN TestLogic_testserver_test_22_1_22_2 test_log_scope.go:161: test logs captured to: /artifacts/tmp/_tmp/a0cd2ed7268f1ad03abb8f1d57f5fcc9/logTestLogic_testserver_test_22_1_22_21972940462 test_log_scope.go:79: use -show-logs to present logs inline logic.go:1824: :0: error while processing logic.go:1824: Get "https://binaries.cockroachdb.com/cockroach-v22.1.6.linux-amd64.tgz?ci=true": dial tcp: lookup binaries.cockroachdb.com on 169.254.169.254:53: no such host panic.go:522: -- test log scope end -- test logs left over in: /artifacts/tmp/_tmp/a0cd2ed7268f1ad03abb8f1d57f5fcc9/logTestLogic_testserver_test_22_1_22_21972940462 --- FAIL: TestLogic_testserver_test_22_1_22_2 (0.04s) ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/sql-queries <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestLogic_testserver_test_22_1_22_2.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-21853
1.0
pkg/sql/logictest/tests/cockroach-go-testserver-22.1-22.2/cockroach-go-testserver-22_1-22_2_test: TestLogic_testserver_test_22_1_22_2 failed - pkg/sql/logictest/tests/cockroach-go-testserver-22.1-22.2/cockroach-go-testserver-22_1-22_2_test.TestLogic_testserver_test_22_1_22_2 [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SqlLogicTestHighVModuleNightlyBazel/7720729?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SqlLogicTestHighVModuleNightlyBazel/7720729?buildTab=artifacts#/) on master @ [1a6e9f885baa124d5ff2996adb966ea15a1a9b2b](https://github.com/cockroachdb/cockroach/commits/1a6e9f885baa124d5ff2996adb966ea15a1a9b2b): ``` === RUN TestLogic_testserver_test_22_1_22_2 test_log_scope.go:161: test logs captured to: /artifacts/tmp/_tmp/a0cd2ed7268f1ad03abb8f1d57f5fcc9/logTestLogic_testserver_test_22_1_22_21972940462 test_log_scope.go:79: use -show-logs to present logs inline logic.go:1824: :0: error while processing logic.go:1824: Get "https://binaries.cockroachdb.com/cockroach-v22.1.6.linux-amd64.tgz?ci=true": dial tcp: lookup binaries.cockroachdb.com on 169.254.169.254:53: no such host panic.go:522: -- test log scope end -- test logs left over in: /artifacts/tmp/_tmp/a0cd2ed7268f1ad03abb8f1d57f5fcc9/logTestLogic_testserver_test_22_1_22_21972940462 --- FAIL: TestLogic_testserver_test_22_1_22_2 (0.04s) ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/sql-queries <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestLogic_testserver_test_22_1_22_2.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-21853
test
pkg sql logictest tests cockroach go testserver cockroach go testserver test testlogic testserver test failed pkg sql logictest tests cockroach go testserver cockroach go testserver test testlogic testserver test with on master run testlogic testserver test test log scope go test logs captured to artifacts tmp tmp logtestlogic testserver test test log scope go use show logs to present logs inline logic go error while processing logic go get dial tcp lookup binaries cockroachdb com on no such host panic go test log scope end test logs left over in artifacts tmp tmp logtestlogic testserver test fail testlogic testserver test help see also cc cockroachdb sql queries jira issue crdb
1
7,856
25,839,328,491
IssuesEvent
2022-12-12 22:29:32
AzureAD/microsoft-authentication-library-for-objc
https://api.github.com/repos/AzureAD/microsoft-authentication-library-for-objc
closed
Automation tests failure
automation failure
@AzureAD/appleidentity Automation failed for [AzureAD/microsoft-authentication-library-for-objc](https://github.com/AzureAD/microsoft-authentication-library-for-objc) ran against commit : Add new allowGettingAccessTokenWithRefreshToken in MSALSilentTokenPar… (#1592) [2a9b2ab60bc9d5d3004990909e793dd73a6dfdca] Pipeline URL : [https://identitydivision.visualstudio.com/IDDP/_build/results?buildId=1023705&view=logs](https://identitydivision.visualstudio.com/IDDP/_build/results?buildId=1023705&view=logs)
1.0
Automation tests failure - @AzureAD/appleidentity Automation failed for [AzureAD/microsoft-authentication-library-for-objc](https://github.com/AzureAD/microsoft-authentication-library-for-objc) ran against commit : Add new allowGettingAccessTokenWithRefreshToken in MSALSilentTokenPar… (#1592) [2a9b2ab60bc9d5d3004990909e793dd73a6dfdca] Pipeline URL : [https://identitydivision.visualstudio.com/IDDP/_build/results?buildId=1023705&view=logs](https://identitydivision.visualstudio.com/IDDP/_build/results?buildId=1023705&view=logs)
non_test
automation tests failure azuread appleidentity automation failed for ran against commit add new allowgettingaccesstokenwithrefreshtoken in msalsilenttokenpar… pipeline url
0
211,827
16,371,300,576
IssuesEvent
2021-05-15 06:59:12
nrwl/nx
https://api.github.com/repos/nrwl/nx
closed
NX ERROR Something went wrong - create-nx-workspace@latest -
blocked: retry with latest scope: core type: bug
## Current Behavior Error when trying to create a workspace from the latest version. ## Expected Behavior Run the command without any error, correctly creating the workspace. ## Steps to Reproduce ```bash docker run --rm --interactive --tty node /bin/bash -c "cd && npx --yes create-nx-workspace@latest workspace --preset=empty --cli=nx --interactive=false --nx-cloud=false" ``` ### Failure Logs ```bash > NX Nx is creating your workspace. To make sure the command works reliably in all environments, and that the preset is applied correctly, Nx will run "npm install" several times. Please wait. ⠙ Creating your workspace > NX ERROR Something went wrong. Rerunning the command with verbose logging. /workspace is not an empty directory. > NX ERROR Something went wrong! v12.0.8 node:child_process:690 err = new Error(msg); ^ Error: Command failed: npx tao new workspace --preset=empty --no-interactive --no-nxCloud --appName= --linter=eslint --collection=@nrwl/workspace/collection.json --cli=nx --nxWorkspaceRoot="/" at checkExecSyncError (node:child_process:690:11) at Object.execSync (node:child_process:727:15) at /root/.npm/_npx/505743838affa773/node_modules/create-nx-workspace/bin/create-nx-workspace.js:464:29 at Generator.throw (<anonymous>) at rejected (/root/.npm/_npx/505743838affa773/node_modules/tslib/tslib.js:115:69) at processTicksAndRejections (node:internal/process/task_queues:94:5) { status: 1, signal: null, output: [ null, null, null ], pid: 104, stdout: null, stderr: null } npm notice npm notice New minor version of npm available! 7.6.3 -> 7.11.1 npm notice Changelog: https://github.com/npm/cli/releases/tag/v7.11.1 npm notice Run npm install -g npm@7.11.1 to update! npm notice ``` ### Environment ``` docker run --rm --interactive --tty node /bin/bash -c "node -v ; npx -v" v15.12.0 7.6.3 ```
1.0
NX ERROR Something went wrong - create-nx-workspace@latest - - ## Current Behavior Error when trying to create a workspace from the latest version. ## Expected Behavior Run the command without any error, correctly creating the workspace. ## Steps to Reproduce ```bash docker run --rm --interactive --tty node /bin/bash -c "cd && npx --yes create-nx-workspace@latest workspace --preset=empty --cli=nx --interactive=false --nx-cloud=false" ``` ### Failure Logs ```bash > NX Nx is creating your workspace. To make sure the command works reliably in all environments, and that the preset is applied correctly, Nx will run "npm install" several times. Please wait. ⠙ Creating your workspace > NX ERROR Something went wrong. Rerunning the command with verbose logging. /workspace is not an empty directory. > NX ERROR Something went wrong! v12.0.8 node:child_process:690 err = new Error(msg); ^ Error: Command failed: npx tao new workspace --preset=empty --no-interactive --no-nxCloud --appName= --linter=eslint --collection=@nrwl/workspace/collection.json --cli=nx --nxWorkspaceRoot="/" at checkExecSyncError (node:child_process:690:11) at Object.execSync (node:child_process:727:15) at /root/.npm/_npx/505743838affa773/node_modules/create-nx-workspace/bin/create-nx-workspace.js:464:29 at Generator.throw (<anonymous>) at rejected (/root/.npm/_npx/505743838affa773/node_modules/tslib/tslib.js:115:69) at processTicksAndRejections (node:internal/process/task_queues:94:5) { status: 1, signal: null, output: [ null, null, null ], pid: 104, stdout: null, stderr: null } npm notice npm notice New minor version of npm available! 7.6.3 -> 7.11.1 npm notice Changelog: https://github.com/npm/cli/releases/tag/v7.11.1 npm notice Run npm install -g npm@7.11.1 to update! npm notice ``` ### Environment ``` docker run --rm --interactive --tty node /bin/bash -c "node -v ; npx -v" v15.12.0 7.6.3 ```
test
nx error something went wrong create nx workspace latest current behavior error when trying to create a workspace from the latest version expected behavior run the command without any error correctly creating the workspace steps to reproduce bash docker run rm interactive tty node bin bash c cd npx yes create nx workspace latest workspace preset empty cli nx interactive false nx cloud false failure logs bash nx nx is creating your workspace to make sure the command works reliably in all environments and that the preset is applied correctly nx will run npm install several times please wait ⠙ creating your workspace nx error something went wrong rerunning the command with verbose logging workspace is not an empty directory nx error something went wrong node child process err new error msg error command failed npx tao new workspace preset empty no interactive no nxcloud appname linter eslint collection nrwl workspace collection json cli nx nxworkspaceroot at checkexecsyncerror node child process at object execsync node child process at root npm npx node modules create nx workspace bin create nx workspace js at generator throw at rejected root npm npx node modules tslib tslib js at processticksandrejections node internal process task queues status signal null output pid stdout null stderr null npm notice npm notice new minor version of npm available npm notice changelog npm notice run npm install g npm to update npm notice environment docker run rm interactive tty node bin bash c node v npx v
1
233,610
17,872,926,642
IssuesEvent
2021-09-06 19:11:45
fga-eps-mds/2021-1-Bot
https://api.github.com/repos/fga-eps-mds/2021-1-Bot
opened
Otimização da GH Page
documentation Time-PlusUltra GH Page
## Descrição da Issue Issue com o objetivo de melhorar a página do projeto, a fim de otimizá-la e melhorar a experiência do usuário. ## Tasks: - [ ] Melhorias na navegação - [ ] Colocar a logo - [ ] Paleta de cores no site ## Critérios de Aceitação: - [ ] A página possui uma melhor navegação - [ ] SIte possui a logo - [ ] Mudanças na paleta de cores
1.0
Otimização da GH Page - ## Descrição da Issue Issue com o objetivo de melhorar a página do projeto, a fim de otimizá-la e melhorar a experiência do usuário. ## Tasks: - [ ] Melhorias na navegação - [ ] Colocar a logo - [ ] Paleta de cores no site ## Critérios de Aceitação: - [ ] A página possui uma melhor navegação - [ ] SIte possui a logo - [ ] Mudanças na paleta de cores
non_test
otimização da gh page descrição da issue issue com o objetivo de melhorar a página do projeto a fim de otimizá la e melhorar a experiência do usuário tasks melhorias na navegação colocar a logo paleta de cores no site critérios de aceitação a página possui uma melhor navegação site possui a logo mudanças na paleta de cores
0
155,675
12,266,111,092
IssuesEvent
2020-05-07 08:24:03
moneta-rb/moneta
https://api.github.com/repos/moneta-rb/moneta
closed
Stop rerunning all specs with different microsecond start times
Chore Testing
This was intended to make for super-reliable specs, but it didn't work - specs still fail randomly and need to be restarted. Instead of rerunning all specs at different usecs, we could try running them at the most advantageous usec possible (i.e. either start or end of tres)
1.0
Stop rerunning all specs with different microsecond start times - This was intended to make for super-reliable specs, but it didn't work - specs still fail randomly and need to be restarted. Instead of rerunning all specs at different usecs, we could try running them at the most advantageous usec possible (i.e. either start or end of tres)
test
stop rerunning all specs with different microsecond start times this was intended to make for super reliable specs but it didn t work specs still fail randomly and need to be restarted instead of rerunning all specs at different usecs we could try running them at the most advantageous usec possible i e either start or end of tres
1
153,971
12,179,192,620
IssuesEvent
2020-04-28 10:15:00
AdventistCommons/adventistcommons.org
https://api.github.com/repos/AdventistCommons/adventistcommons.org
closed
Translation Error
To test bug
Mother language is German in my profile but I cannot translate. <img width="1172" alt="Screen Shot 2020-04-28 at 11 27 18 AM" src="https://user-images.githubusercontent.com/51795720/80464991-43b57100-8943-11ea-816d-735aea47ab60.png">
1.0
Translation Error - Mother language is German in my profile but I cannot translate. <img width="1172" alt="Screen Shot 2020-04-28 at 11 27 18 AM" src="https://user-images.githubusercontent.com/51795720/80464991-43b57100-8943-11ea-816d-735aea47ab60.png">
test
translation error mother language is german in my profile but i cannot translate img width alt screen shot at am src
1
71,808
15,208,435,628
IssuesEvent
2021-02-17 02:38:31
comit-network/xmr-btc-swap
https://api.github.com/repos/comit-network/xmr-btc-swap
closed
Protect against rogue key attack during key generation
enhancement security
## Situation Currently, Alice sends to Bob the following keys in `alice::Message0`: - Bitcoin public key `A`; - Monero public spend key `S_a^xmr`; - Bitcoin public key `S_a^btc`; and - Monero private view key `v_a`. Bob replies with his own `bob::Message0`: - Bitcoin public key `B`; - Monero public spend key `S_b^xmr`; - Bitcoin public key `S_b^btc`; and - Monero private view key `v_b`. I've been thinking about what Bob can do with the information in `alice::Message0` before he commits to his own keys. ## Possible attacks ### Monero shared output For instance, `S_a^xmr + S_b^xmr` is the public spend key of their Monero shared output, so Bob could theoretically adaptively choose his `S_b^xmr = S_r - S_a^xmr`, with knowledge of the the corresponding private key `s_r` for the _rogue key_ `S_r`, so that when combined `S_a^xmr + S_b^xmr = S_a^xmr + S_r - S_a^xmr = S_r` giving control to Bob over the shared output immediately. In practice, because we attach a DLEQ proof to `bob::Message0` which proves knowledge of the secret key for `S_b^xmr`, Bob doesn't actually have the ability to choose a key that would cancel out Alice's, because he cannot compute the private key corresponding to the public key `S_r - S_a^xmr` thanks to the discrete log problem being hard. I can't imagine this DLEQ proof ever being removed, so I would say this is safe. It's worth noting that Bob could also exploit prior knowledge of `S_a^btc` in the same way. Similarly, Alice is protected by the DLEQ proof that Bob must provide which proves knowledge of `s_b` so that Bob cannot choose `S_b^btc` adaptively. ### Bitcoin shared output The Bitcoin shared output is not an aggregation of keys, but a multisignature Bitcoin script, so in its current state it is not susceptible to the same kind of rogue key attack. On the other hand, this could change. 
As soon as [BIP340](https://en.bitcoin.it/wiki/BIP_0340) hits we would probably construct this output using Schnorr signatures to make the protocol truly scriptless. The output would be vulnerable to the rogue key attack outlined above and we would have to modify the protocol slightly. ### Monero private view keys The private view keys are already being exchanged publicly because there's nothing tricky to do with them. ## Conclusion In my opinion, the key generation phase is currently safe. I still recommend we future-proof our protocol before it becomes too cumbersome to modify it. I think the Monero output will always remain protected by the required DLEQ proof, but the Bitcoin one would be vulnerable after a predictable upgrade to the protocol (i.e. using BIP340). I would replace the current 2-round key generation phase with a 3-generation phase which uses a commitment scheme to protect Alice: 1. Alice sends commitment to her keys to Bob. 2. Bob sends his keys in the open to Alice. 3. Alice sends the opening of her commitment to Bob. This is the same procedure we followed for our [Grin-Bitcoin protocol](https://github.com/comit-network/grin-btc-poc). We could implement it in a similar way by sending a hash of all the keys concatenated in a particular order as the commitment.
True
Protect against rogue key attack during key generation - ## Situation Currently, Alice sends to Bob the following keys in `alice::Message0`: - Bitcoin public key `A`; - Monero public spend key `S_a^xmr`; - Bitcoin public key `S_a^btc`; and - Monero private view key `v_a`. Bob replies with his own `bob::Message0`: - Bitcoin public key `B`; - Monero public spend key `S_b^xmr`; - Bitcoin public key `S_b^btc`; and - Monero private view key `v_b`. I've been thinking about what Bob can do with the information in `alice::Message0` before he commits to his own keys. ## Possible attacks ### Monero shared output For instance, `S_a^xmr + S_b^xmr` is the public spend key of their Monero shared output, so Bob could theoretically adaptively choose his `S_b^xmr = S_r - S_a^xmr`, with knowledge of the the corresponding private key `s_r` for the _rogue key_ `S_r`, so that when combined `S_a^xmr + S_b^xmr = S_a^xmr + S_r - S_a^xmr = S_r` giving control to Bob over the shared output immediately. In practice, because we attach a DLEQ proof to `bob::Message0` which proves knowledge of the secret key for `S_b^xmr`, Bob doesn't actually have the ability to choose a key that would cancel out Alice's, because he cannot compute the private key corresponding to the public key `S_r - S_a^xmr` thanks to the discrete log problem being hard. I can't imagine this DLEQ proof ever being removed, so I would say this is safe. It's worth noting that Bob could also exploit prior knowledge of `S_a^btc` in the same way. Similarly, Alice is protected by the DLEQ proof that Bob must provide which proves knowledge of `s_b` so that Bob cannot choose `S_b^btc` adaptively. ### Bitcoin shared output The Bitcoin shared output is not an aggregation of keys, but a multisignature Bitcoin script, so in its current state it is not susceptible to the same kind of rogue key attack. On the other hand, this could change. As soon as [BIP340](https://en.bitcoin.it/wiki/BIP_0340) hits we would probably construct this output using Schnorr signatures to make the protocol truly scriptless. The output would be vulnerable to the rogue key attack outlined above and we would have to modify the protocol slightly. ### Monero private view keys The private view keys are already being exchanged publicly because there's nothing tricky to do with them. ## Conclusion In my opinion, the key generation phase is currently safe. I still recommend we future-proof our protocol before it becomes too cumbersome to modify it. I think the Monero output will always remain protected by the required DLEQ proof, but the Bitcoin one would be vulnerable after a predictable upgrade to the protocol (i.e. using BIP340). I would replace the current 2-round key generation phase with a 3-generation phase which uses a commitment scheme to protect Alice: 1. Alice sends commitment to her keys to Bob. 2. Bob sends his keys in the open to Alice. 3. Alice sends the opening of her commitment to Bob. This is the same procedure we followed for our [Grin-Bitcoin protocol](https://github.com/comit-network/grin-btc-poc). We could implement it in a similar way by sending a hash of all the keys concatenated in a particular order as the commitment.
non_test
protect against rogue key attack during key generation situation currently alice sends to bob the following keys in alice bitcoin public key a monero public spend key s a xmr bitcoin public key s a btc and monero private view key v a bob replies with his own bob bitcoin public key b monero public spend key s b xmr bitcoin public key s b btc and monero private view key v b i ve been thinking about what bob can do with the information in alice before he commits to his own keys possible attacks monero shared output for instance s a xmr s b xmr is the public spend key of their monero shared output so bob could theoretically adaptively choose his s b xmr s r s a xmr with knowledge of the the corresponding private key s r for the rogue key s r so that when combined s a xmr s b xmr s a xmr s r s a xmr s r giving control to bob over the shared output immediately in practice because we attach a dleq proof to bob which proves knowledge of the secret key for s b xmr bob doesn t actually have the ability to choose a key that would cancel out alice s because he cannot compute the private key corresponding to the public key s r s a xmr thanks to the discrete log problem being hard i can t imagine this dleq proof ever being removed so i would say this is safe it s worth noting that bob could also exploit prior knowledge of s a btc in the same way similarly alice is protected by the dleq proof that bob must provide which proves knowledge of s b so that bob cannot choose s b btc adaptively bitcoin shared output the bitcoin shared output is not an aggregation of keys but a multisignature bitcoin script so in its current state it is not susceptible to the same kind of rogue key attack on the other hand this could change as soon as hits we would probably construct this output using schnorr signatures to make the protocol truly scriptless the output would be vulnerable to the rogue key attack outlined above and we would have to modify the protocol slightly monero private view keys the private view keys are already being exchanged publicly because there s nothing tricky to do with them conclusion in my opinion the key generation phase is currently safe i still recommend we future proof our protocol before it becomes too cumbersome to modify it i think the monero output will always remain protected by the required dleq proof but the bitcoin one would be vulnerable after a predictable upgrade to the protocol i e using i would replace the current round key generation phase with a generation phase which uses a commitment scheme to protect alice alice sends commitment to her keys to bob bob sends his keys in the open to alice alice sends the opening of her commitment to bob this is the same procedure we followed for our we could implement it in a similar way by sending a hash of all the keys concatenated in a particular order as the commitment
0
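The rogue-key cancellation and the hash-commitment fix described in the record above can be sketched numerically. This is only a toy additive group mod a prime standing in for a real curve (ed25519/secp256k1), and every key value below is hypothetical; it just illustrates why Bob must reveal his keys before seeing Alice's.

```python
import hashlib

# Toy "group": integers mod p under addition, public key = g*x mod p.
# Discrete log is trivially invertible here, so this is an analogy only.
p = 2**127 - 1  # hypothetical toy modulus, NOT a real curve order
g = 5

def pub(x):
    return (g * x) % p

# Rogue-key attack: Bob sees Alice's S_a first and picks
# S_b = pub(s_r) - S_a, so the aggregate S_a + S_b collapses to
# pub(s_r), a key Bob fully controls (the attack the DLEQ proof blocks,
# since Bob cannot prove knowledge of a private key for that S_b).
s_a, s_r = 1234, 5678      # hypothetical secrets
S_a = pub(s_a)
S_b = (pub(s_r) - S_a) % p
assert (S_a + S_b) % p == pub(s_r)

# Commitment fix (the 3-round scheme from the record): Alice sends a
# hash of her keys, Bob replies in the open, Alice then opens.
def commit(*keys):
    h = hashlib.sha256()
    for k in keys:           # keys concatenated in a fixed order
        h.update(k.to_bytes(16, "big"))
    return h.hexdigest()

c = commit(S_a)              # round 1: Alice commits
# round 2: Bob sends his keys without having seen S_a
assert commit(S_a) == c      # round 3: Alice opens; Bob verifies
```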
204,712
15,528,580,513
IssuesEvent
2021-03-13 11:36:09
omapsapp/omapsapp
https://api.github.com/repos/omapsapp/omapsapp
opened
world_feed_integration_tests.cpp::WorldFeedIntegrationTests_FeedWithWrongStopsOrder fails
Tests
``` cmake . -B build \ -G Ninja \ -DCMAKE_BUILD_TYPE=Debug \ -DSKIP_LONG_TESTS=ON \ -DSKIP_GUI_TESTS=ON cmake --build build --target tests git clone https://github.com/omapsapp/world_feed_integration_tests_data data/world_feed_integration_tests_data cd build && ln -s ../data data && ./world_feed_integration_tests ``` ``` Running world_feed_integration_tests.cpp::WorldFeedIntegrationTests_FeedWithWrongStopsOrder Inited generator with 0 start id and path to mappings data/world_feed_integration_tests_data/mapping.txt Mapping data/world_feed_integration_tests_data/mapping.txt is not yet created. Inited generator with 0 start id and path to mappings data/world_feed_integration_tests_data/mapping_edges.txt Mapping data/world_feed_integration_tests_data/mapping_edges.txt is not yet created. Filled networks. Filled routes. Filled lines and shapes. Deleted 0 sub-shapes. 1 left. Modified lines and shapes. Filled schedule for lines. Filled stop timetables and road graph edges. Error projecting stops to the shape. GTFS trip id t_1077833_b_26220_tn_1 shapeId 3 stopId 7 i 3 start index on shape 689 trips count 1 Modified shapes. Filled transfers. Filled gates. Updated edges weights. FAILED world_feed_integration_tests/world_feed_integration_tests.cpp:99 TEST(!m_globalFeed.SetFeed(std::move(feed))) Test took 10 ms 1 tests failed: world_feed_integration_tests.cpp::WorldFeedIntegrationTests_FeedWithWrongStopsOrder Some tests FAILED. ```
1.0
world_feed_integration_tests.cpp::WorldFeedIntegrationTests_FeedWithWrongStopsOrder fails - ``` cmake . -B build \ -G Ninja \ -DCMAKE_BUILD_TYPE=Debug \ -DSKIP_LONG_TESTS=ON \ -DSKIP_GUI_TESTS=ON) cmake --build build --target tests git clone https://github.com/omapsapp/world_feed_integration_tests_data data/world_feed_integration_tests_data cd build && ln -s ../data data && ./world_feed_integration_tests ``` ``` Running world_feed_integration_tests.cpp::WorldFeedIntegrationTests_FeedWithWrongStopsOrder Inited generator with 0 start id and path to mappings data/world_feed_integration_tests_data/mapping.txt Mapping data/world_feed_integration_tests_data/mapping.txt is not yet created. Inited generator with 0 start id and path to mappings data/world_feed_integration_tests_data/mapping_edges.txt Mapping data/world_feed_integration_tests_data/mapping_edges.txt is not yet created. Filled networks. Filled routes. Filled lines and shapes. Deleted 0 sub-shapes. 1 left. Modified lines and shapes. Filled schedule for lines. Filled stop timetables and road graph edges. Error projecting stops to the shape. GTFS trip id t_1077833_b_26220_tn_1 shapeId 3 stopId 7 i 3 start index on shape 689 trips count 1 Modified shapes. Filled transfers. Filled gates. Updated edges weights. FAILED world_feed_integration_tests/world_feed_integration_tests.cpp:99 TEST(!m_globalFeed.SetFeed(std::move(feed))) Test took 10 ms 1 tests failed: world_feed_integration_tests.cpp::WorldFeedIntegrationTests_FeedWithWrongStopsOrder Some tests FAILED. ```
test
world feed integration tests cpp worldfeedintegrationtests feedwithwrongstopsorder fails cmake b build g ninja dcmake build type debug dskip long tests on dskip gui tests on cmake build build target tests git clone data world feed integration tests data cd build ln s data data world feed integration tests running world feed integration tests cpp worldfeedintegrationtests feedwithwrongstopsorder inited generator with start id and path to mappings data world feed integration tests data mapping txt mapping data world feed integration tests data mapping txt is not yet created inited generator with start id and path to mappings data world feed integration tests data mapping edges txt mapping data world feed integration tests data mapping edges txt is not yet created filled networks filled routes filled lines and shapes deleted sub shapes left modified lines and shapes filled schedule for lines filled stop timetables and road graph edges error projecting stops to the shape gtfs trip id t b tn shapeid stopid i start index on shape trips count modified shapes filled transfers filled gates updated edges weights failed world feed integration tests world feed integration tests cpp test m globalfeed setfeed std move feed test took ms tests failed world feed integration tests cpp worldfeedintegrationtests feedwithwrongstopsorder some tests failed
1
196,679
6,937,486,760
IssuesEvent
2017-12-04 05:18:51
ruany/litebans-php
https://api.github.com/repos/ruany/litebans-php
closed
Multiple Server Support
Medium priority Suggestion
Add support for multiple servers, Should be easy and what I mean is add an extra column with what server the player was banned/muted/kicked on
1.0
Multiple Server Support - Add support for multiple servers, Should be easy and what I mean is add an extra column with what server the player was banned/muted/kicked on
non_test
multiple server support add support for multiple servers should be easy and what i mean is add an extra column with what server the player was banned muted kicked on
0
250,575
21,315,707,675
IssuesEvent
2022-04-16 08:27:24
Uuvana-Studios/longvinter-windows-client
https://api.github.com/repos/Uuvana-Studios/longvinter-windows-client
closed
텐트위에 다른 집이 보임 (another player's house is visible on top of my tent)
Bug Not Tested
**Describe the bug** A clear and concise description of what the bug is. **To Reproduce** Steps to reproduce the behavior: 1. Go to '...' 2. Click on '....' 3. Scroll down to '....' 4. See an error **Expected behavior** A clear and concise description of what ![HighresScreenshot00001](https://user-images.githubusercontent.com/103818187/163667778-e1b8d945-2de4-474a-9c85-d57cbf755c77.png) ![HighresScreenshot00002](https://user-images.githubusercontent.com/103818187/163667781-b392ccf4-48e2-4f18-94a5-0b2c570b0508.png) ![HighresScreenshot00000](https://user-images.githubusercontent.com/103818187/163667783-15452b75-08db-4239-8cc3-5d6e63a4fcb8.png) you expected to happen. **Screenshots** If applicable, add screenshots to help explain your problem. **Desktop (please complete the following information):** - OS: [e.g. Windows] - Game Version [e.g. 1.0] - Steam Version [e.g. 1.0] **Additional context** Add any other context about the problem here.
1.0
텐트위에 다른 집이 보임 - **Describe the bug** A clear and concise description of what the bug is. **To Reproduce** Steps to reproduce the behavior: 1. Go to '...' 2. Click on '....' 3. Scroll down to '....' 4. See an error **Expected behavior** A clear and concise description of what ![HighresScreenshot00001](https://user-images.githubusercontent.com/103818187/163667778-e1b8d945-2de4-474a-9c85-d57cbf755c77.png) ![HighresScreenshot00002](https://user-images.githubusercontent.com/103818187/163667781-b392ccf4-48e2-4f18-94a5-0b2c570b0508.png) ![HighresScreenshot00000](https://user-images.githubusercontent.com/103818187/163667783-15452b75-08db-4239-8cc3-5d6e63a4fcb8.png) you expected to happen. **Screenshots** If applicable, add screenshots to help explain your problem. **Desktop (please complete the following information):** - OS: [e.g. Windows] - Game Version [e.g. 1.0] - Steam Version [e.g. 1.0] **Additional context** Add any other context about the problem here.
test
텐트위에 다른 집이 보임 describe the bug a clear and concise description of what the bug is to reproduce steps to reproduce the behavior go to click on scroll down to see an error expected behavior a clear and concise description of what you expected to happen screenshots if applicable add screenshots to help explain your problem desktop please complete the following information os game version steam version additional context add any other context about the problem here
1
39,913
5,256,435,721
IssuesEvent
2017-02-02 17:54:42
pazz/alot
https://api.github.com/repos/pazz/alot
closed
Unit Tests
deployment important tests
Okay, lets discuss writing unit tests, since it sounds like we agree that is probably the next major milestone for alot. Here's some thoughts I have from experience writing a lot of unit tests, and writing unit tests for a fairly mature project. Writing a ton of new unit tests is a daunting amount of work, but writing them incrementally here and there is much easier. I'd suggest that new features should be required to come with unit tests immediately, and that the tests are reviewed and part of the review process. I also suggest that we turn on unit tests in travis immediately, and enforce a policy of no unit test breaks across a series (since that makes bisecting hard). It may also be worth breaking out the unit test milestone into smaller chunks if we want to track it (say per module or per feature) with a milestone target, though I don't know how much of that kind of tracking we want to do. The second thing I'll suggest is not using the built in unittest module for writing the tests. I've written a *lot* of tests, with unittest, nose, nose2 and pytest. I will say of those pytest is by far the best, nose2 is next (though last time I used it it was immature), nose, and then the builtin unittest module. The thing that makes pytest the best is that it is the most "batteries included" of the ones I've used, and removes the need to write a lot of boilerplate. I don't think its a problem (within reason) to pull in dependencies for testing. Mock is another module that is *incredibly* useful for writing good unit tests, though it's a bit tricky to know when to use mock. Also, I think it would be worth while for some of us newer contributers to work on the unit tests since it will force us to learn more of the code base.
1.0
Unit Tests - Okay, lets discuss writing unit tests, since it sounds like we agree that is probably the next major milestone for alot. Here's some thoughts I have from experience writing a lot of unit tests, and writing unit tests for a fairly mature project. Writing a ton of new unit tests is a daunting amount of work, but writing them incrementally here and there is much easier. I'd suggest that new features should be required to come with unit tests immediately, and that the tests are reviewed and part of the review process. I also suggest that we turn on unit tests in travis immediately, and enforce a policy of no unit test breaks across a series (since that makes bisecting hard). It may also be worth breaking out the unit test milestone into smaller chunks if we want to track it (say per module or per feature) with a milestone target, though I don't know how much of that kind of tracking we want to do. The second thing I'll suggest is not using the built in unittest module for writing the tests. I've written a *lot* of tests, with unittest, nose, nose2 and pytest. I will say of those pytest is by far the best, nose2 is next (though last time I used it it was immature), nose, and then the builtin unittest module. The thing that makes pytest the best is that it is the most "batteries included" of the ones I've used, and removes the need to write a lot of boilerplate. I don't think its a problem (within reason) to pull in dependencies for testing. Mock is another module that is *incredibly* useful for writing good unit tests, though it's a bit tricky to know when to use mock. Also, I think it would be worth while for some of us newer contributers to work on the unit tests since it will force us to learn more of the code base.
test
unit tests okay lets discuss writing unit tests since it sounds like we agree that is probably the next major milestone for alot here s some thoughts i have from experience writing a lot of unit tests and writing unit tests for a fairly mature project writing a ton of new unit tests is a daunting amount of work but writing them incrementally here and there is much easier i d suggest that new features should be required to come with unit tests immediately and that the tests are reviewed and part of the review process i also suggest that we turn on unit tests in travis immediately and enforce a policy of no unit test breaks across a series since that makes bisecting hard it may also be worth breaking out the unit test milestone into smaller chunks if we want to track it say per module or per feature with a milestone target though i don t know how much of that kind of tracking we want to do the second thing i ll suggest is not using the built in unittest module for writing the tests i ve written a lot of tests with unittest nose and pytest i will say of those pytest is by far the best is next though last time i used it it was immature nose and then the builtin unittest module the thing that makes pytest the best is that it is the most batteries included of the ones i ve used and removes the need to write a lot of boilerplate i don t think its a problem within reason to pull in dependencies for testing mock is another module that is incredibly useful for writing good unit tests though it s a bit tricky to know when to use mock also i think it would be worth while for some of us newer contributers to work on the unit tests since it will force us to learn more of the code base
1
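The boilerplate contrast the alot issue above draws between the built-in `unittest` module and pytest can be shown side by side. `parse_addr` is a hypothetical helper invented for illustration, not part of alot:

```python
import unittest

def parse_addr(s):
    """Split 'Name <addr>' into (name, addr) -- illustration only."""
    name, _, addr = s.rpartition(" ")
    return name.strip(), addr.strip("<>")

class TestParseAddr(unittest.TestCase):   # unittest: class, camelCase asserts
    def test_basic(self):
        self.assertEqual(parse_addr("Alice <a@x.org>"), ("Alice", "a@x.org"))

def test_basic_pytest():                  # pytest: plain function, bare assert
    assert parse_addr("Alice <a@x.org>") == ("Alice", "a@x.org")
```

pytest collects the plain function directly and rewrites the bare `assert` to report both operands on failure, which is the "batteries included" point the issue makes.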
79,365
28,132,276,280
IssuesEvent
2023-04-01 01:49:45
dotCMS/core
https://api.github.com/repos/dotCMS/core
opened
Template Builder Add explicit rows and revamp
Type : Defect Epic Team : Lunik Triage OKR : Core Features OKR : Sales OKR : Marketing Priority : 2 High Next Release
### Parent Issue _No response_ ### Problem Statement dotCMS template builder has been a source of frustration for marketers and developers, who struggle to create, edit, and reorganize page templates effectively. Lack the flexibility to move rows and columns around without breaking the entire template, which leads to wasted time and frustration. ### Steps to Reproduce N/A ### Acceptance Criteria N/A ### dotCMS Version master ### Proposed Objective Core Features ### Proposed Priority Priority 2 - Important ### External Links... Slack Conversations, Support Tickets, Figma Designs, etc. _No response_ ### Assumptions & Initiation Needs _No response_ ### Quality Assurance Notes & Workarounds _No response_ ### Sub-Tasks & Estimates _No response_
1.0
Template Builder Add explicit rows and revamp - ### Parent Issue _No response_ ### Problem Statement dotCMS template builder has been a source of frustration for marketers and developers, who struggle to create, edit, and reorganize page templates effectively. Lack the flexibility to move rows and columns around without breaking the entire template, which leads to wasted time and frustration. ### Steps to Reproduce N/A ### Acceptance Criteria N/A ### dotCMS Version master ### Proposed Objective Core Features ### Proposed Priority Priority 2 - Important ### External Links... Slack Conversations, Support Tickets, Figma Designs, etc. _No response_ ### Assumptions & Initiation Needs _No response_ ### Quality Assurance Notes & Workarounds _No response_ ### Sub-Tasks & Estimates _No response_
non_test
template builder add explicit rows and revamp parent issue no response problem statement dotcms template builder has been a source of frustration for marketers and developers who struggle to create edit and reorganize page templates effectively lack the flexibility to move rows and columns around without breaking the entire template which leads to wasted time and frustration steps to reproduce n a acceptance criteria n a dotcms version master proposed objective core features proposed priority priority important external links slack conversations support tickets figma designs etc no response assumptions initiation needs no response quality assurance notes workarounds no response sub tasks estimates no response
0
314,431
26,999,920,672
IssuesEvent
2023-02-10 06:37:46
rancher/dashboard
https://api.github.com/repos/rancher/dashboard
closed
[2.6.x backport]: Resource watch re-subscription causes API load / leader election
kind/bug [zube]: To Test priority/0 JIRA area/performance
We have a solution for the resource watch in 2.7.x that we need to backport to 2.6.x. 2.7.x work: https://github.com/rancher/dashboard/issues/5997 Internal reference: SURE-5317
1.0
[2.6.x backport]: Resource watch re-subscription causes API load / leader election - We have a solution for the resource watch in 2.7.x that we need to backport to 2.6.x. 2.7.x work: https://github.com/rancher/dashboard/issues/5997 Internal reference: SURE-5317
test
resource watch re subscription causes api load leader election we have a solution for the resource watch in x that we need to backport to x x work internal reference sure
1
331,642
29,044,891,052
IssuesEvent
2023-05-13 12:33:17
epage/pytest-rs
https://api.github.com/repos/epage/pytest-rs
opened
Custom arguments and built-in arguments can conflict
A-pytest C-enhancement
This creates a compatibility hazard Ideas - Namespace rust - Namespace plugins - `--ext-` - `--<name>::` and warn if one is used that isn't registered
1.0
Custom arguments and built-in arguments can conflict - This creates a compatibility hazard Ideas - Namespace rust - Namespace plugins - `--ext-` - `--<name>::` and warn if one is used that isn't registered
test
custom arguments and built in arguments can conflict this creates a compatibility hazard ideas namespace rust namespace plugins ext and warn if one is used that isn t registered
1
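The `--ext-` namespacing idea listed in the pytest-rs issue above can be sketched with Python's `argparse` standing in for the real parser; the `--ext-report` option and the registry set are hypothetical:

```python
import argparse
import warnings

# Hypothetical registry of plugin options living under the reserved prefix.
REGISTERED_EXT = {"--ext-report"}

def parse(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument("--verbose", action="store_true")  # built-in option
    parser.add_argument("--ext-report")                    # plugin, namespaced
    args, unknown = parser.parse_known_args(argv)
    # Warn about prefixed options no plugin registered, per the issue's idea.
    for tok in unknown:
        if tok.startswith("--ext-") and tok.split("=")[0] not in REGISTERED_EXT:
            warnings.warn(f"unregistered plugin option: {tok}")
    return args

ns = parse(["--verbose", "--ext-report=out.txt"])
assert ns.verbose and ns.ext_report == "out.txt"
```

Because plugins can only claim names under `--ext-`, they can never shadow a built-in flag, which removes the compatibility hazard the issue describes.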
221,629
17,361,495,577
IssuesEvent
2021-07-29 21:22:42
microsoft/vscode-python
https://api.github.com/repos/microsoft/vscode-python
closed
nosetests is run with incorrect args
area-testing good first issue needs PR reason-preexisting type-bug
From what I can tell, in [nosetest's `TestManagerRunner.runTest()`](https://github.com/Microsoft/vscode-python/blob/90aedaa/src/client/unittests/nosetest/runner.ts) `testPaths` is getting added to the args list twice. The first time is on line 60 and the second time on line 69. It looks like this slipped through in [a refactor last June](https://github.com/Microsoft/vscode-python/commit/e585a15aa79604c581bdcdef46c6f40087514e82#diff-7e430a2c0bec5dcadcab49a2bf49445bR69). - [ ] add a test for nosetest's `TestManagerRunner.runTest()` that fails due to the bug - [ ] fix the args list FYI, I noticed this while working on #4376.
1.0
nosetests is run with incorrect args - From what I can tell, in [nosetest's `TestManagerRunner.runTest()`](https://github.com/Microsoft/vscode-python/blob/90aedaa/src/client/unittests/nosetest/runner.ts) `testPaths` is getting added to the args list twice. The first time is on line 60 and the second time on line 69. It looks like this slipped through in [a refactor last June](https://github.com/Microsoft/vscode-python/commit/e585a15aa79604c581bdcdef46c6f40087514e82#diff-7e430a2c0bec5dcadcab49a2bf49445bR69). - [ ] add a test for nosetest's `TestManagerRunner.runTest()` that fails due to the bug - [ ] fix the args list FYI, I noticed this while working on #4376.
test
nosetests is run with incorrect args from what i can tell in testpaths is getting added to the args list twice the first time is on line and the second time on line it looks like this slipped through in add a test for nosetest s testmanagerrunner runtest that fails due to the bug fix the args list fyi i noticed this while working on
1
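The duplicated-args bug described in the vscode-python issue above follows a simple pattern. A minimal reproduction with hypothetical helper names (not the actual `runner.ts` code):

```python
def build_args_buggy(options, test_paths):
    args = list(options) + list(test_paths)  # paths appended once here...
    args += test_paths                       # ...and again here (the bug)
    return args

def build_args_fixed(options, test_paths):
    return list(options) + list(test_paths)  # paths appended exactly once

assert build_args_buggy(["-v"], ["tests/"]) == ["-v", "tests/", "tests/"]
assert build_args_fixed(["-v"], ["tests/"]) == ["-v", "tests/"]
```

A regression test asserting the exact args list, as the issue's checklist asks for, would have caught the duplicate append when it slipped in during the refactor.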
32,782
12,149,767,650
IssuesEvent
2020-04-24 16:43:57
vancopayments/rt-web-internal-docs
https://api.github.com/repos/vancopayments/rt-web-internal-docs
opened
CVE-2019-8331 (Medium) detected in bootstrap-3.0.0.min.js
security vulnerability
## CVE-2019-8331 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.0.0.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.0/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.0/js/bootstrap.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/rt-web-internal-docs/ll_CC/index.html</p> <p>Path to vulnerable library: /rt-web-internal-docs/ll_CC/index.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.0.0.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/vancopayments/rt-web-internal-docs/commits/cf2b6185bddfee72b6d2d022735e84c14c7bde3b">cf2b6185bddfee72b6d2d022735e84c14c7bde3b</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute. <p>Publish Date: 2019-02-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p> <p>Release Date: 2019-02-20</p> <p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.0.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1"}],"vulnerabilityIdentifier":"CVE-2019-8331","vulnerabilityDetails":"In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2019-8331 (Medium) detected in bootstrap-3.0.0.min.js - ## CVE-2019-8331 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.0.0.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.0/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.0/js/bootstrap.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/rt-web-internal-docs/ll_CC/index.html</p> <p>Path to vulnerable library: /rt-web-internal-docs/ll_CC/index.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.0.0.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/vancopayments/rt-web-internal-docs/commits/cf2b6185bddfee72b6d2d022735e84c14c7bde3b">cf2b6185bddfee72b6d2d022735e84c14c7bde3b</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute. 
<p>Publish Date: 2019-02-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p> <p>Release Date: 2019-02-20</p> <p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.0.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1"}],"vulnerabilityIdentifier":"CVE-2019-8331","vulnerabilityDetails":"In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_test
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file tmp ws scm rt web internal docs ll cc index html path to vulnerable library rt web internal docs ll cc index html dependency hierarchy x bootstrap min js vulnerable library found in head commit a href vulnerability details in bootstrap before and x before xss is possible in the tooltip or popover data template attribute publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap bootstrap sass isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in bootstrap before and x before xss is possible in the tooltip or popover data template attribute vulnerabilityurl
0
189,951
14,529,299,654
IssuesEvent
2020-12-14 17:37:23
Azure/azure-sdk-for-net
https://api.github.com/repos/Azure/azure-sdk-for-net
opened
[Flaky test] BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed
Azure.Core Client test-reliability
``` Error message Expected: <System.InvalidOperationException: Error at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.<>c__DisplayClass13_0.<BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed>b__0(TokenRequestContext r, CancellationToken c) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 379 at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.TokenCredentialStub.GetToken(TokenRequestContext requestContext, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 565 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.AccessTokenCache.GetHeaderValueFromCredentialAsync(HttpMessage message, Boolean async, CancellationToken cancellationToken) in //sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 217 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.AccessTokenCache.GetHeaderValueAsync(HttpMessage message, Boolean async) in //sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 110 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory1 pipeline, Boolean async) in /_/sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 61 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.Process(HttpMessage message, ReadOnlyMemory1 pipeline) in //sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 51 at Azure.Core.Pipeline.HttpPipeline.Send(HttpMessage message, CancellationToken cancellationToken) in //sdk/core/Azure.Core/src/Pipeline/HttpPipeline.cs:line 81 at Azure.Core.TestFramework.SyncAsyncPolicyTestBase.SendRequestAsync(HttpPipeline pipeline, Action1 requestAction, Boolean bufferResponse, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core.TestFramework/src/SyncAsyncPolicyTestBase.cs:line 32 at 
Azure.Core.TestFramework.SyncAsyncPolicyTestBase.SendRequestAsync(HttpPipelineTransport transport, Action1 requestAction, HttpPipelinePolicy policy, ResponseClassifier responseClassifier, Boolean bufferResponse, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core.TestFramework/src/SyncAsyncPolicyTestBase.cs:line 43 at Azure.Core.TestFramework.SyncAsyncPolicyTestBase.SendGetRequest(HttpPipelineTransport transport, HttpPipelinePolicy policy, ResponseClassifier responseClassifier, Boolean bufferResponse, Uri uri, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core.TestFramework/src/SyncAsyncPolicyTestBase.cs:line 48 at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.<>c__DisplayClass13_0.<<BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed>b__1>d.MoveNext() in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 392 --- End of stack trace from previous location where exception was thrown --- at NUnit.Framework.Internal.TaskAwaitAdapter.GenericAdapter1.GetResult() at NUnit.Framework.Internal.AsyncToSyncAdapter.Await(Func1 invoke) at NUnit.Framework.Assert.ThrowsAsync(IResolveConstraint expression, AsyncTestDelegate code, String message, Object[] args)> But was: <System.InvalidOperationException: Error at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.<>c__DisplayClass13_0.<BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed>b__0(TokenRequestContext r, CancellationToken c) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 379 at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.TokenCredentialStub.GetToken(TokenRequestContext requestContext, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 565 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.AccessTokenCache.GetHeaderValueFromCredentialAsync(HttpMessage message, 
Boolean async, CancellationToken ca Stack trace at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed() in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 395 ```
1.0
[Flaky test] BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed - ``` Error message Expected: <System.InvalidOperationException: Error at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.<>c__DisplayClass13_0.<BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed>b__0(TokenRequestContext r, CancellationToken c) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 379 at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.TokenCredentialStub.GetToken(TokenRequestContext requestContext, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 565 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.AccessTokenCache.GetHeaderValueFromCredentialAsync(HttpMessage message, Boolean async, CancellationToken cancellationToken) in //sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 217 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.AccessTokenCache.GetHeaderValueAsync(HttpMessage message, Boolean async) in //sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 110 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory1 pipeline, Boolean async) in /_/sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 61 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.Process(HttpMessage message, ReadOnlyMemory1 pipeline) in //sdk/core/Azure.Core/src/Pipeline/BearerTokenAuthenticationPolicy.cs:line 51 at Azure.Core.Pipeline.HttpPipeline.Send(HttpMessage message, CancellationToken cancellationToken) in //sdk/core/Azure.Core/src/Pipeline/HttpPipeline.cs:line 81 at Azure.Core.TestFramework.SyncAsyncPolicyTestBase.SendRequestAsync(HttpPipeline pipeline, Action1 requestAction, Boolean bufferResponse, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core.TestFramework/src/SyncAsyncPolicyTestBase.cs:line 32 at 
Azure.Core.TestFramework.SyncAsyncPolicyTestBase.SendRequestAsync(HttpPipelineTransport transport, Action1 requestAction, HttpPipelinePolicy policy, ResponseClassifier responseClassifier, Boolean bufferResponse, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core.TestFramework/src/SyncAsyncPolicyTestBase.cs:line 43 at Azure.Core.TestFramework.SyncAsyncPolicyTestBase.SendGetRequest(HttpPipelineTransport transport, HttpPipelinePolicy policy, ResponseClassifier responseClassifier, Boolean bufferResponse, Uri uri, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core.TestFramework/src/SyncAsyncPolicyTestBase.cs:line 48 at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.<>c__DisplayClass13_0.<<BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed>b__1>d.MoveNext() in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 392 --- End of stack trace from previous location where exception was thrown --- at NUnit.Framework.Internal.TaskAwaitAdapter.GenericAdapter1.GetResult() at NUnit.Framework.Internal.AsyncToSyncAdapter.Await(Func1 invoke) at NUnit.Framework.Assert.ThrowsAsync(IResolveConstraint expression, AsyncTestDelegate code, String message, Object[] args)> But was: <System.InvalidOperationException: Error at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.<>c__DisplayClass13_0.<BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed>b__0(TokenRequestContext r, CancellationToken c) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 379 at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.TokenCredentialStub.GetToken(TokenRequestContext requestContext, CancellationToken cancellationToken) in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 565 at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.AccessTokenCache.GetHeaderValueFromCredentialAsync(HttpMessage message, 
Boolean async, CancellationToken ca Stack trace at Azure.Core.Tests.BearerTokenAuthenticationPolicyTests.BearerTokenAuthenticationPolicy_GatedConcurrentCallsFailed() in /home/vsts/work/1/s/sdk/core/Azure.Core/tests/BearerTokenAuthenticationPolicyTests.cs:line 395 ```
test
1
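The gated-refresh behavior this flaky test exercises can be sketched as a toy model: when only one credential call is allowed in flight, concurrent callers share its outcome, so they can observe the very same exception instance. The class and names below are illustrative only, not Azure.Core's actual implementation (the real policy does not cache failures permanently).

```python
import threading

class GatedTokenCache:
    """Toy model of a gated token cache: only one credential fetch runs,
    and every concurrent caller shares its outcome (hypothetical names,
    not Azure.Core's API)."""

    def __init__(self, fetch):
        self._fetch = fetch           # callable that obtains a token
        self._lock = threading.Lock()
        self._result = None           # ("ok", token) or ("err", exc)

    def get_token(self):
        with self._lock:              # the "gate": one fetch at a time
            if self._result is None:
                try:
                    self._result = ("ok", self._fetch())
                except Exception as exc:
                    self._result = ("err", exc)
        kind, value = self._result
        if kind == "err":
            raise value               # all callers re-raise the same instance
        return value
```

Because every caller re-raises the one cached instance, a test that asserts on which caller's exception "wins" is inherently racy, which matches the failure above where the expected and actual exceptions differ only by call site.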
747,491
26,087,391,577
IssuesEvent
2022-12-26 05:57:47
canaltin-byte/SWE573-SDP-Can
https://api.github.com/repos/canaltin-byte/SWE573-SDP-Can
closed
Home Page should have a place to search contents
enhancement priority : Low Effort:High Home Page
Home Page should have a place to search contents
1.0
non_test
0
35,750
14,861,757,661
IssuesEvent
2021-01-18 23:48:36
boto/boto3
https://api.github.com/repos/boto/boto3
closed
overwrite boto3 version for AWS Glue Python Shell jobs
bug closed-for-staleness service-api
I need to use a newer boto3 package for an AWS Glue Python3 shell job (Glue Version: 1.0). I included the wheel file downloaded from: https://pypi.org/project/boto3/1.13.21/#files: `boto3-1.13.21-py2.py3-none-any.whl` under Python Library Path. However, `boto3.__version__` prints out `1.9.203` even if the job log console says `boto3==1.13.21` was successfully installed. For some reason, Glue Python3 Shell job (Glue Version: 1.0) is not letting me overwrite the boto3 package version with the wheel file. Is there any way to overwrite?
1.0
non_test
0
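The boto3 issue above comes down to import resolution order: if the directory holding the preinstalled boto3 precedes the wheel's extraction directory on `sys.path`, the old copy wins. A minimal sketch of the usual workaround, reordering the path so a local package directory shadows the preinstalled one (the directory name is an assumption for illustration, not an official Glue path):

```python
import sys

def prefer_local_packages(path_entries, wheel_dir):
    """Return a copy of a sys.path-like list with wheel_dir moved to the
    front, so packages extracted there shadow preinstalled ones.
    This models a common workaround, not an official Glue feature."""
    reordered = [p for p in path_entries if p != wheel_dir]
    return [wheel_dir] + reordered
```

In a Glue script one might apply the same idea with `sys.path.insert(0, "/path/to/extracted/wheel")` before the first `import boto3`; the exact extraction path varies by Glue version and is an assumption here.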
290,979
8,915,692,927
IssuesEvent
2019-01-19 08:53:39
UBC-Thunderbots/Software
https://api.github.com/repos/UBC-Thunderbots/Software
opened
Normalize the timestamping system
priority: high type: enhancement type: maintenance
**Is your feature request or enhancement related to a problem? What specifically could be better? Please describe.** After the change in #179 we now have several timestamping systems in use. We should normalize our usage of timestamps to a single system. Depends on #227 since managing timestamps will be easier after the nodes are callback-based. **Describe the solution you'd like** We should typedef `double` to `Timestamp`, and use this as the timestamp standard. Remove the old AITimestamp class (that wrapped std::chrono), and remove our use of std::chrono. **Describe alternatives you've considered** Using a different timestamping system like std::chrono. We should use the t_capture values so everything is as close to the "real world" timestamp as possible. **Additional context** None
1.0
non_test
0
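The single-timestamp convention proposed in the issue above (one scalar type standing in for `typedef double Timestamp`, derived from the t_capture values) can be sketched like this; the helper name and the (seconds, nanoseconds) input shape are illustrative assumptions, not the project's actual API:

```python
# One scalar type for all timestamps, mirroring "typedef double Timestamp":
Timestamp = float  # seconds since epoch

def to_timestamp(seconds: int, nanoseconds: int) -> Timestamp:
    """Collapse a chrono-style (seconds, nanoseconds) pair into the
    single Timestamp representation used everywhere else."""
    return Timestamp(seconds + nanoseconds / 1_000_000_000)
```

Normalizing at the boundary like this keeps every downstream consumer working with a single numeric representation instead of mixing wrapper classes and std::chrono durations.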
217,596
16,722,513,818
IssuesEvent
2021-06-10 09:01:52
apache/skywalking
https://api.github.com/repos/apache/skywalking
closed
Is this an error example in the document?
bug documentation good first issue
**When I follow the example from document to add the OAL to the /config/oal/core.oal** ![image](https://user-images.githubusercontent.com/34833891/121459005-098b1f80-c9dd-11eb-8cab-a54f12cfa6d0.png) ``` 2021-06-09 22:32:37,500 - org.apache.skywalking.oal.rt.OALRuntime - 420 [main] ERROR [] - Can't generate method doEndpointAbnormal for EndpointDispatcher. javassist.CannotCompileException: [source error] syntax error near "rce.get*()) ); org" at javassist.CtNewMethod.make(CtNewMethod.java:84) ~[javassist-3.25.0-GA.jar:?] at javassist.CtNewMethod.make(CtNewMethod.java:50) ~[javassist-3.25.0-GA.jar:?] at org.apache.skywalking.oal.rt.OALRuntime.generateDispatcherClass(OALRuntime.java:418) [oal-rt-8.5.0.jar:8.5.0] at org.apache.skywalking.oal.rt.OALRuntime.generateClassAtRuntime(OALRuntime.java:197) [oal-rt-8.5.0.jar:8.5.0] at org.apache.skywalking.oal.rt.OALRuntime.start(OALRuntime.java:166) [oal-rt-8.5.0.jar:8.5.0] at org.apache.skywalking.oap.server.core.oal.rt.OALEngineLoaderService.load(OALEngineLoaderService.java:65) [server-core-8.5.0.jar:8.5.0] at org.apache.skywalking.oap.server.analyzer.provider.AnalyzerModuleProvider.start(AnalyzerModuleProvider.java:116) [agent-analyzer-8.5.0.jar:8.5.0] at org.apache.skywalking.oap.server.library.module.BootstrapFlow.start(BootstrapFlow.java:49) [library-module-8.5.0.jar:8.5.0] at org.apache.skywalking.oap.server.library.module.ModuleManager.init(ModuleManager.java:60) [library-module-8.5.0.jar:8.5.0] at org.apache.skywalking.oap.server.starter.OAPServerBootstrap.start(OAPServerBootstrap.java:43) [server-bootstrap-8.5.0.jar:8.5.0] at org.apache.skywalking.oap.server.starter.OAPServerStartUp.main(OAPServerStartUp.java:26) [server-starter-8.5.0.jar:8.5.0] Caused by: javassist.compiler.SyntaxError: syntax error near "rce.get*()) ); org" at javassist.compiler.Parser.parsePrimaryExpr(Parser.java:1268) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parsePostfix(Parser.java:1045) ~[javassist-3.25.0-GA.jar:?] 
at javassist.compiler.Parser.parseUnaryExpr(Parser.java:900) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseBinaryExpr(Parser.java:790) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseConditionalExpr(Parser.java:735) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseExpression(Parser.java:715) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parsePrimaryExpr(Parser.java:1257) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parsePostfix(Parser.java:1045) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseCast(Parser.java:933) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseUnaryExpr(Parser.java:898) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.binaryExpr2(Parser.java:821) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseBinaryExpr(Parser.java:796) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseConditionalExpr(Parser.java:735) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseExpression(Parser.java:715) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parsePrimaryExpr(Parser.java:1257) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parsePostfix(Parser.java:1045) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseCast(Parser.java:933) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseUnaryExpr(Parser.java:898) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseCast(Parser.java:921) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseUnaryExpr(Parser.java:898) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseBinaryExpr(Parser.java:790) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseConditionalExpr(Parser.java:735) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseExpression(Parser.java:715) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseArgumentList(Parser.java:1340) ~[javassist-3.25.0-GA.jar:?] 
at javassist.compiler.Parser.parseMethodCall(Parser.java:1192) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parsePostfix(Parser.java:1050) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseUnaryExpr(Parser.java:900) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseBinaryExpr(Parser.java:790) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseConditionalExpr(Parser.java:735) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseExpression(Parser.java:715) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseDeclarationOrExpression(Parser.java:608) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseStatement(Parser.java:295) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseBlock(Parser.java:307) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Parser.parseMethod2(Parser.java:172) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Javac.compileMethod(Javac.java:156) ~[javassist-3.25.0-GA.jar:?] at javassist.compiler.Javac.compile(Javac.java:102) ~[javassist-3.25.0-GA.jar:?] at javassist.CtNewMethod.make(CtNewMethod.java:79) ~[javassist-3.25.0-GA.jar:?] ... 
10 more 2021-06-09 22:32:37,504 - org.apache.skywalking.oal.rt.OALRuntime - 424 [main] ERROR [] - Method body as following private void doEndpointAbnormal(org.apache.skywalking.oap.server.core.source.Endpoint source) { org.apache.skywalking.oap.server.core.source.oal.rt.metrics.EndpointAbnormalMetrics metrics = new org.apache.skywalking.oap.server.core.source.oal.rt.metrics.EndpointAbnormalMetrics(); if (!new org.apache.skywalking.oap.server.core.analysis.metrics.expression.InMatch().match(source.getResponseCode(), new long[]{404,500,503})) { return; } metrics.setTimeBucket(source.getTimeBucket()); metrics.setEntityId(source.getEntityId()); metrics.setServiceId(source.getServiceId()); metrics.combine( (long)(source.get*()) ); org.apache.skywalking.oap.server.core.analysis.worker.MetricsStreamProcessor.getInstance().in(metrics); } ``` After I modified the `sum` to `count`, it works well. ### Question Is this an error example in the document? I use skywalking 8.5.0, centos 7 and jdk8
1.0
non_test
0
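The root cause in the skywalking report above is visible in the generated method body: `sum` must read a named source field, so with no argument the generator emitted `source.get*()`, which cannot compile; `count` takes no field, so the rule compiles. A small Python model of the fixed aggregation (field names are illustrative, not SkyWalking's internal types):

```python
def endpoint_abnormal_count(sources, abnormal_codes=(404, 500, 503)):
    """Model of the OAL rule after the fix: filter Endpoint sources on
    responseCode and count the matches. count() needs no source
    attribute, which is why it works where an argument-less sum() broke
    code generation."""
    return sum(1 for s in sources if s["responseCode"] in abnormal_codes)
```

So the documented example is indeed wrong for this rule unless `sum` is given a numeric field to aggregate; `count` is the correct aggregation when only the number of matching requests is wanted.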
450,295
13,001,602,447
IssuesEvent
2020-07-24 00:15:06
googleapis/java-functions
https://api.github.com/repos/googleapis/java-functions
closed
Synthesis failed for java-functions
autosynth failure priority: p1 type: bug
Hello! Autosynth couldn't regenerate java-functions. :broken_heart: Here's the output from running `synth.py`: ``` ttp.bzl:296:16): - <builtin> - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_google_api_gax_java/repositories.bzl:122:5 - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_google_api_gax_java/repositories.bzl:60:5 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:160:1 DEBUG: Rule 'com_google_api_codegen' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "90dcc26d0b2e3e7955b7eece586af512e9a53fdda47636e08a52ca6deaf22ba3" DEBUG: Call stack for the definition of repository 'com_google_api_codegen' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:77:1 DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4b714b35ee04ba90f560ee60e64c7357428efcb6b0f3a298f343f8ec2c6d4a5d" DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:177:1 DEBUG: Rule 'protoc_docs_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "33b387245455775e0de45869c7355cc5a9e98b396a6fc43b02812a63b75fee20" DEBUG: Call stack for the definition of repository 'protoc_docs_plugin' which is a http_archive (rule definition at 
/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:201:1 DEBUG: Rule 'gapic_generator_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "0539184b227155553d6cc9bc57450b5522d3350521aa5b53f491b4227d0c765b" DEBUG: Call stack for the definition of repository 'gapic_generator_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:224:1 DEBUG: Rule 'gapic_generator_typescript' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "fd6b7ed0c2d0239a4b9192bc8f50c2a3acd2b72ca6faa0a0b039186a48617769" DEBUG: Call stack for the definition of repository 'gapic_generator_typescript' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:280:1 DEBUG: Rule 'gapic_generator_csharp' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "40ddae63d2729ef5ccbd8b60123327ea200ce9400d0629238193ff530dcaea18" DEBUG: Call stack for the definition of repository 'gapic_generator_csharp' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:323:1 ERROR: Skipping '//google/cloud/functions/v1:google-cloud-functions-v1-java': no such target '//google/cloud/functions/v1:google-cloud-functions-v1-java': target 'google-cloud-functions-v1-java' 
not declared in package 'google/cloud/functions/v1' (did you mean 'google-cloud-functions-v1-php'?) defined by /home/kbuilder/.cache/synthtool/googleapis/google/cloud/functions/v1/BUILD.bazel WARNING: Target pattern parsing failed. ERROR: no such target '//google/cloud/functions/v1:google-cloud-functions-v1-java': target 'google-cloud-functions-v1-java' not declared in package 'google/cloud/functions/v1' (did you mean 'google-cloud-functions-v1-php'?) defined by /home/kbuilder/.cache/synthtool/googleapis/google/cloud/functions/v1/BUILD.bazel INFO: Elapsed time: 1.804s INFO: 0 processes. FAILED: Build did NOT complete successfully (0 packages loaded) FAILED: Build did NOT complete successfully (0 packages loaded) 2020-07-20 17:23:49,393 synthtool [DEBUG] > Wrote metadata to synth.metadata. Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module> main() File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File 
"/home/kbuilder/.cache/synthtool/java-functions/synth.py", line 29, in <module> bazel_target=f'//google/cloud/{service}/{version}:google-cloud-{service}-{version}-java', File "/tmpfs/src/github/synthtool/synthtool/languages/java.py", line 310, in bazel_library library = gapic.java_library(service=service, version=version, **kwargs) File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 62, in java_library service, version, "java", tar_strip_components=0, **kwargs File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 183, in _generate_code shell.run(bazel_run_args) File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run raise exc File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run encoding="utf-8", File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/functions/v1:google-cloud-functions-v1-java']' returned non-zero exit status 1. 
2020-07-20 17:23:49,430 autosynth [ERROR] > Synthesis failed 2020-07-20 17:23:49,431 autosynth [DEBUG] > Running: git reset --hard HEAD HEAD is now at 8b4bef7 feat: initial code generation 2020-07-20 17:23:49,436 autosynth [DEBUG] > Running: git checkout autosynth Switched to branch 'autosynth' 2020-07-20 17:23:49,440 autosynth [DEBUG] > Running: git clean -fdx Removing __pycache__/ Traceback (most recent call last): File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 367, in synthesize_loop synthesize_inner_loop(fork, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 411, in synthesize_inner_loop synthesizer, len(toolbox.versions) - 1 File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1. 
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 657, in <module> main() File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 506, in main return _inner_main(temp_dir) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 637, in _inner_main commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 375, in synthesize_loop synthesize_loop_single_pr(toolbox, change_pusher, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 401, in synthesize_loop_single_pr synthesize_inner_loop(toolbox, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 411, in synthesize_inner_loop synthesizer, len(toolbox.versions) - 1 File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1. ``` Google internal developers can see the full log [here](http://sponge2/results/invocations/05a24090-a619-4aa7-b499-33d7cfac411c/targets/github%2Fsynthtool;config=default/tests;query=java-functions;failed=false).
1.0
Synthesis failed for java-functions - Hello! Autosynth couldn't regenerate java-functions. :broken_heart: Here's the output from running `synth.py`: ``` ttp.bzl:296:16): - <builtin> - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_google_api_gax_java/repositories.bzl:122:5 - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_google_api_gax_java/repositories.bzl:60:5 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:160:1 DEBUG: Rule 'com_google_api_codegen' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "90dcc26d0b2e3e7955b7eece586af512e9a53fdda47636e08a52ca6deaf22ba3" DEBUG: Call stack for the definition of repository 'com_google_api_codegen' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:77:1 DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4b714b35ee04ba90f560ee60e64c7357428efcb6b0f3a298f343f8ec2c6d4a5d" DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:177:1 DEBUG: Rule 'protoc_docs_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "33b387245455775e0de45869c7355cc5a9e98b396a6fc43b02812a63b75fee20" DEBUG: Call stack for the definition of repository 'protoc_docs_plugin' which is a http_archive (rule definition at 
/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:201:1 DEBUG: Rule 'gapic_generator_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "0539184b227155553d6cc9bc57450b5522d3350521aa5b53f491b4227d0c765b" DEBUG: Call stack for the definition of repository 'gapic_generator_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:224:1 DEBUG: Rule 'gapic_generator_typescript' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "fd6b7ed0c2d0239a4b9192bc8f50c2a3acd2b72ca6faa0a0b039186a48617769" DEBUG: Call stack for the definition of repository 'gapic_generator_typescript' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:280:1 DEBUG: Rule 'gapic_generator_csharp' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "40ddae63d2729ef5ccbd8b60123327ea200ce9400d0629238193ff530dcaea18" DEBUG: Call stack for the definition of repository 'gapic_generator_csharp' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16): - <builtin> - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:323:1 ERROR: Skipping '//google/cloud/functions/v1:google-cloud-functions-v1-java': no such target '//google/cloud/functions/v1:google-cloud-functions-v1-java': target 'google-cloud-functions-v1-java' 
not declared in package 'google/cloud/functions/v1' (did you mean 'google-cloud-functions-v1-php'?) defined by /home/kbuilder/.cache/synthtool/googleapis/google/cloud/functions/v1/BUILD.bazel WARNING: Target pattern parsing failed. ERROR: no such target '//google/cloud/functions/v1:google-cloud-functions-v1-java': target 'google-cloud-functions-v1-java' not declared in package 'google/cloud/functions/v1' (did you mean 'google-cloud-functions-v1-php'?) defined by /home/kbuilder/.cache/synthtool/googleapis/google/cloud/functions/v1/BUILD.bazel INFO: Elapsed time: 1.804s INFO: 0 processes. FAILED: Build did NOT complete successfully (0 packages loaded) FAILED: Build did NOT complete successfully (0 packages loaded) 2020-07-20 17:23:49,393 synthtool [DEBUG] > Wrote metadata to synth.metadata. Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module> main() File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File 
"/home/kbuilder/.cache/synthtool/java-functions/synth.py", line 29, in <module> bazel_target=f'//google/cloud/{service}/{version}:google-cloud-{service}-{version}-java', File "/tmpfs/src/github/synthtool/synthtool/languages/java.py", line 310, in bazel_library library = gapic.java_library(service=service, version=version, **kwargs) File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 62, in java_library service, version, "java", tar_strip_components=0, **kwargs File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 183, in _generate_code shell.run(bazel_run_args) File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run raise exc File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run encoding="utf-8", File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/functions/v1:google-cloud-functions-v1-java']' returned non-zero exit status 1. 
2020-07-20 17:23:49,430 autosynth [ERROR] > Synthesis failed 2020-07-20 17:23:49,431 autosynth [DEBUG] > Running: git reset --hard HEAD HEAD is now at 8b4bef7 feat: initial code generation 2020-07-20 17:23:49,436 autosynth [DEBUG] > Running: git checkout autosynth Switched to branch 'autosynth' 2020-07-20 17:23:49,440 autosynth [DEBUG] > Running: git clean -fdx Removing __pycache__/ Traceback (most recent call last): File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 367, in synthesize_loop synthesize_inner_loop(fork, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 411, in synthesize_inner_loop synthesizer, len(toolbox.versions) - 1 File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1. 
During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 657, in <module> main() File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 506, in main return _inner_main(temp_dir) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 637, in _inner_main commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 375, in synthesize_loop synthesize_loop_single_pr(toolbox, change_pusher, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 401, in synthesize_loop_single_pr synthesize_inner_loop(toolbox, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 411, in synthesize_inner_loop synthesizer, len(toolbox.versions) - 1 File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1. ``` Google internal developers can see the full log [here](http://sponge2/results/invocations/05a24090-a619-4aa7-b499-33d7cfac411c/targets/github%2Fsynthtool;config=default/tests;query=java-functions;failed=false).
non_test
synthesis failed for java functions hello autosynth couldn t regenerate java functions broken heart here s the output from running synth py ttp bzl home kbuilder cache bazel bazel kbuilder external com google api gax java repositories bzl home kbuilder cache bazel bazel kbuilder external com google api gax java repositories bzl home kbuilder cache synthtool googleapis workspace debug rule com google api codegen indicated that a canonical reproducible form can be obtained by modifying arguments debug call stack for the definition of repository com google api codegen which is a http archive rule definition at home kbuilder cache bazel bazel kbuilder external bazel tools tools build defs repo http bzl home kbuilder cache synthtool googleapis workspace debug rule com google protoc java resource names plugin indicated that a canonical reproducible form can be obtained by modifying arguments debug call stack for the definition of repository com google protoc java resource names plugin which is a http archive rule definition at home kbuilder cache bazel bazel kbuilder external bazel tools tools build defs repo http bzl home kbuilder cache synthtool googleapis workspace debug rule protoc docs plugin indicated that a canonical reproducible form can be obtained by modifying arguments debug call stack for the definition of repository protoc docs plugin which is a http archive rule definition at home kbuilder cache bazel bazel kbuilder external bazel tools tools build defs repo http bzl home kbuilder cache synthtool googleapis workspace debug rule gapic generator python indicated that a canonical reproducible form can be obtained by modifying arguments debug call stack for the definition of repository gapic generator python which is a http archive rule definition at home kbuilder cache bazel bazel kbuilder external bazel tools tools build defs repo http bzl home kbuilder cache synthtool googleapis workspace debug rule gapic generator typescript indicated that a canonical 
reproducible form can be obtained by modifying arguments debug call stack for the definition of repository gapic generator typescript which is a http archive rule definition at home kbuilder cache bazel bazel kbuilder external bazel tools tools build defs repo http bzl home kbuilder cache synthtool googleapis workspace debug rule gapic generator csharp indicated that a canonical reproducible form can be obtained by modifying arguments debug call stack for the definition of repository gapic generator csharp which is a http archive rule definition at home kbuilder cache bazel bazel kbuilder external bazel tools tools build defs repo http bzl home kbuilder cache synthtool googleapis workspace error skipping google cloud functions google cloud functions java no such target google cloud functions google cloud functions java target google cloud functions java not declared in package google cloud functions did you mean google cloud functions php defined by home kbuilder cache synthtool googleapis google cloud functions build bazel warning target pattern parsing failed error no such target google cloud functions google cloud functions java target google cloud functions java not declared in package google cloud functions did you mean google cloud functions php defined by home kbuilder cache synthtool googleapis google cloud functions build bazel info elapsed time info processes failed build did not complete successfully packages loaded failed build did not complete successfully packages loaded synthtool wrote metadata to synth metadata traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool synthtool main py line in main file tmpfs src github synthtool env lib site packages click core py line in call return self main args kwargs file tmpfs src github synthtool env lib site packages click core py 
line in main rv self invoke ctx file tmpfs src github synthtool env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src github synthtool env lib site packages click core py line in invoke return callback args kwargs file tmpfs src github synthtool synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file home kbuilder cache synthtool java functions synth py line in bazel target f google cloud service version google cloud service version java file tmpfs src github synthtool synthtool languages java py line in bazel library library gapic java library service service version version kwargs file tmpfs src github synthtool synthtool gcp gapic bazel py line in java library service version java tar strip components kwargs file tmpfs src github synthtool synthtool gcp gapic bazel py line in generate code shell run bazel run args file tmpfs src github synthtool synthtool shell py line in run raise exc file tmpfs src github synthtool synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status autosynth synthesis failed autosynth running git reset hard head head is now at feat initial code generation autosynth running git checkout autosynth switched to branch autosynth autosynth running git clean fdx removing pycache traceback most recent call last file tmpfs src github synthtool autosynth synth py line in synthesize loop synthesize inner loop fork synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize inner loop synthesizer len toolbox versions file tmpfs src github synthtool autosynth synth py line in synthesize version in new branch synthesizer synthesize synth log path self environ file tmpfs src github synthtool autosynth synthesizer py line in synthesize synth 
proc check returncode raise an exception file home kbuilder pyenv versions lib subprocess py line in check returncode self stderr subprocess calledprocesserror command returned non zero exit status during handling of the above exception another exception occurred traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool autosynth synth py line in main file tmpfs src github synthtool autosynth synth py line in main return inner main temp dir file tmpfs src github synthtool autosynth synth py line in inner main commit count synthesize loop x multiple prs change pusher synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize loop synthesize loop single pr toolbox change pusher synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize loop single pr synthesize inner loop toolbox synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize inner loop synthesizer len toolbox versions file tmpfs src github synthtool autosynth synth py line in synthesize version in new branch synthesizer synthesize synth log path self environ file tmpfs src github synthtool autosynth synthesizer py line in synthesize synth proc check returncode raise an exception file home kbuilder pyenv versions lib subprocess py line in check returncode self stderr subprocess calledprocesserror command returned non zero exit status google internal developers can see the full log
0
54,751
3,071,177,176
IssuesEvent
2015-08-19 10:18:22
pavel-pimenov/flylinkdc-r5xx
https://api.github.com/repos/pavel-pimenov/flylinkdc-r5xx
closed
Bug in how the client's uptime is displayed in the tray
bug imported Priority-Medium
_From [kirill.B...@gmail.com](https://code.google.com/u/118374335061098442652/) on May 12, 2010 17:22:46_ FlylinkDC version r500a48. When hovering the cursor over the FlylinkDC tray icon, the uptime is displayed with extra characters. The bug was observed on: Windows XP x64, Windows 7 x64. It is intermittent, i.e. it does not always appear. **Attachment:** [r500a48_Time_Bag_1.jpg r500a48_Time_Bag_2.jpg](http://code.google.com/p/flylinkdc/issues/detail?id=90) _Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=90_
1.0
Bug in how the client's uptime is displayed in the tray - _From [kirill.B...@gmail.com](https://code.google.com/u/118374335061098442652/) on May 12, 2010 17:22:46_ FlylinkDC version r500a48. When hovering the cursor over the FlylinkDC tray icon, the uptime is displayed with extra characters. The bug was observed on: Windows XP x64, Windows 7 x64. It is intermittent, i.e. it does not always appear. **Attachment:** [r500a48_Time_Bag_1.jpg r500a48_Time_Bag_2.jpg](http://code.google.com/p/flylinkdc/issues/detail?id=90) _Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=90_
non_test
bug in how the client s uptime is displayed in the tray from on may flylinkdc version when hovering the cursor over the flylinkdc tray icon the uptime is displayed with extra characters the bug was observed on os windows xp windows it is intermittent i e it does not always appear attachment original issue
0
242,407
20,246,618,599
IssuesEvent
2022-02-14 14:18:36
rspott/WAF-test02
https://api.github.com/repos/rspott/WAF-test02
opened
Adaptive network hardening recommendations should be applied on internet facing virtual machines for 1 Virtual machine(s)
WARP-Import test1 Security Azure Advisor
<a href="https://aka.ms/azure-advisor-portal">Adaptive network hardening recommendations should be applied on internet facing virtual machines for 1 Virtual machine(s)</a>
1.0
Adaptive network hardening recommendations should be applied on internet facing virtual machines for 1 Virtual machine(s) - <a href="https://aka.ms/azure-advisor-portal">Adaptive network hardening recommendations should be applied on internet facing virtual machines for 1 Virtual machine(s)</a>
test
adaptive network hardening recommendations should be applied on internet facing virtual machines for virtual machine s
1
162,625
12,683,405,928
IssuesEvent
2020-06-19 19:40:44
GaloisInc/cryptol
https://api.github.com/repos/GaloisInc/cryptol
closed
Solver configuration in CI
testing
Right now the cryptol test suite appears only to exercise symbolic solving via Z3. It would be nice to test other solvers in the test suite (e.g., CVC4, yices). Currently the travis build for cryptol just `curl`s a pretty old copy of Z3 from `saw.galois.com`, which seems suboptimal. I believe `fryingpan` builds can install other solvers, but I don't know how to set that up. @kquick, any thoughts on the best way to handle this?
1.0
Solver configuration in CI - Right now the cryptol test suite appears only to exercise symbolic solving via Z3. It would be nice to test other solvers in the test suite (e.g., CVC4, yices). Currently the travis build for cryptol just `curl`s a pretty old copy of Z3 from `saw.galois.com`, which seems suboptimal. I believe `fryingpan` builds can install other solvers, but I don't know how to set that up. @kquick, any thoughts on the best way to handle this?
test
solver configuration in ci right now the cryptol test suite appears only to exercise symbolic solving via it would be nice to test other solvers in the test suite e g yices currently the travis build for cryptol just curl s a pretty old copy of from saw galois com which seems suboptimal i believe fryingpan builds can install other solvers but i don t know how to set that up kquick any thoughts on the best way to handle this
1
227,965
18,143,805,785
IssuesEvent
2021-09-25 04:01:29
hyphacoop/cosmos-organizing
https://api.github.com/repos/hyphacoop/cosmos-organizing
opened
Create shell script that reads a config file to modify a genesis.json file
testnet-work
- chain config - old chain id and what to modify it to - validators (can be many) - old address and new address - modify delegation power - modify governance parameters
1.0
Create shell script that reads a config file to modify a genesis.json file - - chain config - old chain id and what to modify it to - validators (can be many) - old address and new address - modify delegation power - modify governance parameters
test
create shell script that reads a config file to modify a genesis json file chain config old chain id and what to modify it to validators can be many old address and new address modify delegation power modify governance parameters
1
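The cosmos-organizing record above asks for a script that reads a config file and modifies a genesis.json. A minimal Python sketch of that idea, under stated assumptions: the `config` dict stands in for the config file, the `genesis` dict stands in for `json.load`-ing the real file, and the field names (`chain_id`, `app_state.gov.voting_params.voting_period`) are illustrative — the actual genesis layout for the target chain may differ.

```python
import json

# Hypothetical override config; in practice this would be read from a file.
config = {
    "chain_id": "local-testnet-1",
    "voting_period": "120s",
}

# Stand-in for json.load(open("genesis.json")); field names are assumed.
genesis = {
    "chain_id": "cosmoshub-4",
    "app_state": {"gov": {"voting_params": {"voting_period": "1209600s"}}},
}

# Apply the overrides in place.
genesis["chain_id"] = config["chain_id"]
genesis["app_state"]["gov"]["voting_params"]["voting_period"] = config["voting_period"]

# Serialize back out (to stdout here instead of rewriting the file).
print(json.dumps(genesis, indent=2))
```

The same pattern extends to the other checklist items (swapping validator addresses, adjusting delegation power) by adding more keys to `config` and more in-place assignments.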
225,184
17,796,862,842
IssuesEvent
2021-08-31 23:58:07
pandas-dev/pandas
https://api.github.com/repos/pandas-dev/pandas
closed
SeriesGroupBy.first / last loses categorical dtype
Groupby Dtype Conversions Regression Categorical good first issue Needs Tests
On 1.0.3 and master: ```python import pandas as pd df = pd.DataFrame({"a": [1, 2, 3]}) df["b"] = df["a"].astype("category") print(df.groupby("a")["b"].first()) print(df.groupby("a")["b"].last()) ``` gives ```python a 1 1 2 2 3 3 Name: b, dtype: int64 a 1 1 2 2 3 3 Name: b, dtype: int64 ``` but the dtype should still be categorical and not int64. This seemingly wrong output is explicitly tested for here: https://github.com/pandas-dev/pandas/blob/master/pandas/tests/groupby/aggregate/test_aggregate.py#L461
1.0
SeriesGroupBy.first / last loses categorical dtype - On 1.0.3 and master: ```python import pandas as pd df = pd.DataFrame({"a": [1, 2, 3]}) df["b"] = df["a"].astype("category") print(df.groupby("a")["b"].first()) print(df.groupby("a")["b"].last()) ``` gives ```python a 1 1 2 2 3 3 Name: b, dtype: int64 a 1 1 2 2 3 3 Name: b, dtype: int64 ``` but the dtype should still be categorical and not int64. This seemingly wrong output is explicitly tested for here: https://github.com/pandas-dev/pandas/blob/master/pandas/tests/groupby/aggregate/test_aggregate.py#L461
test
seriesgroupby first last loses categorical dtype on and master python import pandas as pd df pd dataframe a df df astype category print df groupby a first print df groupby a last gives python a name b dtype a name b dtype but the dtype should still be categorical and not this seemingly wrong output is explicitly tested for here
1
352,438
32,068,192,619
IssuesEvent
2023-09-25 05:51:51
lewisKendall-Jones/azure-sdk-for-java-jdk8-httpurlconnection-client
https://api.github.com/repos/lewisKendall-Jones/azure-sdk-for-java-jdk8-httpurlconnection-client
closed
Reinstate removed tests
Test
During the test migration, some of the tests were removed due to incompatible features. These tests will need to be reinstated as we are now using those features (options)
1.0
Reinstate removed tests - During the test migration, some of the tests were removed due to incompatible features. These tests will need to be reinstated as we are now using those features (options)
test
reinstate removed tests during the test migration some of the tests were removed due to incompatible features these tests will need to be reinstated as we are now using those features options
1
671,937
22,781,488,275
IssuesEvent
2022-07-08 20:21:43
COS301-SE-2022/Training-Buddy
https://api.github.com/repos/COS301-SE-2022/Training-Buddy
closed
Data Access: Activity Schedules
priority:medium scope:db
- [ ] create activity schedule - [ ] get all scheduled activities - [ ] create an invitation - [ ] view outgoing invitations - [ ] view incoming invitations - [ ] accept/reject invitation - [ ] mark activity as complete
1.0
Data Access: Activity Schedules - - [ ] create activity schedule - [ ] get all scheduled activities - [ ] create an invitation - [ ] view outgoing invitations - [ ] view incoming invitations - [ ] accept/reject invitation - [ ] mark activity as complete
non_test
data access activity schedules create activity schedule get all scheduled activities create an invitation view outgoing invitations view incoming invitations accept reject invitation mark activity as complete
0
324,765
9,912,239,529
IssuesEvent
2019-06-28 08:30:22
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
mail.ru - see bug description
browser-firefox engine-gecko priority-critical
<!-- @browser: Firefox 66.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:66.0) Gecko/20100101 Firefox/66.0 --> <!-- @reported_with: desktop-reporter --> **URL**: https://mail.ru/ **Browser / Version**: Firefox 66.0 **Operating System**: Windows 10 **Tested Another Browser**: No **Problem type**: Something else **Description**: redirecting problem **Steps to Reproduce**: [![Screenshot Description](https://webcompat.com/uploads/2019/6/4afc42a4-7a06-40be-b107-680848175aba-thumb.jpeg)](https://webcompat.com/uploads/2019/6/4afc42a4-7a06-40be-b107-680848175aba.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20190204181317</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li> </ul> <p>Console Messages:</p> <pre> [u'[JavaScript Warning: "Content Security Policy: Directive child-src has been deprecated. Please use directive worker-src to control workers, or directive frame-src to control frames respectively."]', u'[console.time(headline.inline.js) https://img.imgsmail.ru/ph/0.58.22/inline.js:168:92]', u'[console.timeEnd(headline.inline.js) https://img.imgsmail.ru/ph/0.58.22/inline.js:224:398]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.tns-counter.ru/V13a***R%3E*mail_ru/ru/UTF-8/tmsec=mail_main/630649319 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "The script from https://top-fwz1.mail.ru/js/code.js was loaded even though its MIME type (text/plain) is not a valid JavaScript MIME type." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://saltcdn2.googleapis.com/loader.js." {file: "https://mail.ru/" line: 1}]', u'[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at https://saltcdn2.googleapis.com/log.html (frame-src)."]', u'[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at https://saltcdn2.googleapis.com/loader.js (script-src)."]', u'[JavaScript Warning: "Request to access cookie or storage on https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[console.time(headline.external.js) https://img.imgsmail.ru/ph/0.58.22/external.min.js:25:346]', u'[console.timeEnd(headline.external.js) https://img.imgsmail.ru/ph/0.58.22/external.min.js:231:303]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.tns-counter.ru/V13a***R%3E*mail_ru/ru/UTF-8/tmsec=mail_main/630649319 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://adservice.google.com.bd/adsid/integrator.js?domain=ad.mail.ru was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://adservice.google.com/adsid/integrator.js?domain=ad.mail.ru was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://pagead2.googlesyndication.com/pagead/js/r20190624/r20190131/show_ads_impl.js was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://pagead2.googlesyndication.com/pub-config/r20160913/ca-pub-4831681952934476.js was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/html/r20190624/r20190131/zrt_lookup.html# was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/html/r20190624/r20190131/zrt_lookup.html# was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.tns-counter.ru/V13b***R%3E*mail_ru/ru/UTF-8/tmsec=mail_main/630649319 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?q;r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/ads?client=ca-pub-4831681952934476&output=html&h=250&slotname=7515917252&adk=3858467216&adf=3279755397&w=300&guci=2.2.0.0.2.2.0.0&format=300x250&url=mail.ru&ea=0&flash=0&avail_w=300&wgl=1&dt=1561710141886&bpp=14&bdt=2892&fdt=1055&idt=1055&shv=r20190624&cbv=r20190131&saldr=aa&correlator=8772594697654&frm=24&ife=1&pv=2&ga_vid=715693380.1561710143&ga_sid=1561710143&ga_hid=1976816211&ga_fc=0&icsg=682&nhd=1&dssz=6&mdo=0&mso=0&u_tz=360&u_his=2&u_java=0&u_h=768&u_w=1366&u_ah=728&u_aw=1366&u_cd=24&u_nplug=0&u_nmime=0&adx=0&ady=0&biw=-12245933&bih=-12245933&isw=300&ish=250&ifk=705312705&scr_x=-12245933&scr_y=-12245933&eid=21060853%2C151527001%2C151527201%2C229739147%2C229739149%2C633794004%2C633794005&oid=3&loc=https%3A%2F%2Fmail.ru%2F&rx=0&eae=2&brdim=-8%2C-8%2C-8%2C-8%2C1366%2C0%2C1382%2C744%2C300%2C250&vis=1&rsz=%7C%7CcE%7C&abl=NS&pfx=0&fu=16&bc=29&ifi=1&uci=1.716n8wm4amc5&dtd=1081 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.googletagservices.com/activeview/js/current/osd.js?cb=%2Fr20100101 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?q;r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/ads?client=ca-pub-4831681952934476&output=html&h=250&slotname=7515917252&adk=3858467216&adf=3279755397&w=300&guci=2.2.0.0.2.2.0.0&format=300x250&url=mail.ru&ea=0&flash=0&avail_w=300&wgl=1&dt=1561710141886&bpp=14&bdt=2892&fdt=1055&idt=1055&shv=r20190624&cbv=r20190131&saldr=aa&correlator=8772594697654&frm=24&ife=1&pv=2&ga_vid=715693380.1561710143&ga_sid=1561710143&ga_hid=1976816211&ga_fc=0&icsg=682&nhd=1&dssz=6&mdo=0&mso=0&u_tz=360&u_his=2&u_java=0&u_h=768&u_w=1366&u_ah=728&u_aw=1366&u_cd=24&u_nplug=0&u_nmime=0&adx=0&ady=0&biw=-12245933&bih=-12245933&isw=300&ish=250&ifk=705312705&scr_x=-12245933&scr_y=-12245933&eid=21060853%2C151527001%2C151527201%2C229739147%2C229739149%2C633794004%2C633794005&oid=3&loc=https%3A%2F%2Fmail.ru%2F&rx=0&eae=2&brdim=-8%2C-8%2C-8%2C-8%2C1366%2C0%2C1382%2C744%2C300%2C250&vis=1&rsz=%7C%7CcE%7C&abl=NS&pfx=0&fu=16&bc=29&ifi=1&uci=1.716n8wm4amc5&dtd=1081 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
mail.ru - see bug description - <!-- @browser: Firefox 66.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:66.0) Gecko/20100101 Firefox/66.0 --> <!-- @reported_with: desktop-reporter --> **URL**: https://mail.ru/ **Browser / Version**: Firefox 66.0 **Operating System**: Windows 10 **Tested Another Browser**: No **Problem type**: Something else **Description**: redirecting problem **Steps to Reproduce**: [![Screenshot Description](https://webcompat.com/uploads/2019/6/4afc42a4-7a06-40be-b107-680848175aba-thumb.jpeg)](https://webcompat.com/uploads/2019/6/4afc42a4-7a06-40be-b107-680848175aba.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20190204181317</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li> </ul> <p>Console Messages:</p> <pre> [u'[JavaScript Warning: "Content Security Policy: Directive child-src has been deprecated. Please use directive worker-src to control workers, or directive frame-src to control frames respectively."]', u'[console.time(headline.inline.js) https://img.imgsmail.ru/ph/0.58.22/inline.js:168:92]', u'[console.timeEnd(headline.inline.js) https://img.imgsmail.ru/ph/0.58.22/inline.js:224:398]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.tns-counter.ru/V13a***R%3E*mail_ru/ru/UTF-8/tmsec=mail_main/630649319 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "The script from https://top-fwz1.mail.ru/js/code.js was loaded even though its MIME type (text/plain) is not a valid JavaScript MIME type." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://saltcdn2.googleapis.com/loader.js." {file: "https://mail.ru/" line: 1}]', u'[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at https://saltcdn2.googleapis.com/log.html (frame-src)."]', u'[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at https://saltcdn2.googleapis.com/loader.js (script-src)."]', u'[JavaScript Warning: "Request to access cookie or storage on https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[console.time(headline.external.js) https://img.imgsmail.ru/ph/0.58.22/external.min.js:25:346]', u'[console.timeEnd(headline.external.js) https://img.imgsmail.ru/ph/0.58.22/external.min.js:231:303]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.tns-counter.ru/V13a***R%3E*mail_ru/ru/UTF-8/tmsec=mail_main/630649319 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://adservice.google.com.bd/adsid/integrator.js?domain=ad.mail.ru was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://adservice.google.com/adsid/integrator.js?domain=ad.mail.ru was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://pagead2.googlesyndication.com/pagead/js/r20190624/r20190131/show_ads_impl.js was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://pagead2.googlesyndication.com/pub-config/r20160913/ca-pub-4831681952934476.js was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/html/r20190624/r20190131/zrt_lookup.html# was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/html/r20190624/r20190131/zrt_lookup.html# was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.tns-counter.ru/V13b***R%3E*mail_ru/ru/UTF-8/tmsec=mail_main/630649319 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?q;r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/ads?client=ca-pub-4831681952934476&output=html&h=250&slotname=7515917252&adk=3858467216&adf=3279755397&w=300&guci=2.2.0.0.2.2.0.0&format=300x250&url=mail.ru&ea=0&flash=0&avail_w=300&wgl=1&dt=1561710141886&bpp=14&bdt=2892&fdt=1055&idt=1055&shv=r20190624&cbv=r20190131&saldr=aa&correlator=8772594697654&frm=24&ife=1&pv=2&ga_vid=715693380.1561710143&ga_sid=1561710143&ga_hid=1976816211&ga_fc=0&icsg=682&nhd=1&dssz=6&mdo=0&mso=0&u_tz=360&u_his=2&u_java=0&u_h=768&u_w=1366&u_ah=728&u_aw=1366&u_cd=24&u_nplug=0&u_nmime=0&adx=0&ady=0&biw=-12245933&bih=-12245933&isw=300&ish=250&ifk=705312705&scr_x=-12245933&scr_y=-12245933&eid=21060853%2C151527001%2C151527201%2C229739147%2C229739149%2C633794004%2C633794005&oid=3&loc=https%3A%2F%2Fmail.ru%2F&rx=0&eae=2&brdim=-8%2C-8%2C-8%2C-8%2C1366%2C0%2C1382%2C744%2C300%2C250&vis=1&rsz=%7C%7CcE%7C&abl=NS&pfx=0&fu=16&bc=29&ifi=1&uci=1.716n8wm4amc5&dtd=1081 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://www.googletagservices.com/activeview/js/current/osd.js?cb=%2Fr20100101 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://counter.yadro.ru/hit;mail-splash/pc?q;r;s1366*768*24;uhttps%3A//mail.ru/;0.896513189111155 was blocked because it came from a tracker and content blocking is enabled." 
{file: "https://mail.ru/" line: 0}]', u'[JavaScript Warning: "Request to access cookie or storage on https://googleads.g.doubleclick.net/pagead/ads?client=ca-pub-4831681952934476&output=html&h=250&slotname=7515917252&adk=3858467216&adf=3279755397&w=300&guci=2.2.0.0.2.2.0.0&format=300x250&url=mail.ru&ea=0&flash=0&avail_w=300&wgl=1&dt=1561710141886&bpp=14&bdt=2892&fdt=1055&idt=1055&shv=r20190624&cbv=r20190131&saldr=aa&correlator=8772594697654&frm=24&ife=1&pv=2&ga_vid=715693380.1561710143&ga_sid=1561710143&ga_hid=1976816211&ga_fc=0&icsg=682&nhd=1&dssz=6&mdo=0&mso=0&u_tz=360&u_his=2&u_java=0&u_h=768&u_w=1366&u_ah=728&u_aw=1366&u_cd=24&u_nplug=0&u_nmime=0&adx=0&ady=0&biw=-12245933&bih=-12245933&isw=300&ish=250&ifk=705312705&scr_x=-12245933&scr_y=-12245933&eid=21060853%2C151527001%2C151527201%2C229739147%2C229739149%2C633794004%2C633794005&oid=3&loc=https%3A%2F%2Fmail.ru%2F&rx=0&eae=2&brdim=-8%2C-8%2C-8%2C-8%2C1366%2C0%2C1382%2C744%2C300%2C250&vis=1&rsz=%7C%7CcE%7C&abl=NS&pfx=0&fu=16&bc=29&ifi=1&uci=1.716n8wm4amc5&dtd=1081 was blocked because it came from a tracker and content blocking is enabled." {file: "https://mail.ru/" line: 0}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_test
mail ru see bug description url browser version firefox operating system windows tested another browser no problem type something else description redirecting problem steps to reproduce browser configuration mixed active content blocked false image mem shared true buildid tracking content blocked false gfx webrender blob images true hastouchscreen false mixed passive content blocked false gfx webrender enabled false gfx webrender all false channel beta console messages u u u u u u u u u u u u u u u u u u u u u u u u u from with ❤️
0
351,569
32,010,293,816
IssuesEvent
2023-09-21 17:30:07
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
: failed
C-test-failure O-robot branch-master release-blocker T-testeng
. [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11875053?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11875053?buildTab=artifacts#/) on master @ [15ad9891482d63f602c8296ac5c633d49361e45c](https://github.com/cockroachdb/cockroach/commits/15ad9891482d63f602c8296ac5c633d49361e45c): ``` /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__brace-expansion__1.1.11 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:58771:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__fresh__0.5.2 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:78571:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__mime-types__2.1.35 
instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:97012:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__randexp__0.4.6 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:104312:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__d3-array__3.2.4 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:64686:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__at_types_redux__3.6.0 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:47807:15: in 
npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__chalk__1.1.3 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:60592:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__deep-object-diff__1.1.9 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:65456:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_impor ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/test-eng <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
2.0
: failed - . [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11875053?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/11875053?buildTab=artifacts#/) on master @ [15ad9891482d63f602c8296ac5c633d49361e45c](https://github.com/cockroachdb/cockroach/commits/15ad9891482d63f602c8296ac5c633d49361e45c): ``` /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__brace-expansion__1.1.11 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:58771:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__fresh__0.5.2 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:78571:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository 
npm__mime-types__2.1.35 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:97012:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__randexp__0.4.6 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:104312:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__d3-array__3.2.4 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:64686:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__at_types_redux__3.6.0 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> 
/home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:47807:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__chalk__1.1.3 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:60592:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_import_rule defined at: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:844:34: in <toplevel> INFO: Repository npm__deep-object-diff__1.1.9 instantiated at: /go/src/github.com/cockroachdb/cockroach/WORKSPACE:287:17: in <toplevel> /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/npm/repositories.bzl:65456:15: in npm_repositories /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/external/aspect_rules_js/npm/private/npm_import.bzl:1133:20: in npm_import Repository rule npm_impor ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/test-eng <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
test
failed with on master home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm brace expansion instantiated at go src github com cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm fresh instantiated at go src github com cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm mime types instantiated at go src github com cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm randexp instantiated at go src github com cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm array instantiated at go src github com 
cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm at types redux instantiated at go src github com cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm chalk instantiated at go src github com cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm import rule defined at home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in info repository npm deep object diff instantiated at go src github com cockroachdb cockroach workspace in home roach cache bazel bazel roach external npm repositories bzl in npm repositories home roach cache bazel bazel roach external aspect rules js npm private npm import bzl in npm import repository rule npm impor help see also cc cockroachdb test eng
1
25,782
12,301,595,433
IssuesEvent
2020-05-11 15:38:27
openshift/odo
https://api.github.com/repos/openshift/odo
closed
In a operator installed cluster odo catalog list services stuck while fetching services details
area/catalog area/service-operators kind/bug kind/flake
/kind bug <!-- Welcome! - We kindly ask you to: 1. Fill out the issue template below 2. Use the Google group if you have a question rather than a bug or feature request. The group is at: https://groups.google.com/forum/#!forum/odo-users Thanks for understanding, and for contributing to the project! --> ## What versions of software are you using? **Operating System:** Supported **Output of `odo version`:** master ## How did you run odo exactly? I tried 3 times, in none of try i could see it. This is observed in CI ``` ------------------------------ Running odo with args [odo project set ci-operator-hub-project] [odo] Switched to project : ci-operator-hub-project Running odo with args [odo catalog list services] <------ here it stuck • Failure [1.873 seconds] odo service command tests for OperatorHub /go/src/github.com/openshift/odo/tests/integration/operatorhub/cmd_service_test.go:17 When using dry-run option to create operator backed service /go/src/github.com/openshift/odo/tests/integration/operatorhub/cmd_service_test.go:58 should only output the definition of the CR that will be used to start service [It] /go/src/github.com/openshift/odo/tests/integration/operatorhub/cmd_service_test.go:59 Timed out after 1.009s. Running odo with args [odo catalog list services] Expected process to exit. It did not. /go/src/github.com/openshift/odo/tests/helper/helper_run.go:34 ------------------------------ ``` ## Actual behavior ```odo catalog list services``` stuck ## Expected behavior Either it should throw error or should fetch the details. ## Any logs, error output, etc? https://prow.svc.ci.openshift.org/view/gcs/origin-ci-test/pr-logs/pull/openshift_odo/2986/pull-ci-openshift-odo-master-v4.2-integration-e2e-benchmark/2282#1:build-log.txt%3A592
1.0
In a operator installed cluster odo catalog list services stuck while fetching services details - /kind bug <!-- Welcome! - We kindly ask you to: 1. Fill out the issue template below 2. Use the Google group if you have a question rather than a bug or feature request. The group is at: https://groups.google.com/forum/#!forum/odo-users Thanks for understanding, and for contributing to the project! --> ## What versions of software are you using? **Operating System:** Supported **Output of `odo version`:** master ## How did you run odo exactly? I tried 3 times, in none of try i could see it. This is observed in CI ``` ------------------------------ Running odo with args [odo project set ci-operator-hub-project] [odo] Switched to project : ci-operator-hub-project Running odo with args [odo catalog list services] <------ here it stuck • Failure [1.873 seconds] odo service command tests for OperatorHub /go/src/github.com/openshift/odo/tests/integration/operatorhub/cmd_service_test.go:17 When using dry-run option to create operator backed service /go/src/github.com/openshift/odo/tests/integration/operatorhub/cmd_service_test.go:58 should only output the definition of the CR that will be used to start service [It] /go/src/github.com/openshift/odo/tests/integration/operatorhub/cmd_service_test.go:59 Timed out after 1.009s. Running odo with args [odo catalog list services] Expected process to exit. It did not. /go/src/github.com/openshift/odo/tests/helper/helper_run.go:34 ------------------------------ ``` ## Actual behavior ```odo catalog list services``` stuck ## Expected behavior Either it should throw error or should fetch the details. ## Any logs, error output, etc? https://prow.svc.ci.openshift.org/view/gcs/origin-ci-test/pr-logs/pull/openshift_odo/2986/pull-ci-openshift-odo-master-v4.2-integration-e2e-benchmark/2282#1:build-log.txt%3A592
non_test
in a operator installed cluster odo catalog list services stuck while fetching services details kind bug welcome we kindly ask you to fill out the issue template below use the google group if you have a question rather than a bug or feature request the group is at thanks for understanding and for contributing to the project what versions of software are you using operating system supported output of odo version master how did you run odo exactly i tried times in none of try i could see it this is observed in ci running odo with args switched to project ci operator hub project running odo with args here it stuck • failure odo service command tests for operatorhub go src github com openshift odo tests integration operatorhub cmd service test go when using dry run option to create operator backed service go src github com openshift odo tests integration operatorhub cmd service test go should only output the definition of the cr that will be used to start service go src github com openshift odo tests integration operatorhub cmd service test go timed out after running odo with args expected process to exit it did not go src github com openshift odo tests helper helper run go actual behavior odo catalog list services stuck expected behavior either it should throw error or should fetch the details any logs error output etc
0
7,303
2,610,362,572
IssuesEvent
2015-02-26 19:57:14
chrsmith/scribefire-chrome
https://api.github.com/repos/chrsmith/scribefire-chrome
closed
wordpress.com blogs credentials
auto-migrated Priority-Medium Type-Defect
``` What's the problem? I'm experiencing problems to log in to my WP.com accounts by safe access. Meanwhile log to my WP account is correctly working. What browser are you using? Opera Next / Chrome 20.0.1132.47 m What version of ScribeFire are you running? Opera extension / Chrome extension ``` ----- Original issue reported on code.google.com by `marco.b...@gmail.com` on 2 Jul 2012 at 1:36 Attachments: * [2012-07-02_153401.png](https://storage.googleapis.com/google-code-attachments/scribefire-chrome/issue-638/comment-0/2012-07-02_153401.png)
1.0
wordpress.com blogs credentials - ``` What's the problem? I'm experiencing problems to log in to my WP.com accounts by safe access. Meanwhile log to my WP account is correctly working. What browser are you using? Opera Next / Chrome 20.0.1132.47 m What version of ScribeFire are you running? Opera extension / Chrome extension ``` ----- Original issue reported on code.google.com by `marco.b...@gmail.com` on 2 Jul 2012 at 1:36 Attachments: * [2012-07-02_153401.png](https://storage.googleapis.com/google-code-attachments/scribefire-chrome/issue-638/comment-0/2012-07-02_153401.png)
non_test
wordpress com blogs credentials what s the problem i m experiencing problems to log in to my wp com accounts by safe access meanwhile log to my wp account is correctly working what browser are you using opera next chrome m what version of scribefire are you running opera extension chrome extension original issue reported on code google com by marco b gmail com on jul at attachments
0
283,147
24,527,268,586
IssuesEvent
2022-10-11 13:58:43
cobudget/cobudget
https://api.github.com/repos/cobudget/cobudget
closed
[BUG] When managing member balance: [GraphQL] Cannot destructure property 'roundId' of 'undefined' as it is undefined.
needs testing
**Describe the bug** When Adding or Setting the balance, it throws this error: > [GraphQL] Cannot destructure property 'roundId' of 'undefined' as it is undefined. **To Reproduce** Steps to reproduce the behavior: 1. Go to 'https://staging.cobudget.com/test-abc/test-round-abc 2. Click on the plus sign next to Balance 3. Add money, click done 4. See error **Screenshots** <img width="1353" alt="Screenshot 2022-10-03 at 15 33 47" src="https://user-images.githubusercontent.com/1287179/193590823-7a0ae447-33c2-4a4b-9aff-0c1f46e0adf1.png"> **Additional context** * I created a fresh group and got the same error, so assume it's not related a setting on the group "test-abc". * I have only tested on Staging.
1.0
[BUG] When managing member balance: [GraphQL] Cannot destructure property 'roundId' of 'undefined' as it is undefined. - **Describe the bug** When Adding or Setting the balance, it throws this error: > [GraphQL] Cannot destructure property 'roundId' of 'undefined' as it is undefined. **To Reproduce** Steps to reproduce the behavior: 1. Go to 'https://staging.cobudget.com/test-abc/test-round-abc 2. Click on the plus sign next to Balance 3. Add money, click done 4. See error **Screenshots** <img width="1353" alt="Screenshot 2022-10-03 at 15 33 47" src="https://user-images.githubusercontent.com/1287179/193590823-7a0ae447-33c2-4a4b-9aff-0c1f46e0adf1.png"> **Additional context** * I created a fresh group and got the same error, so assume it's not related a setting on the group "test-abc". * I have only tested on Staging.
test
when managing member balance cannot destructure property roundid of undefined as it is undefined describe the bug when adding or setting the balance it throws this error cannot destructure property roundid of undefined as it is undefined to reproduce steps to reproduce the behavior go to click on the plus sign next to balance add money click done see error screenshots img width alt screenshot at src additional context i created a fresh group and got the same error so assume it s not related a setting on the group test abc i have only tested on staging
1
217,950
16,891,848,657
IssuesEvent
2021-06-23 10:11:16
hakehuang/zephyr
https://api.github.com/repos/hakehuang/zephyr
opened
tests-ci :kernel.memory_protection.protection.exec_data : zephyr-v2.6.0-286-g46029914a7ac: lpcxpresso55s28: test Timeout
area: Tests bug
**Describe the bug** kernel.memory_protection.protection.exec_data test is Timeout on zephyr-v2.6.0-286-g46029914a7ac on lpcxpresso55s28 see logs for details **To Reproduce** 1. ``` scripts/twister --device-testing --device-serial /dev/ttyACM0 -p lpcxpresso55s28 --testcase-root tests --sub-test kernel.memory_protection ``` 2. See error **Expected behavior** test pass **Impact** **Logs and console output** ``` *** Booting Zephyr OS build zephyr-v2.6.0-286-g46029914a7ac *** Running test suite protection =================================================================== START - test_exec_data trying to call code written to 0x30000399 ASSERTION FAIL [esf != ((void *)0)] @ WEST_TOPDIR/zephyr/arch/arm/core/aarch32/cortex_m/fault.c:993 ESF could not be retrieved successfully. Shall never occur. ASSERTION FAIL [esf != ((void *)0)] @ WEST_TOPDIR/zephyr/arch/arm/core/aarch32/cortex_m/fault.c:993 ESF could not be retrieved successfully. Shall never occur. ``` **Environment (please complete the following information):** - OS: (e.g. Linux ) - Toolchain (e.g Zephyr SDK) - Commit SHA or Version used: zephyr-v2.6.0-286-g46029914a7ac
1.0
tests-ci :kernel.memory_protection.protection.exec_data : zephyr-v2.6.0-286-g46029914a7ac: lpcxpresso55s28: test Timeout - **Describe the bug** kernel.memory_protection.protection.exec_data test is Timeout on zephyr-v2.6.0-286-g46029914a7ac on lpcxpresso55s28 see logs for details **To Reproduce** 1. ``` scripts/twister --device-testing --device-serial /dev/ttyACM0 -p lpcxpresso55s28 --testcase-root tests --sub-test kernel.memory_protection ``` 2. See error **Expected behavior** test pass **Impact** **Logs and console output** ``` *** Booting Zephyr OS build zephyr-v2.6.0-286-g46029914a7ac *** Running test suite protection =================================================================== START - test_exec_data trying to call code written to 0x30000399 ASSERTION FAIL [esf != ((void *)0)] @ WEST_TOPDIR/zephyr/arch/arm/core/aarch32/cortex_m/fault.c:993 ESF could not be retrieved successfully. Shall never occur. ASSERTION FAIL [esf != ((void *)0)] @ WEST_TOPDIR/zephyr/arch/arm/core/aarch32/cortex_m/fault.c:993 ESF could not be retrieved successfully. Shall never occur. ``` **Environment (please complete the following information):** - OS: (e.g. Linux ) - Toolchain (e.g Zephyr SDK) - Commit SHA or Version used: zephyr-v2.6.0-286-g46029914a7ac
test
tests ci kernel memory protection protection exec data zephyr test timeout describe the bug kernel memory protection protection exec data test is timeout on zephyr on see logs for details to reproduce scripts twister device testing device serial dev p testcase root tests sub test kernel memory protection see error expected behavior test pass impact logs and console output booting zephyr os build zephyr running test suite protection start test exec data trying to call code written to assertion fail west topdir zephyr arch arm core cortex m fault c esf could not be retrieved successfully shall never occur assertion fail west topdir zephyr arch arm core cortex m fault c esf could not be retrieved successfully shall never occur environment please complete the following information os e g linux toolchain e g zephyr sdk commit sha or version used zephyr
1
347,496
31,168,000,209
IssuesEvent
2023-08-16 21:28:31
googleapis/google-cloud-python
https://api.github.com/repos/googleapis/google-cloud-python
closed
Adopt split repo: _python-bigquery-migration_
migration:samples:generated migration:workaround:none migration:library:gapic_auto migration:testing:unit migration:issues:none migration:ready migration:pr:none migration:stage:git-history-merged migration:stage:common-files-updated migration:stage:split-repo-trimmed migration:stage:split-repo-archived
Migrate the split-repo https://github.com/googleapis/python-bigquery-migration to https://github.com/googleapis/google-cloud-python. The migration readiness criteria are the following, which we track via GitHub labels on this issue. These criteria apply to the split repo we are migrating from: - No open issues - No open PRs - No handwritten samples - No system tests - No client-specific customizations
1.0
Adopt split repo: _python-bigquery-migration_ - Migrate the split-repo https://github.com/googleapis/python-bigquery-migration to https://github.com/googleapis/google-cloud-python. The migration readiness criteria are the following, which we track via GitHub labels on this issue. These criteria apply to the split repo we are migrating from: - No open issues - No open PRs - No handwritten samples - No system tests - No client-specific customizations
test
adopt split repo python bigquery migration migrate the split repo to the migration readiness criteria are the following which we track via github labels on this issue these criteria apply to the split repo we are migrating from no open issues no open prs no handwritten samples no system tests no client specific customizations
1
273,490
23,758,747,281
IssuesEvent
2022-09-01 06:59:31
nromanen/pratical_testing_2022
https://api.github.com/repos/nromanen/pratical_testing_2022
opened
Password recovery - Negative (invalid email)
test case
# [TC-4.2] : Password recovery - Negative (invalid email) ## Description Verify that system can detect invalid email adresses that user enters on "Password recovery page" ### Precondition There is stable internet connection. User should have access to Chrome, Firefox, Microsoft Edge or Safari. user should be either unregistered or registered, but not authorized. ### Priority High ### Input data Email: uowqfo4241 ## Test Steps | Step No. | Step description | Expected result | | ------------- |:-------------| :-----| | 1. | Navigate to https://ttrackster.herokuapp.com/login | Site should open | | 2. | Click on the link "Forgot password?" | Site should open the "Password recovery" page | | 3. | Enter email | Credential can be entered, system should show "Invalid email" message | | 4. | Click the "SEND LINK" button | System should show "Invalid email" message | ## Expected Result System should show "Invalid email" message. ## Requirement [Password recovery#4](https://github.com/nromanen/pratical_testing_2022/issues/4)
1.0
Password recovery - Negative (invalid email) - # [TC-4.2] : Password recovery - Negative (invalid email) ## Description Verify that system can detect invalid email adresses that user enters on "Password recovery page" ### Precondition There is stable internet connection. User should have access to Chrome, Firefox, Microsoft Edge or Safari. user should be either unregistered or registered, but not authorized. ### Priority High ### Input data Email: uowqfo4241 ## Test Steps | Step No. | Step description | Expected result | | ------------- |:-------------| :-----| | 1. | Navigate to https://ttrackster.herokuapp.com/login | Site should open | | 2. | Click on the link "Forgot password?" | Site should open the "Password recovery" page | | 3. | Enter email | Credential can be entered, system should show "Invalid email" message | | 4. | Click the "SEND LINK" button | System should show "Invalid email" message | ## Expected Result System should show "Invalid email" message. ## Requirement [Password recovery#4](https://github.com/nromanen/pratical_testing_2022/issues/4)
test
password recovery negative invalid email password recovery negative invalid email description verify that system can detect invalid email adresses that user enters on password recovery page precondition there is stable internet connection user should have access to chrome firefox microsoft edge or safari user should be either unregistered or registered but not authorized priority high input data email test steps step no step description expected result navigate to site should open click on the link forgot password site should open the password recovery page enter email credential can be entered system should show invalid email message click the send link button system should show invalid email message expected result system should show invalid email message requirement
1
135,284
10,968,630,230
IssuesEvent
2019-11-28 12:05:21
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
closed
[CI] Failure in ml.integration.RegressionIT.testStopAndRestart assertion
:ml >test-failure
After I unmuted the test, there were 2 CI failures: https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+multijob-unix-compatibility/os=debian-8&&immutable/389/console https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+multijob-unix-compatibility/os=amazon/389/console With small changes in the code I was able to reproduce the issue locally today. Here is the test log: ``` Suite: Test class org.elasticsearch.xpack.ml.integration.RegressionIT 2> REPRODUCE WITH: ./gradlew ':x-pack:plugin:ml:qa:native-multi-node-tests:integTestRunner' --tests "org.elasticsearch.xpack.ml.integration.RegressionIT.testStopAndRestart" -Dtests.seed=2454927CB657C116 -Dtests.security.manager=true -Dtests.locale=th -Dtests.timezone=America/Indiana/Vevay -Dcompiler.java=12 -Druntime.java=12 2> java.lang.AssertionError: Hits were: {"hits":{"total":{"value":2,"relation":"eq"},"max_score":0.0,"hits":[{"_index":".ml-inference-000001","_id":"regression_stop_and_restart-1573742320992","_score":0.0,"_source":{"model_id":"regression_stop_and_restart-1573742320992","created_by":"data-frame-analytics","version":"8.0.0","create_time":1573742320992,"tags":["regression_stop_and_restart"],"metadata":{"analytics_config":{"id":"regression_stop_and_restart","source":{"index":["regression_stop_and_restart_source_index"],"query":{"match_all":{}}},"dest":{"index":"regression_stop_and_restart_source_index_results","results_field":"ml"},"analysis":{"regression":{"dependent_variable":"variable","prediction_field_name":"variable_prediction","training_percent":100.0}},"model_memory_limit":"1gb","create_time":1573742309034,"version":"8.0.0","allow_lazy_start":false}},"doc_type":"trained_model_config","input":{"field_names":["feature","variable"]}}},{"_index":".ml-inference-000001","_id":"regression_stop_and_restart-1573742322238","_score":0.0,"_source":{"model_id":"regression_stop_and_restart-1573742322238","created_by":"data-frame-analytics","version":"8.0.0","create_time":1573742322238
,"tags":["regression_stop_and_restart"],"metadata":{"analytics_config":{"id":"regression_stop_and_restart","source":{"index":["regression_stop_and_restart_source_index"],"query":{"match_all":{}}},"dest":{"index":"regression_stop_and_restart_source_index_results","results_field":"ml"},"analysis":{"regression":{"dependent_variable":"variable","prediction_field_name":"variable_prediction","training_percent":100.0}},"model_memory_limit":"1gb","create_time":1573742309034,"version":"8.0.0","allow_lazy_start":false}},"doc_type":"trained_model_config","input":{"field_names":["feature","variable"]}}}]}} Expected: an array with size <1> but: array size was <2> at __randomizedtesting.SeedInfo.seed([2454927CB657C116:531C1082A2BDC7B6]:0) at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18) at org.junit.Assert.assertThat(Assert.java:956) at org.elasticsearch.xpack.ml.integration.MlNativeDataFrameAnalyticsIntegTestCase.assertInferenceModelPersisted(MlNativeDataFrameAnalyticsIntegTestCase.java:200) at org.elasticsearch.xpack.ml.integration.RegressionIT.testStopAndRestart(RegressionIT.java:309) 2> NOTE: leaving temporary files on disk at: /Users/witek/github/elastic/elasticsearch/x-pack/plugin/ml/qa/native-multi-node-tests/build/testrun/integTestRunner/temp/org.elasticsearch.xpack.ml.integration.RegressionIT_2454927CB657C116-001 2> Nov 14, 2019 3:38:51 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks 2> WARNING: Will linger awaiting termination of 1 leaked thread(s). 2> NOTE: test params are: codec=Lucene80, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@67df4402), locale=th, timezone=America/Indiana/Vevay 2> NOTE: Mac OS X 10.14.5 x86_64/Oracle Corporation 12 (64-bit)/cpus=8,threads=1,free=226957824,total=536870912 2> NOTE: All tests run in this JVM: [RegressionIT] ```
1.0
[CI] Failure in ml.integration.RegressionIT.testStopAndRestart assertion - After I unmuted the test, there were 2 CI failures: https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+multijob-unix-compatibility/os=debian-8&&immutable/389/console https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+multijob-unix-compatibility/os=amazon/389/console With small changes in the code I was able to reproduce the issue locally today. Here is the test log: ``` Suite: Test class org.elasticsearch.xpack.ml.integration.RegressionIT 2> REPRODUCE WITH: ./gradlew ':x-pack:plugin:ml:qa:native-multi-node-tests:integTestRunner' --tests "org.elasticsearch.xpack.ml.integration.RegressionIT.testStopAndRestart" -Dtests.seed=2454927CB657C116 -Dtests.security.manager=true -Dtests.locale=th -Dtests.timezone=America/Indiana/Vevay -Dcompiler.java=12 -Druntime.java=12 2> java.lang.AssertionError: Hits were: {"hits":{"total":{"value":2,"relation":"eq"},"max_score":0.0,"hits":[{"_index":".ml-inference-000001","_id":"regression_stop_and_restart-1573742320992","_score":0.0,"_source":{"model_id":"regression_stop_and_restart-1573742320992","created_by":"data-frame-analytics","version":"8.0.0","create_time":1573742320992,"tags":["regression_stop_and_restart"],"metadata":{"analytics_config":{"id":"regression_stop_and_restart","source":{"index":["regression_stop_and_restart_source_index"],"query":{"match_all":{}}},"dest":{"index":"regression_stop_and_restart_source_index_results","results_field":"ml"},"analysis":{"regression":{"dependent_variable":"variable","prediction_field_name":"variable_prediction","training_percent":100.0}},"model_memory_limit":"1gb","create_time":1573742309034,"version":"8.0.0","allow_lazy_start":false}},"doc_type":"trained_model_config","input":{"field_names":["feature","variable"]}}},{"_index":".ml-inference-000001","_id":"regression_stop_and_restart-1573742322238","_score":0.0,"_source":{"model_id":"regression_stop_and_restart-1573742322238","creat
ed_by":"data-frame-analytics","version":"8.0.0","create_time":1573742322238,"tags":["regression_stop_and_restart"],"metadata":{"analytics_config":{"id":"regression_stop_and_restart","source":{"index":["regression_stop_and_restart_source_index"],"query":{"match_all":{}}},"dest":{"index":"regression_stop_and_restart_source_index_results","results_field":"ml"},"analysis":{"regression":{"dependent_variable":"variable","prediction_field_name":"variable_prediction","training_percent":100.0}},"model_memory_limit":"1gb","create_time":1573742309034,"version":"8.0.0","allow_lazy_start":false}},"doc_type":"trained_model_config","input":{"field_names":["feature","variable"]}}}]}} Expected: an array with size <1> but: array size was <2> at __randomizedtesting.SeedInfo.seed([2454927CB657C116:531C1082A2BDC7B6]:0) at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18) at org.junit.Assert.assertThat(Assert.java:956) at org.elasticsearch.xpack.ml.integration.MlNativeDataFrameAnalyticsIntegTestCase.assertInferenceModelPersisted(MlNativeDataFrameAnalyticsIntegTestCase.java:200) at org.elasticsearch.xpack.ml.integration.RegressionIT.testStopAndRestart(RegressionIT.java:309) 2> NOTE: leaving temporary files on disk at: /Users/witek/github/elastic/elasticsearch/x-pack/plugin/ml/qa/native-multi-node-tests/build/testrun/integTestRunner/temp/org.elasticsearch.xpack.ml.integration.RegressionIT_2454927CB657C116-001 2> Nov 14, 2019 3:38:51 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks 2> WARNING: Will linger awaiting termination of 1 leaked thread(s). 2> NOTE: test params are: codec=Lucene80, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@67df4402), locale=th, timezone=America/Indiana/Vevay 2> NOTE: Mac OS X 10.14.5 x86_64/Oracle Corporation 12 (64-bit)/cpus=8,threads=1,free=226957824,total=536870912 2> NOTE: All tests run in this JVM: [RegressionIT] ```
test
failure in ml integration regressionit teststopandrestart assertion after i unmuted the test there were ci failures with small changes in the code i was able to reproduce the issue locally today here is the test log suite test class org elasticsearch xpack ml integration regressionit reproduce with gradlew x pack plugin ml qa native multi node tests integtestrunner tests org elasticsearch xpack ml integration regressionit teststopandrestart dtests seed dtests security manager true dtests locale th dtests timezone america indiana vevay dcompiler java druntime java java lang assertionerror hits were hits total value relation eq max score hits metadata analytics config id regression stop and restart source index query match all dest index regression stop and restart source index results results field ml analysis regression dependent variable variable prediction field name variable prediction training percent model memory limit create time version allow lazy start false doc type trained model config input field names index ml inference id regression stop and restart score source model id regression stop and restart created by data frame analytics version create time tags metadata analytics config id regression stop and restart source index query match all dest index regression stop and restart source index results results field ml analysis regression dependent variable variable prediction field name variable prediction training percent model memory limit create time version allow lazy start false doc type trained model config input field names expected an array with size but array size was at randomizedtesting seedinfo seed at org hamcrest matcherassert assertthat matcherassert java at org junit assert assertthat assert java at org elasticsearch xpack ml integration mlnativedataframeanalyticsintegtestcase assertinferencemodelpersisted mlnativedataframeanalyticsintegtestcase java at org elasticsearch xpack ml integration regressionit teststopandrestart regressionit java 
note leaving temporary files on disk at users witek github elastic elasticsearch x pack plugin ml qa native multi node tests build testrun integtestrunner temp org elasticsearch xpack ml integration regressionit nov pm com carrotsearch randomizedtesting threadleakcontrol checkthreadleaks warning will linger awaiting termination of leaked thread s note test params are codec sim asserting org apache lucene search similarities assertingsimilarity locale th timezone america indiana vevay note mac os x oracle corporation bit cpus threads free total note all tests run in this jvm
1
123,688
10,279,291,123
IssuesEvent
2019-08-25 21:43:55
istio/istio
https://api.github.com/repos/istio/istio
closed
Need CI to build and push docker images [tools repo]
area/test and release
We have custom images we use for this repo and will soon have more (https://github.com/istio/tools/pull/95). We should have these built to a official repo Steps: - Create new project GCP (istio-tools), or maybe just reuse mixologist? or istio-testing? - Set up GCB (https://github.com/marketplace/google-cloud-build) for the repo - will need repo admin to do this - Create cloudbuild config and push to repo - Replace images with official tools images @utka does this sound right to you? I can set everything up just want to make sure there isn't a better approach @mandarjog
1.0
Need CI to build and push docker images [tools repo] - We have custom images we use for this repo and will soon have more (https://github.com/istio/tools/pull/95). We should have these built to a official repo Steps: - Create new project GCP (istio-tools), or maybe just reuse mixologist? or istio-testing? - Set up GCB (https://github.com/marketplace/google-cloud-build) for the repo - will need repo admin to do this - Create cloudbuild config and push to repo - Replace images with official tools images @utka does this sound right to you? I can set everything up just want to make sure there isn't a better approach @mandarjog
test
need ci to build and push docker images we have custom images we use for this repo and will soon have more we should have these built to a official repo steps create new project gcp istio tools or maybe just reuse mixologist or istio testing set up gcb for the repo will need repo admin to do this create cloudbuild config and push to repo replace images with official tools images utka does this sound right to you i can set everything up just want to make sure there isn t a better approach mandarjog
1
88,148
10,565,822,592
IssuesEvent
2019-10-05 14:26:55
stepjam/PyRep
https://api.github.com/repos/stepjam/PyRep
closed
How to contribute in adding new Robots ?
documentation
How can we contribute in adding new robots ? I saw, they're just ttm files which you've mentioned to be modified. You can allow the opensource community to contribute in that, by sharing the required details.
1.0
How to contribute in adding new Robots ? - How can we contribute in adding new robots ? I saw, they're just ttm files which you've mentioned to be modified. You can allow the opensource community to contribute in that, by sharing the required details.
non_test
how to contribute in adding new robots how can we contribute in adding new robots i saw they re just ttm files which you ve mentioned to be modified you can allow the opensource community to contribute in that by sharing the required details
0
48,710
5,967,819,699
IssuesEvent
2017-05-30 16:45:08
healthlocker/healthlocker
https://api.github.com/repos/healthlocker/healthlocker
closed
Sleep tacker input edit copy formatting - so that it matches other trackers
enhancement please-test priority-3 T25m
**Notes:** For example: - what affected your sleep? - did you have nightmares or terrors? - did you do something to try to get a better sleep? + [x] reformat as above + [x] (bullets to be in italics)
1.0
Sleep tacker input edit copy formatting - so that it matches other trackers - **Notes:** For example: - what affected your sleep? - did you have nightmares or terrors? - did you do something to try to get a better sleep? + [x] reformat as above + [x] (bullets to be in italics)
test
sleep tacker input edit copy formatting so that it matches other trackers notes for example what affected your sleep did you have nightmares or terrors did you do something to try to get a better sleep reformat as above bullets to be in italics
1
221,956
17,379,165,994
IssuesEvent
2021-07-31 10:20:58
kubernetes-sigs/azuredisk-csi-driver
https://api.github.com/repos/kubernetes-sigs/azuredisk-csi-driver
closed
add example verification test for snapshot examples
kind/test lifecycle/rotten
**Is your feature request related to a problem?/Why is this needed** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> **Describe the solution you'd like in detail** <!-- A clear and concise description of what you want to happen. --> snapshot examples: https://github.com/kubernetes-sigs/azuredisk-csi-driver/tree/master/deploy/example/snapshot add tests to https://github.com/kubernetes-sigs/azuredisk-csi-driver/blob/master/hack/verify-examples.sh **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
1.0
add example verification test for snapshot examples - **Is your feature request related to a problem?/Why is this needed** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> **Describe the solution you'd like in detail** <!-- A clear and concise description of what you want to happen. --> snapshot examples: https://github.com/kubernetes-sigs/azuredisk-csi-driver/tree/master/deploy/example/snapshot add tests to https://github.com/kubernetes-sigs/azuredisk-csi-driver/blob/master/hack/verify-examples.sh **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
test
add example verification test for snapshot examples is your feature request related to a problem why is this needed describe the solution you d like in detail snapshot examples add tests to describe alternatives you ve considered additional context
1
306,980
23,177,181,868
IssuesEvent
2022-07-31 15:41:50
Shelex/cypress-allure-plugin
https://api.github.com/repos/Shelex/cypress-allure-plugin
closed
ReferenceError: AllureWriter is not defined
documentation question
Cypress 10.3.1 Ran `npm i -D @shelex/cypress-allure-plugin` Updated `cypress.config.js` to ``` const { defineConfig } = require("cypress"); module.exports = defineConfig({ e2e: { baseUrl: 'https://the-internet.herokuapp.com/', // baseUrl: 'http://localhost:7080/', setupNodeEvents(on, config) { AllureWriter(on, config); return config; } } }); ``` Added `import '@shelex/cypress-allure-plugin';` to `cypress/support/e2e.js` Ran `npx cypress run --env allure=true` Following full traceback: ``` Your configFile threw an error from: E:\QA Engineering Stuff\the-internet-cypress\cypress.config.js The error was thrown while executing your e2e.setupNodeEvents() function: ReferenceError: AllureWriter is not defined at setupNodeEvents (E:\QA Engineering Stuff\the-internet-cypress\cypress.config.js:9:7) at C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_plugins.js:118:14 at tryCatcher (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\node_modules\bluebird\js\release\util.js:16:23) at Function.Promise.attempt.Promise.try (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\node_modules\bluebird\js\release\method.js:39:29) at RunPlugins.load (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_plugins.js:115:9) at RunPlugins.runSetupNodeEvents (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_plugins.js:254:10) at EventEmitter.<anonymous> (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_require_async_child.js:185:22) at EventEmitter.emit (events.js:400:28) at process.<anonymous> 
(C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\util.js:33:22) at process.emit (events.js:400:28) at emit (internal/child_process.js:910:12) at processTicksAndRejections (internal/process/task_queues.js:83:21) ``` My package.json content: ``` { "name": "the-internet-cypress", "version": "1.0.0", "description": "Cypress testing of https://the-internet.herokuapp.com/", "main": "index.js", "scripts": { "cypress:open": "cypress open" }, "author": "", "license": "ISC", "dependencies": { "cypress": "^10.3.1" }, "devDependencies": { "@shelex/cypress-allure-plugin": "^2.28.0" } } ``` I am unsure as to what I did wrong in installing the plugin.
1.0
ReferenceError: AllureWriter is not defined - Cypress 10.3.1 Ran `npm i -D @shelex/cypress-allure-plugin` Updated `cypress.config.js` to ``` const { defineConfig } = require("cypress"); module.exports = defineConfig({ e2e: { baseUrl: 'https://the-internet.herokuapp.com/', // baseUrl: 'http://localhost:7080/', setupNodeEvents(on, config) { AllureWriter(on, config); return config; } } }); ``` Added `import '@shelex/cypress-allure-plugin';` to `cypress/support/e2e.js` Ran `npx cypress run --env allure=true` Following full traceback: ``` Your configFile threw an error from: E:\QA Engineering Stuff\the-internet-cypress\cypress.config.js The error was thrown while executing your e2e.setupNodeEvents() function: ReferenceError: AllureWriter is not defined at setupNodeEvents (E:\QA Engineering Stuff\the-internet-cypress\cypress.config.js:9:7) at C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_plugins.js:118:14 at tryCatcher (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\node_modules\bluebird\js\release\util.js:16:23) at Function.Promise.attempt.Promise.try (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\node_modules\bluebird\js\release\method.js:39:29) at RunPlugins.load (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_plugins.js:115:9) at RunPlugins.runSetupNodeEvents (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_plugins.js:254:10) at EventEmitter.<anonymous> (C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\child\run_require_async_child.js:185:22) at EventEmitter.emit (events.js:400:28) at process.<anonymous> 
(C:\Users\andre\AppData\Local\Cypress\Cache\10.3.1\Cypress\resources\app\node_modules\@packages\server\lib\plugins\util.js:33:22) at process.emit (events.js:400:28) at emit (internal/child_process.js:910:12) at processTicksAndRejections (internal/process/task_queues.js:83:21) ``` My package.json content: ``` { "name": "the-internet-cypress", "version": "1.0.0", "description": "Cypress testing of https://the-internet.herokuapp.com/", "main": "index.js", "scripts": { "cypress:open": "cypress open" }, "author": "", "license": "ISC", "dependencies": { "cypress": "^10.3.1" }, "devDependencies": { "@shelex/cypress-allure-plugin": "^2.28.0" } } ``` I am unsure as to what I did wrong in installing the plugin.
non_test
referenceerror allurewriter is not defined cypress ran npm i d shelex cypress allure plugin updated cypress config js to const defineconfig require cypress module exports defineconfig baseurl baseurl setupnodeevents on config allurewriter on config return config added import shelex cypress allure plugin to cypress support js ran npx cypress run env allure true following full traceback your configfile threw an error from e qa engineering stuff the internet cypress cypress config js the error was thrown while executing your setupnodeevents function referenceerror allurewriter is not defined at setupnodeevents e qa engineering stuff the internet cypress cypress config js at c users andre appdata local cypress cache cypress resources app node modules packages server lib plugins child run plugins js at trycatcher c users andre appdata local cypress cache cypress resources app node modules packages server node modules bluebird js release util js at function promise attempt promise try c users andre appdata local cypress cache cypress resources app node modules packages server node modules bluebird js release method js at runplugins load c users andre appdata local cypress cache cypress resources app node modules packages server lib plugins child run plugins js at runplugins runsetupnodeevents c users andre appdata local cypress cache cypress resources app node modules packages server lib plugins child run plugins js at eventemitter c users andre appdata local cypress cache cypress resources app node modules packages server lib plugins child run require async child js at eventemitter emit events js at process c users andre appdata local cypress cache cypress resources app node modules packages server lib plugins util js at process emit events js at emit internal child process js at processticksandrejections internal process task queues js my package json content name the internet cypress version description cypress testing of main index js scripts cypress open cypress 
open author license isc dependencies cypress devdependencies shelex cypress allure plugin i am unsure as to what i did wrong in installing the plugin
0
184,695
14,289,809,833
IssuesEvent
2020-11-23 19:51:47
github-vet/rangeclosure-findings
https://api.github.com/repos/github-vet/rangeclosure-findings
closed
takyo101/lantern-android: client/client_test.go; 11 LoC
fresh small test
Found a possible issue in [takyo101/lantern-android](https://www.github.com/takyo101/lantern-android) at [client/client_test.go](https://github.com/takyo101/lantern-android/blob/5b9c4aa75c5e425417313c8160daabf063846803/client/client_test.go#L70-L80) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/takyo101/lantern-android/blob/5b9c4aa75c5e425417313c8160daabf063846803/client/client_test.go#L70-L80) <details> <summary>Click here to show the 11 line(s) of Go which triggered the analyzer.</summary> ```go for uri, expectedContent := range testURLs { wg.Add(1) go func(wg *sync.WaitGroup) { if err := testReverseProxy(uri, expectedContent); err != nil { t.Fatal(err) } wg.Done() }(&wg) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 5b9c4aa75c5e425417313c8160daabf063846803
1.0
takyo101/lantern-android: client/client_test.go; 11 LoC - Found a possible issue in [takyo101/lantern-android](https://www.github.com/takyo101/lantern-android) at [client/client_test.go](https://github.com/takyo101/lantern-android/blob/5b9c4aa75c5e425417313c8160daabf063846803/client/client_test.go#L70-L80) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/takyo101/lantern-android/blob/5b9c4aa75c5e425417313c8160daabf063846803/client/client_test.go#L70-L80) <details> <summary>Click here to show the 11 line(s) of Go which triggered the analyzer.</summary> ```go for uri, expectedContent := range testURLs { wg.Add(1) go func(wg *sync.WaitGroup) { if err := testReverseProxy(uri, expectedContent); err != nil { t.Fatal(err) } wg.Done() }(&wg) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 5b9c4aa75c5e425417313c8160daabf063846803
test
lantern android client client test go loc found a possible issue in at the below snippet of go code triggered static analysis which searches for goroutines and or defer statements which capture loop variables click here to show the line s of go which triggered the analyzer go for uri expectedcontent range testurls wg add go func wg sync waitgroup if err testreverseproxy uri expectedcontent err nil t fatal err wg done wg leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
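The Go snippet in the record above passes only `&wg` into the goroutine while reading `uri` and `expectedContent` directly from the enclosing scope, so every goroutine may observe the final loop values. The same late-binding pitfall, and the per-iteration-binding fix, can be sketched in Python (the identifiers here are illustrative, not from the flagged project):

```python
# Closures bind names, not values: every callback created in the loop sees
# whatever the loop variable holds when the callback finally runs.

def make_callbacks_buggy(items):
    callbacks = []
    for item in items:
        callbacks.append(lambda: item)  # late binding: all share the last item
    return callbacks

def make_callbacks_fixed(items):
    callbacks = []
    for item in items:
        callbacks.append(lambda item=item: item)  # bind the value per iteration
    return callbacks

print([cb() for cb in make_callbacks_buggy(["a", "b", "c"])])  # ['c', 'c', 'c']
print([cb() for cb in make_callbacks_fixed(["a", "b", "c"])])  # ['a', 'b', 'c']
```

Binding the loop value as a default argument freezes it at creation time, which is the same remedy as passing `uri` and `expectedContent` as goroutine parameters the way `&wg` is already passed in the Go snippet.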
251,154
21,435,602,493
IssuesEvent
2022-04-24 00:23:37
kmewrd/mycophilia
https://api.github.com/repos/kmewrd/mycophilia
closed
Write Cypress tests for Dashboard
testing
Paths to test: - [x] Contains stats - [x] Contains sightings - [x] Contains region - [ ] Region is editable (optional) - [x] Contains nav bar or menu - [x] Contains logout button - [ ] URL matches /dashboard
1.0
Write Cypress tests for Dashboard - Paths to test: - [x] Contains stats - [x] Contains sightings - [x] Contains region - [ ] Region is editable (optional) - [x] Contains nav bar or menu - [x] Contains logout button - [ ] URL matches /dashboard
test
write cypress tests for dashboard paths to test contains stats contains sightings contains region region is editable optional contains nav bar or menu contains logout button url matches dashboard
1
333,791
10,131,361,247
IssuesEvent
2019-08-01 19:20:09
jenkins-x/jx
https://api.github.com/repos/jenkins-x/jx
closed
jx version always fails
area/boot area/versions kind/bug kind/fox priority/critical-urgent
### Summary jx diagnose is failing with `error: package jx is on version 2.0.513 but the version stream requires version 2.0.404 exit status 1` ### Jx version The output of `jx version` is: ``` jx 2.0.513 Kubernetes cluster v1.12.8-gke.10 kubectl v1.14.2 helm client Client: v2.14.0+g05811b8 git git version 2.21.0 Operating System Mac OS X 10.13.6 build 17G6030 ```
1.0
jx version always fails - ### Summary jx diagnose is failing with `error: package jx is on version 2.0.513 but the version stream requires version 2.0.404 exit status 1` ### Jx version The output of `jx version` is: ``` jx 2.0.513 Kubernetes cluster v1.12.8-gke.10 kubectl v1.14.2 helm client Client: v2.14.0+g05811b8 git git version 2.21.0 Operating System Mac OS X 10.13.6 build 17G6030 ```
non_test
jx version always fails summary jx diagnose is failing with error package jx is on version but the version stream requires version exit status jx version the output of jx version is jx kubernetes cluster gke kubectl helm client client git git version operating system mac os x build
0
21,473
10,619,226,837
IssuesEvent
2019-10-13 11:45:14
Shuunen/flood-it
https://api.github.com/repos/Shuunen/flood-it
closed
CVE-2018-3721 Medium Severity Vulnerability detected by WhiteSource
security vulnerability
## CVE-2018-3721 - Medium Severity Vulnerability <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-3.10.1.tgz</b></p></summary> <p>The modern build of lodash modular utilities.</p> <p>path: /tmp/git/flood-it/node_modules/xmlbuilder/node_modules/lodash/package.json</p> <p> <p>Library home page: <a href=http://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz>http://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p> Dependency Hierarchy: - jscs-3.0.7.tgz (Root Library) - jscs-jsdoc-2.0.0.tgz - jsdoctypeparser-1.2.0.tgz - :x: **lodash-3.10.1.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash node module before 4.17.5 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via defaultsDeep, merge, and mergeWith functions, which allows a malicious user to modify the prototype of "Object" via __proto__, causing the addition or modification of an existing property that will exist on all objects. <p>Publish Date: 2018-06-07 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-3721>CVE-2018-3721</a></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-3721">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-3721</a></p> <p>Fix Resolution: Upgrade to version lodash 4.17.5 or greater</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-3721 Medium Severity Vulnerability detected by WhiteSource - ## CVE-2018-3721 - Medium Severity Vulnerability <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-3.10.1.tgz</b></p></summary> <p>The modern build of lodash modular utilities.</p> <p>path: /tmp/git/flood-it/node_modules/xmlbuilder/node_modules/lodash/package.json</p> <p> <p>Library home page: <a href=http://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz>http://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p> Dependency Hierarchy: - jscs-3.0.7.tgz (Root Library) - jscs-jsdoc-2.0.0.tgz - jsdoctypeparser-1.2.0.tgz - :x: **lodash-3.10.1.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash node module before 4.17.5 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via defaultsDeep, merge, and mergeWith functions, which allows a malicious user to modify the prototype of "Object" via __proto__, causing the addition or modification of an existing property that will exist on all objects. <p>Publish Date: 2018-06-07 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-3721>CVE-2018-3721</a></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-3721">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-3721</a></p> <p>Fix Resolution: Upgrade to version lodash 4.17.5 or greater</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium severity vulnerability detected by whitesource cve medium severity vulnerability vulnerable library lodash tgz the modern build of lodash modular utilities path tmp git flood it node modules xmlbuilder node modules lodash package json library home page a href dependency hierarchy jscs tgz root library jscs jsdoc tgz jsdoctypeparser tgz x lodash tgz vulnerable library vulnerability details lodash node module before suffers from a modification of assumed immutable data maid vulnerability via defaultsdeep merge and mergewith functions which allows a malicious user to modify the prototype of object via proto causing the addition or modification of an existing property that will exist on all objects publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href fix resolution upgrade to version lodash or greater step up your open source security game with whitesource
0
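The vulnerability class named in the record above, Modification of Assumed-Immutable Data via a deep merge, is easiest to see in a small sketch. The following is a Python analogue of the unsafe pattern, not the lodash code itself; the `__proto__` mechanics are JavaScript-specific and are only mirrored here by mutating a shared default object through nested references:

```python
# A naive recursive merge, like the vulnerable defaultsDeep/merge pattern:
# it writes through nested references instead of copying them.
SHARED_DEFAULTS = {"role": "user", "features": {"admin_panel": False}}

def naive_merge(dst, src):
    for key, value in src.items():
        if isinstance(value, dict) and isinstance(dst.get(key), dict):
            naive_merge(dst[key], value)  # descends into the shared dict
        else:
            dst[key] = value
    return dst

# Per-request options hold a reference to the shared defaults, not a copy.
options = {"defaults": SHARED_DEFAULTS}
attacker_input = {"defaults": {"features": {"admin_panel": True}}}
naive_merge(options, attacker_input)
print(SHARED_DEFAULTS["features"]["admin_panel"])  # True: shared state was modified
```

The remedy mirrors the suggested fix (upgrade to lodash 4.17.5 or greater): merge into a fresh copy, or refuse to write through keys that reach shared or special objects.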
5,796
5,961,747,030
IssuesEvent
2017-05-29 18:52:05
emberjs/guides
https://api.github.com/repos/emberjs/guides
closed
Remove font smoothing
help wanted infrastructure
Currently, our styles use font smoothing, which is [non-standard and should not be used](https://developer.mozilla.org/en-US/docs/Web/CSS/font-smooth). This should be removed, and the styles updated so that the font still looks good. The relevant properties are `-webkit-font-smoothing` and `-moz-osx-font-smoothing`, and they are in `_highlight.scss`, `_typography.scss`, and `_buttons.scss`.
1.0
Remove font smoothing - Currently, our styles use font smoothing, which is [non-standard and should not be used](https://developer.mozilla.org/en-US/docs/Web/CSS/font-smooth). This should be removed, and the styles updated so that the font still looks good. The relevant properties are `-webkit-font-smoothing` and `-moz-osx-font-smoothing`, and they are in `_highlight.scss`, `_typography.scss`, and `_buttons.scss`.
non_test
remove font smoothing currently our styles use font smoothing which is this should be removed and the styles updated so that the font still looks good the relevant properties are webkit font smoothing and moz osx font smoothing and they are in highlight scss typography scss and buttons scss
0
654
2,536,503,829
IssuesEvent
2015-01-26 14:34:44
ramda/ramda
https://api.github.com/repos/ramda/ramda
closed
documentation: indentation not preserved
documentation duplicate
[Input](https://github.com/ramda/ramda/blob/v0.9.0/src/call.js): ```javascript var indentN = R.pipe(R.times(R.always(' ')), R.join(''), R.replace(/^(?!$)/gm)); var format = R.converge(R.call, R.pipe(R.prop('indent'), indentN), R.prop('value')); format({indent: 2, value: 'foo\nbar\nbaz\n'}); //=> ' foo\n bar\n baz\n' ``` [Output](http://ramdajs.com/docs/#call): ```javascript var indentN = R.pipe(R.times(R.always(' ')), R.join(''), R.replace(/^(?!$)/gm)); var format = R.converge(R.call, R.pipe(R.prop('indent'), indentN), R.prop('value')); format({indent: 2, value: 'foo\nbar\nbaz\n'}); //=> ' foo\n bar\n baz\n' ```
1.0
documentation: indentation not preserved - [Input](https://github.com/ramda/ramda/blob/v0.9.0/src/call.js): ```javascript var indentN = R.pipe(R.times(R.always(' ')), R.join(''), R.replace(/^(?!$)/gm)); var format = R.converge(R.call, R.pipe(R.prop('indent'), indentN), R.prop('value')); format({indent: 2, value: 'foo\nbar\nbaz\n'}); //=> ' foo\n bar\n baz\n' ``` [Output](http://ramdajs.com/docs/#call): ```javascript var indentN = R.pipe(R.times(R.always(' ')), R.join(''), R.replace(/^(?!$)/gm)); var format = R.converge(R.call, R.pipe(R.prop('indent'), indentN), R.prop('value')); format({indent: 2, value: 'foo\nbar\nbaz\n'}); //=> ' foo\n bar\n baz\n' ```
non_test
documentation indentation not preserved javascript var indentn r pipe r times r always r join r replace gm var format r converge r call r pipe r prop indent indentn r prop value format indent value foo nbar nbaz n foo n bar n baz n javascript var indentn r pipe r times r always r join r replace gm var format r converge r call r pipe r prop indent indentn r prop value format indent value foo nbar nbaz n foo n bar n baz n
0
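For reference, the behaviour the Ramda example in the record above implements, prefixing every non-empty line with N spaces, has a close Python analogue in the standard library; `textwrap.indent` skips whitespace-only lines by default, matching the `/^(?!$)/gm` anchor:

```python
import textwrap

def indent_n(n):
    # Analogue of the record's indentN: prefix every non-blank line with n spaces.
    return lambda s: textwrap.indent(s, " " * n)

print(indent_n(2)("foo\nbar\nbaz\n"))  # '  foo\n  bar\n  baz\n'
```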
36,091
5,030,620,771
IssuesEvent
2016-12-16 01:43:54
SEED-platform/seed
https://api.github.com/repos/SEED-platform/seed
closed
Mapping: Show both Tax Lot and Property fields in Mapping Review screen
Mapping P-1 seedDEMO - 3 - Tested / Passed V2 Release
Should I be able to see both Tax Lot and Property fields in the Mapping Review screen? Maybe I am not understanding how this works, but I don't see fields from both tables. Here is how I tested it (maybe I am not doing the test correctly). on seeddemostaging with latest develop as of 10/17/2016 at 8:59 am Steps to reproduce - Import sample tax lot file (link from #1099): https://drive.google.com/open?id=0B3fTKpZ9Dx7LZkpJcG4taGtsaFk - Map it as follows -- this time I am mapping all the fields to Tax Lot except the last one, which I am mapping to Property ![image](https://cloud.githubusercontent.com/assets/6314950/19461704/b6ac9b70-949b-11e6-9335-6c4ccf52a608.png) - Click the Map Your Data button - In the Mapping Review screen, I only see a tab for the Tax Lot fields -- how do I see the fields for the Property table? Should there be another tab next to View by Tax Lot that is View by Property, with the one field, "Number of Buildings" shown there? - Also, all the fields from the Tax Lot table are being displayed even though most of them were not in the imported file and so were not mapped -- the program should only be displaying the fields that were mapped. ![image](https://cloud.githubusercontent.com/assets/6314950/19461769/637d2d4c-949c-11e6-8b5e-1a8be6f4b787.png)
1.0
Mapping: Show both Tax Lot and Property fields in Mapping Review screen - Should I be able to see both Tax Lot and Property fields in the Mapping Review screen? Maybe I am not understanding how this works, but I don't see fields from both tables. Here is how I tested it (maybe I am not doing the test correctly). on seeddemostaging with latest develop as of 10/17/2016 at 8:59 am Steps to reproduce - Import sample tax lot file (link from #1099): https://drive.google.com/open?id=0B3fTKpZ9Dx7LZkpJcG4taGtsaFk - Map it as follows -- this time I am mapping all the fields to Tax Lot except the last one, which I am mapping to Property ![image](https://cloud.githubusercontent.com/assets/6314950/19461704/b6ac9b70-949b-11e6-9335-6c4ccf52a608.png) - Click the Map Your Data button - In the Mapping Review screen, I only see a tab for the Tax Lot fields -- how do I see the fields for the Property table? Should there be another tab next to View by Tax Lot that is View by Property, with the one field, "Number of Buildings" shown there? - Also, all the fields from the Tax Lot table are being displayed even though most of them were not in the imported file and so were not mapped -- the program should only be displaying the fields that were mapped. ![image](https://cloud.githubusercontent.com/assets/6314950/19461769/637d2d4c-949c-11e6-8b5e-1a8be6f4b787.png)
test
mapping show both tax lot and property fields in mapping review screen should i be able to see both tax lot and property fields in the mapping review screen maybe i am not understanding how this works but i don t see fields from both tables here is how i tested it maybe i am not doing the test correctly on seeddemostaging with latest develop as of at am steps to reproduce import sample tax lot file link from map it as follows this time i am mapping all the fields to tax lot except the last one which i am mapping to property click the map your data button in the mapping review screen i only see a tab for the tax lot fields how do i see the fields for the property table should there be another tab next to view by tax lot that is view by property with the one field number of buildings shown there also all the fields from the tax lot table are being displayed even though most of them were not in the imported file and so were not mapped the program should only be displaying the fields that were mapped
1
123,830
16,541,667,845
IssuesEvent
2021-05-27 17:37:12
matomo-org/matomo
https://api.github.com/repos/matomo-org/matomo
closed
Sticky "Maximize" mouse title
Bug c: Design / UI
Move the mouse cursor on the dashboard over the maximize icon. Wait till the "Maximize" mouse title appears. Than click on the icon. => The widget opens in a maximized mode But the black "Maximize" mouse title sicks to the cursor and will not disappear. <img width="574" alt="grafik" src="https://user-images.githubusercontent.com/1645099/119682405-27844c00-be43-11eb-9407-8580bab85537.png"> <img width="517" alt="grafik" src="https://user-images.githubusercontent.com/1645099/119682852-81851180-be43-11eb-9365-1ea5a09e09da.png"> ## Expected Behavior mouse title disappears after leaving the icon, also when the widget maximized ## Current Behavior mouse title does NOT disappear ## Steps to Reproduce (for Bugs) see above ## Your Environment * Matomo Version: 4.3.1 * Browser: Firefox 88
1.0
Sticky "Maximize" mouse title - Move the mouse cursor on the dashboard over the maximize icon. Wait till the "Maximize" mouse title appears. Than click on the icon. => The widget opens in a maximized mode But the black "Maximize" mouse title sicks to the cursor and will not disappear. <img width="574" alt="grafik" src="https://user-images.githubusercontent.com/1645099/119682405-27844c00-be43-11eb-9407-8580bab85537.png"> <img width="517" alt="grafik" src="https://user-images.githubusercontent.com/1645099/119682852-81851180-be43-11eb-9365-1ea5a09e09da.png"> ## Expected Behavior mouse title disappears after leaving the icon, also when the widget maximized ## Current Behavior mouse title does NOT disappear ## Steps to Reproduce (for Bugs) see above ## Your Environment * Matomo Version: 4.3.1 * Browser: Firefox 88
non_test
sticky maximize mouse title move the mouse cursor on the dashboard over the maximize icon wait till the maximize mouse title appears then click on the icon the widget opens in a maximized mode but the black maximize mouse title sticks to the cursor and will not disappear img width alt grafik src img width alt grafik src expected behavior mouse title disappears after leaving the icon also when the widget maximized current behavior mouse title does not disappear steps to reproduce for bugs see above your environment matomo version browser firefox
0
336,772
30,221,035,307
IssuesEvent
2023-07-05 19:25:26
RamenDR/ramen
https://api.github.com/repos/RamenDR/ramen
opened
Test basic-test with openshift clusters
test
We want to be able to run ramen basic-test with openshift clusters for testing upstream ramen changes with openshift. Depends on #959. ## Tasks 1. [ ] Get openshift cluster kubeconfigs for hub, cluster1, cluster2 1. [ ] Import openshift kubeconfigs to local kubeconfig (~/.kube/config) ```sh KUBECONFIG=hub.kubeconfig:cluster1.kubeconfig:cluster2.kubeconfig:~/.kube/config \ kubectl config view --flatten > ~/.kube/config ``` 1. [ ] Create environment file for the external clusters ```yaml # ocp.yaml --- ramen: hub: hub clusters: [cluster1, cluster2] topology: regional-dr profiles: - name: hub external: true - name: cluster1 external: true - name: cluster2 external: true ``` 1. [ ] Run basic test ``` test/basic-test/run ocp.yaml ``` 1. [ ] Automate steps 2, 3 so we don't have to do this manually. It will be useful to have a tool for importing configs and creating an environment file for the configs: ```sh drenv generate --hub hub.kubeconfig --cluster cluster1.kubeconfig --cluster cluster2.kubeconfig ``` Open another issue if needed if the test does not work with openshift clusters.
1.0
Test basic-test with openshift clusters - We want to be able to run ramen basic-test with openshift clusters for testing upstream ramen changes with openshift. Depends on #959. ## Tasks 1. [ ] Get openshift cluster kubeconfigs for hub, cluster1, cluster2 1. [ ] Import openshift kubeconfigs to local kubeconfig (~/.kube/config) ```sh KUBECONFIG=hub.kubeconfig:cluster1.kubeconfig:cluster2.kubeconfig:~/.kube/config \ kubectl config view --flatten > ~/.kube/config ``` 1. [ ] Create environment file for the external clusters ```yaml # ocp.yaml --- ramen: hub: hub clusters: [cluster1, cluster2] topology: regional-dr profiles: - name: hub external: true - name: cluster1 external: true - name: cluster2 external: true ``` 1. [ ] Run basic test ``` test/basic-test/run ocp.yaml ``` 1. [ ] Automate steps 2, 3 so we don't have to do this manually. It will be useful to have a tool for importing configs and creating an environment file for the configs: ```sh drenv generate --hub hub.kubeconfig --cluster cluster1.kubeconfig --cluster cluster2.kubeconfig ``` Open another issue if needed if the test does not work with openshift clusters.
test
test basic test with openshift clusters we want to be able to run ramen basic test with openshift clusters for testing upstream ramen changes with openshift depends on tasks get openshift cluster kubeconfigs for hub import openshift kubeconfigs to local kubeconfig kube config sh kubeconfig hub kubeconfig kubeconfig kubeconfig kube config kubectl config view flatten kube config create environment file for the external clusters yaml ocp yaml ramen hub hub clusters topology regional dr profiles name hub external true name external true name external true run basic test test basic test run ocp yaml automate steps so we don t have to do this manually it will be useful to have a tool for importing configs and creating environment file for the configs sh drenv generate hub hub kubeconfig cluster kubeconfig cluster kubeconfig open other issue if needed if the test does not work with openshift clusters
1
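The `drenv generate` automation proposed in the RamenDR issue above could be sketched as follows. This is a hypothetical sketch, not the real drenv CLI: it only shows how the environment file from the issue's `ocp.yaml` example might be derived from kubeconfig file names (deriving cluster names from file names is an assumption).

```python
# Hypothetical sketch of the proposed "drenv generate" step: build the drenv
# environment file for external clusters from a hub kubeconfig and one or more
# managed-cluster kubeconfigs. The emitted layout mirrors the ocp.yaml example
# in the issue ("external: true" profiles, regional-dr topology).
import os

def generate_env(hub_kubeconfig, cluster_kubeconfigs, topology="regional-dr"):
    """Return the YAML text of an environment file for external clusters."""
    # Assumption: cluster names come from the kubeconfig file names,
    # e.g. "cluster1.kubeconfig" -> "cluster1".
    hub = os.path.basename(hub_kubeconfig).removesuffix(".kubeconfig")
    clusters = [os.path.basename(c).removesuffix(".kubeconfig")
                for c in cluster_kubeconfigs]
    lines = [
        "---",
        "ramen:",
        f"  hub: {hub}",
        f"  clusters: [{', '.join(clusters)}]",
        f"  topology: {topology}",
        "profiles:",
    ]
    for name in [hub] + clusters:
        lines += [f"- name: {name}", "  external: true"]
    return "\n".join(lines) + "\n"

print(generate_env("hub.kubeconfig",
                   ["cluster1.kubeconfig", "cluster2.kubeconfig"]))
```

A real implementation would also merge the kubeconfigs into `~/.kube/config` (step 2 of the issue, via `kubectl config view --flatten`); that part is omitted here since it needs a live `kubectl`.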