Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 4 112 | repo_url stringlengths 33 141 | action stringclasses 3 values | title stringlengths 1 1.02k | labels stringlengths 4 1.54k | body stringlengths 1 262k | index stringclasses 17 values | text_combine stringlengths 95 262k | label stringclasses 2 values | text stringlengths 96 252k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
254,711 | 21,872,820,049 | IssuesEvent | 2022-05-19 07:27:53 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | kv/kvserver: TestReplicaDrainLease failed | C-test-failure O-robot branch-master | kv/kvserver.TestReplicaDrainLease [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=5203810&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=5203810&tab=artifacts#/) on master @ [ce66acdbba801de88f1dd645eaedeb3834f23dbd](https://github.com/cockroachdb/cockroach/commits/ce66acdbba801de88f1dd645eaedeb3834f23dbd):
```
=== RUN TestReplicaDrainLease
test_log_scope.go:79: test logs captured to: /artifacts/tmp/_tmp/751d67000aac5f3394c2369309253f02/logTestReplicaDrainLease1320269877
test_log_scope.go:80: use -show-logs to present logs inline
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
Parameters in this failure:
- TAGS=bazel,gss
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestReplicaDrainLease.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 1.0 | kv/kvserver: TestReplicaDrainLease failed - kv/kvserver.TestReplicaDrainLease [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=5203810&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=5203810&tab=artifacts#/) on master @ [ce66acdbba801de88f1dd645eaedeb3834f23dbd](https://github.com/cockroachdb/cockroach/commits/ce66acdbba801de88f1dd645eaedeb3834f23dbd):
```
=== RUN TestReplicaDrainLease
test_log_scope.go:79: test logs captured to: /artifacts/tmp/_tmp/751d67000aac5f3394c2369309253f02/logTestReplicaDrainLease1320269877
test_log_scope.go:80: use -show-logs to present logs inline
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
Parameters in this failure:
- TAGS=bazel,gss
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestReplicaDrainLease.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | kv kvserver testreplicadrainlease failed kv kvserver testreplicadrainlease with on master run testreplicadrainlease test log scope go test logs captured to artifacts tmp tmp test log scope go use show logs to present logs inline help see also parameters in this failure tags bazel gss cc cockroachdb kv | 1 |
56,804 | 6,529,074,952 | IssuesEvent | 2017-08-30 10:03:34 | Interaktivtechnology/Raimon-Web | https://api.github.com/repos/Interaktivtechnology/Raimon-Web | closed | Query Color Issue | Monitor Need Testing | @benjaminl83 @purnomoeko
The color changes incorrectly when more than one filter is involved.
Filter colors were set and displayed as follows:
Filter 1 set = yellow, result = green
Filter 2 set = blue, result = blue
Filter 3 set = green, result = yellow
https://raimon-land-sales-matrix.firebaseapp.com/query/edit/-KsHKKChvcYJn_pA9ibI/-KsHVQXsJEOsIG9d9saU

| 1.0 | Query Color Issue - @benjaminl83 @purnomoeko
The color changes incorrectly when more than one filter is involved.
Filter colors were set and displayed as follows:
Filter 1 set = yellow, result = green
Filter 2 set = blue, result = blue
Filter 3 set = green, result = yellow
https://raimon-land-sales-matrix.firebaseapp.com/query/edit/-KsHKKChvcYJn_pA9ibI/-KsHVQXsJEOsIG9d9saU

| test | query color issue purnomoeko the color changes incorrectly when there are more than filter involved filter color were set and display as follows filter set yellow result green filter set blue result blue filter set green result yellow | 1 |
130,085 | 18,017,867,271 | IssuesEvent | 2021-09-16 15:42:01 | cloudfour/cloudfour.com-patterns | https://api.github.com/repos/cloudfour/cloudfour.com-patterns | opened | Add more fallback images | 🎨 design size:? | @AriannaChau designed some great fallback images for posts without a feature image: https://v-next--cloudfour-patterns.netlify.app/?path=/docs/design-illustrations--feature-images
But there's still quite a bit of repetition once applied to our actual articles.
@Paul-Hebert recently worked on some updated "what we do" illustrations that could really come in handy for some additional imagery.
Some potential ideas (just a starting point, not prescriptive instructions):
- Design systems
- Accessibility
- Maintenance
- Process
- Code variations (brackets?)
- Images
- Typography
- Animation
(This issue is purposely a bit loose: We should consider it closed once we've increased the variety by some measure. We can always add more later.) | 1.0 | Add more fallback images - @AriannaChau designed some great fallback images for posts without a feature image: https://v-next--cloudfour-patterns.netlify.app/?path=/docs/design-illustrations--feature-images
But there's still quite a bit of repetition once applied to our actual articles.
@Paul-Hebert recently worked on some updated "what we do" illustrations that could really come in handy for some additional imagery.
Some potential ideas (just a starting point, not prescriptive instructions):
- Design systems
- Accessibility
- Maintenance
- Process
- Code variations (brackets?)
- Images
- Typography
- Animation
(This issue is purposely a bit loose: We should consider it closed once we've increased the variety by some measure. We can always add more later.) | non_test | add more fallback images ariannachau designed some great fallback images for posts without a feature image but there s still quite a bit of repetition once applied to our actual articles paul hebert recently worked on some updated what we do illustrations that could really come in handy for some additional imagery some potential ideas just a starting point not prescriptive instructions design systems accessibility maintenance process code variations brackets images typography animation this issue is purposely a bit loose we should consider it closed once we ve increased the variety by some measure we can always add more later | 0 |
148,977 | 11,873,734,899 | IssuesEvent | 2020-03-26 17:46:42 | kubernetes/minikube | https://api.github.com/repos/kubernetes/minikube | closed | TestAddons/parallel/Registry failing on none: wget: bad address 'registry.kube-system.svc.cluster.local' | area/testing kind/failing-test lifecycle/stale priority/important-longterm | The registry addon hasn't changed in months, but start timings have:
```
--- FAIL: TestAddons (83.61s)
addons_test.go:46: (dbg) Run: out/minikube-linux-amd64 start -p minikube --wait=false --memory=2600 --alsologtostderr -v=1 --addons=ingress --addons=registry --vm-driver=none
addons_test.go:46: (dbg) Done: out/minikube-linux-amd64 start -p minikube --wait=false --memory=2600 --alsologtostderr -v=1 --addons=ingress --addons=registry --vm-driver=none : (22.104702021s)
--- FAIL: TestAddons/parallel (45.08s)
--- FAIL: TestAddons/parallel/Registry (45.08s)
addons_test.go:152: registry stabilized in 6.901098124s
addons_test.go:154: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers.go:268: "registry-4r8zx" [4c39e360-fb60-48bd-a7c6-a6a164a0d6de] Pending
helpers.go:268: "registry-4r8zx" [4c39e360-fb60-48bd-a7c6-a6a164a0d6de] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
helpers.go:268: "registry-4r8zx" [4c39e360-fb60-48bd-a7c6-a6a164a0d6de] Running
addons_test.go:154: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 10.012248147s
addons_test.go:157: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers.go:268: "registry-proxy-vpgmp" [75c22d11-d7a1-4360-97f8-2167673f24c7] Running
addons_test.go:157: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.005989006s
addons_test.go:162: (dbg) Run: kubectl --context minikube delete po -l run=registry-test --now
addons_test.go:167: (dbg) Run: kubectl --context minikube run --rm registry-test --restart=Never --image=busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:167: (dbg) Non-zero exit: kubectl --context minikube run --rm registry-test --restart=Never --image=busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": exit status 1 (3.613896419s)
-- stdout --
wget: bad address 'registry.kube-system.svc.cluster.local'
pod "registry-test" deleted
-- /stdout --
** stderr **
pod default/registry-test terminated (Error)
** /stderr **
addons_test.go:169: [kubectl --context minikube run --rm registry-test --restart=Never --image=busybox -it -- sh -c wget --spider -S http://registry.kube-system.svc.cluster.local] failed: exit status 1
addons_test.go:173: curl = "wget: bad address 'registry.kube-system.svc.cluster.local'\r\npod \"registry-test\" deleted\n", want *HTTP/1.1 200*
addons_test.go:177: (dbg) Run: out/minikube-linux-amd64 -p minikube ip
addons_test.go:206: (dbg) Run: out/minikube-linux-amd64 -p minikube addons disable registry --alsologtostderr -v=1
```
Based on this output:
```
addons_test.go:82: (dbg) kubectl --context minikube get po -A --show-labels:
NAMESPACE NAME READY STATUS RESTARTS AGE LABELS
kube-system coredns-5644d7b6d9-jsczk 1/1 Running 0 45s k8s-app=kube-dns,pod-template-hash=5644d7b6d9
kube-system coredns-5644d7b6d9-z2zgg 1/1 Running 0 45s k8s-app=kube-dns,pod-template-hash=5644d7b6d9
kube-system kube-proxy-5dbf7 1/1 Running 0 45s controller-revision-hash=56ffd4ff47,k8s-app=kube-proxy,pod-template-generation=1
kube-system nginx-ingress-controller-6fc5bcc8c9-w9dbw 1/1 Running 0 43s addonmanager.kubernetes.io/mode=Reconcile,app.kubernetes.io/name=nginx-ingress-controller,app.kubernetes.io/part-of=kube-system,pod-template-hash=6fc5bcc8c9
kube-system registry-4r8zx 1/1 Terminating 0 44s actual-registry=true,addonmanager.kubernetes.io/mode=Reconcile,kubernetes.io/minikube-addons=registry
kube-system registry-proxy-vpgmp 1/1 Terminating 0 44s addonmanager.kubernetes.io/mode=Reconcile,controller-revision-hash=675799b8c9,kubernetes.io/minikube-addons=registry,pod-template-generation=1,registry-proxy=true
kube-system storage-provisioner 1/1 Running 0 44s addonmanager.kubernetes.io/mode=Reconcile,integration-test=storage-provisioner
addons_test.go:82: (dbg) Run: kubectl --context minikube describe node
```
and this output:
```
Non-terminated Pods: (7 in total)
Namespace Name CPU Requests CPU Limits Memory Requests Memory Limits AGE
--------- ---- ------------ ---------- --------------- ------------- ---
kube-system coredns-5644d7b6d9-jsczk 100m (1%) 0 (0%) 70Mi (0%) 170Mi (0%) 45s
kube-system coredns-5644d7b6d9-z2zgg 100m (1%) 0 (0%) 70Mi (0%) 170Mi (0%) 45s
kube-system kube-proxy-5dbf7 0 (0%) 0 (0%) 0 (0%) 0 (0%) 45s
kube-system nginx-ingress-controller-6fc5bcc8c9-w9dbw 0 (0%) 0 (0%) 0 (0%) 0 (0%) 43s
kube-system registry-4r8zx 0 (0%) 0 (0%) 0 (0%) 0 (0%) 44s
kube-system registry-proxy-vpgmp 0 (0%) 0 (0%) 0 (0%) 0 (0%) 44s
kube-system storage-provisioner 0 (0%) 0 (0%) 0 (0%) 0 (0%) 44s
```
My theory is that maybe the registry came in from a previous test, and was being restarted? Are we not deleting Kubernetes pods when we shut down the none driver?
| 2.0 | TestAddons/parallel/Registry failing on none: wget: bad address 'registry.kube-system.svc.cluster.local' - The registry addon hasn't changed in months, but start timings have:
```
--- FAIL: TestAddons (83.61s)
addons_test.go:46: (dbg) Run: out/minikube-linux-amd64 start -p minikube --wait=false --memory=2600 --alsologtostderr -v=1 --addons=ingress --addons=registry --vm-driver=none
addons_test.go:46: (dbg) Done: out/minikube-linux-amd64 start -p minikube --wait=false --memory=2600 --alsologtostderr -v=1 --addons=ingress --addons=registry --vm-driver=none : (22.104702021s)
--- FAIL: TestAddons/parallel (45.08s)
--- FAIL: TestAddons/parallel/Registry (45.08s)
addons_test.go:152: registry stabilized in 6.901098124s
addons_test.go:154: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers.go:268: "registry-4r8zx" [4c39e360-fb60-48bd-a7c6-a6a164a0d6de] Pending
helpers.go:268: "registry-4r8zx" [4c39e360-fb60-48bd-a7c6-a6a164a0d6de] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
helpers.go:268: "registry-4r8zx" [4c39e360-fb60-48bd-a7c6-a6a164a0d6de] Running
addons_test.go:154: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 10.012248147s
addons_test.go:157: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers.go:268: "registry-proxy-vpgmp" [75c22d11-d7a1-4360-97f8-2167673f24c7] Running
addons_test.go:157: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.005989006s
addons_test.go:162: (dbg) Run: kubectl --context minikube delete po -l run=registry-test --now
addons_test.go:167: (dbg) Run: kubectl --context minikube run --rm registry-test --restart=Never --image=busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:167: (dbg) Non-zero exit: kubectl --context minikube run --rm registry-test --restart=Never --image=busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": exit status 1 (3.613896419s)
-- stdout --
wget: bad address 'registry.kube-system.svc.cluster.local'
pod "registry-test" deleted
-- /stdout --
** stderr **
pod default/registry-test terminated (Error)
** /stderr **
addons_test.go:169: [kubectl --context minikube run --rm registry-test --restart=Never --image=busybox -it -- sh -c wget --spider -S http://registry.kube-system.svc.cluster.local] failed: exit status 1
addons_test.go:173: curl = "wget: bad address 'registry.kube-system.svc.cluster.local'\r\npod \"registry-test\" deleted\n", want *HTTP/1.1 200*
addons_test.go:177: (dbg) Run: out/minikube-linux-amd64 -p minikube ip
addons_test.go:206: (dbg) Run: out/minikube-linux-amd64 -p minikube addons disable registry --alsologtostderr -v=1
```
Based on this output:
```
addons_test.go:82: (dbg) kubectl --context minikube get po -A --show-labels:
NAMESPACE NAME READY STATUS RESTARTS AGE LABELS
kube-system coredns-5644d7b6d9-jsczk 1/1 Running 0 45s k8s-app=kube-dns,pod-template-hash=5644d7b6d9
kube-system coredns-5644d7b6d9-z2zgg 1/1 Running 0 45s k8s-app=kube-dns,pod-template-hash=5644d7b6d9
kube-system kube-proxy-5dbf7 1/1 Running 0 45s controller-revision-hash=56ffd4ff47,k8s-app=kube-proxy,pod-template-generation=1
kube-system nginx-ingress-controller-6fc5bcc8c9-w9dbw 1/1 Running 0 43s addonmanager.kubernetes.io/mode=Reconcile,app.kubernetes.io/name=nginx-ingress-controller,app.kubernetes.io/part-of=kube-system,pod-template-hash=6fc5bcc8c9
kube-system registry-4r8zx 1/1 Terminating 0 44s actual-registry=true,addonmanager.kubernetes.io/mode=Reconcile,kubernetes.io/minikube-addons=registry
kube-system registry-proxy-vpgmp 1/1 Terminating 0 44s addonmanager.kubernetes.io/mode=Reconcile,controller-revision-hash=675799b8c9,kubernetes.io/minikube-addons=registry,pod-template-generation=1,registry-proxy=true
kube-system storage-provisioner 1/1 Running 0 44s addonmanager.kubernetes.io/mode=Reconcile,integration-test=storage-provisioner
addons_test.go:82: (dbg) Run: kubectl --context minikube describe node
```
and this output:
```
Non-terminated Pods: (7 in total)
Namespace Name CPU Requests CPU Limits Memory Requests Memory Limits AGE
--------- ---- ------------ ---------- --------------- ------------- ---
kube-system coredns-5644d7b6d9-jsczk 100m (1%) 0 (0%) 70Mi (0%) 170Mi (0%) 45s
kube-system coredns-5644d7b6d9-z2zgg 100m (1%) 0 (0%) 70Mi (0%) 170Mi (0%) 45s
kube-system kube-proxy-5dbf7 0 (0%) 0 (0%) 0 (0%) 0 (0%) 45s
kube-system nginx-ingress-controller-6fc5bcc8c9-w9dbw 0 (0%) 0 (0%) 0 (0%) 0 (0%) 43s
kube-system registry-4r8zx 0 (0%) 0 (0%) 0 (0%) 0 (0%) 44s
kube-system registry-proxy-vpgmp 0 (0%) 0 (0%) 0 (0%) 0 (0%) 44s
kube-system storage-provisioner 0 (0%) 0 (0%) 0 (0%) 0 (0%) 44s
```
My theory is that maybe the registry came in from a previous test, and was being restarted? Are we not deleting Kubernetes pods when we shut down the none driver?
| test | testaddons parallel registry failing on none wget bad address registry kube system svc cluster local the registry addon hasn t changed in months but start timings have fail testaddons addons test go dbg run out minikube linux start p minikube wait false memory alsologtostderr v addons ingress addons registry vm driver none addons test go dbg done out minikube linux start p minikube wait false memory alsologtostderr v addons ingress addons registry vm driver none fail testaddons parallel fail testaddons parallel registry addons test go registry stabilized in addons test go dbg testaddons parallel registry waiting for pods matching actual registry true in namespace kube system helpers go registry pending helpers go registry pending ready containersnotready containers with unready status containersready containersnotready containers with unready status helpers go registry running addons test go dbg testaddons parallel registry actual registry true healthy within addons test go dbg testaddons parallel registry waiting for pods matching registry proxy true in namespace kube system helpers go registry proxy vpgmp running addons test go dbg testaddons parallel registry registry proxy true healthy within addons test go dbg run kubectl context minikube delete po l run registry test now addons test go dbg run kubectl context minikube run rm registry test restart never image busybox it sh c wget spider s addons test go dbg non zero exit kubectl context minikube run rm registry test restart never image busybox it sh c wget spider s exit status stdout wget bad address registry kube system svc cluster local pod registry test deleted stdout stderr pod default registry test terminated error stderr addons test go failed exit status addons test go curl wget bad address registry kube system svc cluster local r npod registry test deleted n want http addons test go dbg run out minikube linux p minikube ip addons test go dbg run out minikube linux p minikube addons disable 
registry alsologtostderr v based on this output addons test go dbg kubectl context minikube get po a show labels namespace name ready status restarts age labels kube system coredns jsczk running app kube dns pod template hash kube system coredns running app kube dns pod template hash kube system kube proxy running controller revision hash app kube proxy pod template generation kube system nginx ingress controller running addonmanager kubernetes io mode reconcile app kubernetes io name nginx ingress controller app kubernetes io part of kube system pod template hash kube system registry terminating actual registry true addonmanager kubernetes io mode reconcile kubernetes io minikube addons registry kube system registry proxy vpgmp terminating addonmanager kubernetes io mode reconcile controller revision hash kubernetes io minikube addons registry pod template generation registry proxy true kube system storage provisioner running addonmanager kubernetes io mode reconcile integration test storage provisioner addons test go dbg run kubectl context minikube describe node and this output non terminated pods in total namespace name cpu requests cpu limits memory requests memory limits age kube system coredns jsczk kube system coredns kube system kube proxy kube system nginx ingress controller kube system registry kube system registry proxy vpgmp kube system storage provisioner my theroy is that maybe the registry came in from a previous test and was being restarted are we not deleting kubernetes pods when we shut down the none driver | 1 |
679,321 | 23,228,629,195 | IssuesEvent | 2022-08-03 04:44:42 | chloebrett/mlvet | https://api.github.com/repos/chloebrett/mlvet | closed | Multiple copy breaks undo | high priority | Subissue of #93
Copying words all over the place breaks undo. This might be linked to the known issue of indexes not working - but in theory this should be deterministic even with that bug. Below is the result of copying a bunch of times, undoing and then redoing, then undoing until there is no more left to undo. As you can see this doesn't return us to the original transcript, which should always be the result of undoing to the top of the stack. In fact in the state shown below, if I redo a bunch of times and then undo some more, _even more_ copies of "financial business" appear.
<img width="485" alt="Screen Shot 2022-05-22 at 11 49 14 am" src="https://user-images.githubusercontent.com/6735055/169674900-3da4d95f-7351-47bb-8b9c-7bf5f98668e4.png">
| 1.0 | Multiple copy breaks undo - Subissue of #93
Copying words all over the place breaks undo. This might be linked to the known issue of indexes not working - but in theory this should be deterministic even with that bug. Below is the result of copying a bunch of times, undoing and then redoing, then undoing until there is no more left to undo. As you can see this doesn't return us to the original transcript, which should always be the result of undoing to the top of the stack. In fact in the state shown below, if I redo a bunch of times and then undo some more, _even more_ copies of "financial business" appear.
<img width="485" alt="Screen Shot 2022-05-22 at 11 49 14 am" src="https://user-images.githubusercontent.com/6735055/169674900-3da4d95f-7351-47bb-8b9c-7bf5f98668e4.png">
| non_test | multiple copy breaks undo subissue of copying words all over the place breaks undo this might be linked to the known issue of indexes not working but in theory this should be deterministic even with that bug below is the result of copying a bunch of times undoing and then redoing then undoing until there is no more left to undo as you can see this doesn t return us to the original transcript which should always be the result of undoing to the top of the stack in fact in the state shown below if i redo a bunch of times and then undo some more even more copies of financial business appear img width alt screen shot at am src | 0 |
116,564 | 24,942,511,277 | IssuesEvent | 2022-10-31 20:14:43 | llvm/llvm-project | https://api.github.com/repos/llvm/llvm-project | closed | Assertion failure in SelectionDAG | llvm:codegen crash | We're seeing an assertion failure in SelectionDAG when building Fuchsia
Failing Bot:
https://ci.chromium.org/ui/p/fuchsia/builders/ci/clang_toolchain.ci.core.arm64-release/b8799099192459783297/overview
Diagnostic output:
```console
[89930/276809](64) CXX obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o
FAILED: obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o
../../../recipe_cleanup/clangxe2ptd3r/bin/clang++ -MD -MF obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o.d -D_LIBCPP_DISABLE_VISIBILITY_ANNOTATIONS -D_LIBCPP_REMOVE_TRANSITIVE_INCLUDE...
clang++: llvm/lib/CodeGen/SelectionDAG/SelectionDAG.cpp:10274: void llvm::SelectionDAG::ReplaceAllUsesWith(SDNode *, SDNode *): Assertion `(!From->hasAnyUseOfValue(i) || From->getValueType(i) == To...
PLEASE submit a bug report to https://github.com/llvm/llvm-project/issues/ and include the crash backtrace, preprocessed source, and associated run script.
Stack dump:
0. Program arguments: ../../../recipe_cleanup/clangxe2ptd3r/bin/clang++ -MD -MF obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o.d -D_LIBCPP_DISABLE_VISIBILITY_ANNOTATIONS -D_LIBCPP_REM...
1. <eof> parser at end of file
2. Code generation
3. Running pass 'Function Pass Manager' on module '../../src/media/audio/lib/timeline/timeline_rate.cc'.
4. Running pass 'AArch64 Instruction Selection' on function '@_ZN5media12TimelineRate7ProductES0_S0_b'
#0 0x0000564835ef2144 llvm::sys::PrintStackTrace(llvm::raw_ostream&, int) (../../../recipe_cleanup/clangxe2ptd3r/bin/clang+++0x3fbb144)
clang++: error: clang frontend command failed with exit code 134 (use -v to see invocation)
Fuchsia clang version 16.0.0 (https://llvm.googlesource.com/llvm-project 7aa0968842c5c36595042257009d1dc1a244ff6a)
Target: aarch64-unknown-fuchsia
Thread model: posix
InstalledDir: ../../../recipe_cleanup/clangxe2ptd3r/bin
clang++: note: diagnostic msg:
********************
PLEASE ATTACH THE FOLLOWING FILES TO THE BUG REPORT:
Preprocessed source(s) and associated run script(s) are located at:
clang++: note: diagnostic msg: clang-crashreports/timeline_rate-37a32c.cpp
clang++: note: diagnostic msg: clang-crashreports/timeline_rate-37a32c.sh
clang++: note: diagnostic msg:
********************
```
Reproducers:
[clang-crashreports.zip](https://github.com/llvm/llvm-project/files/9890819/clang-crashreports.zip)
I've bisected this down to https://reviews.llvm.org/D127531 in https://reviews.llvm.org/rG2ec51f1c753a63c786805708eb7875086cc5da9d | 1.0 | Assertion failure in SelectionDAG - We're seeing an assertion failure in SelectionDAG when building Fuchsia
Failing Bot:
https://ci.chromium.org/ui/p/fuchsia/builders/ci/clang_toolchain.ci.core.arm64-release/b8799099192459783297/overview
Diagnostic output:
```console
[89930/276809](64) CXX obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o
FAILED: obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o
../../../recipe_cleanup/clangxe2ptd3r/bin/clang++ -MD -MF obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o.d -D_LIBCPP_DISABLE_VISIBILITY_ANNOTATIONS -D_LIBCPP_REMOVE_TRANSITIVE_INCLUDE...
clang++: llvm/lib/CodeGen/SelectionDAG/SelectionDAG.cpp:10274: void llvm::SelectionDAG::ReplaceAllUsesWith(SDNode *, SDNode *): Assertion `(!From->hasAnyUseOfValue(i) || From->getValueType(i) == To...
PLEASE submit a bug report to https://github.com/llvm/llvm-project/issues/ and include the crash backtrace, preprocessed source, and associated run script.
Stack dump:
0. Program arguments: ../../../recipe_cleanup/clangxe2ptd3r/bin/clang++ -MD -MF obj/src/media/audio/lib/timeline/timeline.timeline_rate.cc.o.d -D_LIBCPP_DISABLE_VISIBILITY_ANNOTATIONS -D_LIBCPP_REM...
1. <eof> parser at end of file
2. Code generation
3. Running pass 'Function Pass Manager' on module '../../src/media/audio/lib/timeline/timeline_rate.cc'.
4. Running pass 'AArch64 Instruction Selection' on function '@_ZN5media12TimelineRate7ProductES0_S0_b'
#0 0x0000564835ef2144 llvm::sys::PrintStackTrace(llvm::raw_ostream&, int) (../../../recipe_cleanup/clangxe2ptd3r/bin/clang+++0x3fbb144)
clang++: error: clang frontend command failed with exit code 134 (use -v to see invocation)
Fuchsia clang version 16.0.0 (https://llvm.googlesource.com/llvm-project 7aa0968842c5c36595042257009d1dc1a244ff6a)
Target: aarch64-unknown-fuchsia
Thread model: posix
InstalledDir: ../../../recipe_cleanup/clangxe2ptd3r/bin
clang++: note: diagnostic msg:
********************
PLEASE ATTACH THE FOLLOWING FILES TO THE BUG REPORT:
Preprocessed source(s) and associated run script(s) are located at:
clang++: note: diagnostic msg: clang-crashreports/timeline_rate-37a32c.cpp
clang++: note: diagnostic msg: clang-crashreports/timeline_rate-37a32c.sh
clang++: note: diagnostic msg:
********************
```
Reproducers:
[clang-crashreports.zip](https://github.com/llvm/llvm-project/files/9890819/clang-crashreports.zip)
I've bisected this down to https://reviews.llvm.org/D127531 in https://reviews.llvm.org/rG2ec51f1c753a63c786805708eb7875086cc5da9d | non_test | assertion failure in selectiondag we re seeing an assertion failure in selectiondag when building fuchsia failing bot diagnostic output console cxx obj src media audio lib timeline timeline timeline rate cc o failed obj src media audio lib timeline timeline timeline rate cc o recipe cleanup bin clang md mf obj src media audio lib timeline timeline timeline rate cc o d d libcpp disable visibility annotations d libcpp remove transitive include clang llvm lib codegen selectiondag selectiondag cpp void llvm selectiondag replacealluseswith sdnode sdnode assertion from hasanyuseofvalue i from getvaluetype i to please submit a bug report to and include the crash backtrace preprocessed source and associated run script stack dump program arguments recipe cleanup bin clang md mf obj src media audio lib timeline timeline timeline rate cc o d d libcpp disable visibility annotations d libcpp rem parser at end of file code generation running pass function pass manager on module src media audio lib timeline timeline rate cc running pass instruction selection on function b llvm sys printstacktrace llvm raw ostream int recipe cleanup bin clang clang error clang frontend command failed with exit code use v to see invocation fuchsia clang version target unknown fuchsia thread model posix installeddir recipe cleanup bin clang note diagnostic msg please attach the following files to the bug report preprocessed source s and associated run script s are located at clang note diagnostic msg clang crashreports timeline rate cpp clang note diagnostic msg clang crashreports timeline rate sh clang note diagnostic msg reproducers i ve bisected this down to in | 0 |
5,571 | 2,951,692,767 | IssuesEvent | 2015-07-07 01:44:54 | learningequality/ka-lite | https://api.github.com/repos/learningequality/ka-lite | closed | Version number in docs is confusing to users | documentation has PR | (From call with Team4Tech today)
Users were confused about why it said "KA Lite 2" in the docs, and weren't sure if they were on the right page:

It looks like, on the latest develop build of the docs, it's got: "KA Lite dev11 documentation".
Ideally, this should read "KA Lite 0.14 documentation" (or possibly the full "0.14.0", though I think we decided to have the docs just versioned by minor release). | 1.0 | Version number in docs is confusing to users - (From call with Team4Tech today)
Users were confused about why it said "KA Lite 2" in the docs, and weren't sure if they were on the right page:

It looks like, on the latest develop build of the docs, it's got: "KA Lite dev11 documentation".
Ideally, this should read "KA Lite 0.14 documentation" (or possibly the full "0.14.0", though I think we decided to have the docs just versioned by minor release). | non_test | version number in docs is confusing to users from call with today users were confused about why it said ka lite in the docs and weren t sure if they were on the right page it looks like on the latest develop build of the docs it s got ka lite documentation ideally this should read ka lite documentation or possibly the full though i think we decided to have the docs just versioned by minor release | 0 |
111,974 | 24,219,901,123 | IssuesEvent | 2022-09-26 09:57:09 | backstage/backstage | https://api.github.com/repos/backstage/backstage | closed | [Documentation] Improve the TechDocs "getting started" flow for newbies | documentation enhancement help wanted docs-like-code stale | [From discord](https://discord.com/channels/687207715902193673/714754240933003266/855028935188545536):
> I'm trying out Backstage with two goals - first is to get TechDocs working for me, and second is to later leverage the service catalog features.
>
> To use TechDocs I had to read a lot of documentation to understand how it fits together; the new user experience is pretty rough right now :smile: One solution is a simple-as-possible walkthrough of an opinionated setup, through which the user picks up some key concepts.
---
### Possible improvements
- **Different guides for different goals**: We may want separate getting started guides, one for those who have already set up and understand the Software Catalog. ...Another for those who are coming at it from a "TechDocs is my main goal" perspective, which gently introduces Software Catalog and Scaffolder concepts in the process
- **Scaffolder Confusion**: We may want to remove (or improve) the "Documentation Template" / scaffolder portion of the getting started guide, as a "documentation only" component is kind of an edge-case and requires a functioning scaffolder and GitHub integration with write access. Instead, consider a "fork this existing, public repo" as a way to guide adopters through concepts like the annotation, `mkdocs.yml`, etc. | 1.0 | [Documentation] Improve the TechDocs "getting started" flow for newbies - [From discord](https://discord.com/channels/687207715902193673/714754240933003266/855028935188545536):
> I'm trying out Backstage with two goals - first is to get TechDocs working for me, and second is to later leverage the service catalog features.
>
> To use TechDocs I had to read a lot of documentation to understand how it fits together; the new user experience is pretty rough right now :smile: One solution is a simple-as-possible walkthrough of an opinionated setup, through which the user picks up some key concepts.
---
### Possible improvements
- **Different guides for different goals**: We may want separate getting started guides, one for those who have already set up and understand the Software Catalog. ...Another for those who are coming at it from a "TechDocs is my main goal" perspective, which gently introduces Software Catalog and Scaffolder concepts in the process
- **Scaffolder Confusion**: We may want to remove (or improve) the "Documentation Template" / scaffolder portion of the getting started guide, as a "documentation only" component is kind of an edge-case and requires a functioning scaffolder and GitHub integration with write access. Instead, consider a "fork this existing, public repo" as a way to guide adopters through concepts like the annotation, `mkdocs.yml`, etc. | non_test | improve the techdocs getting started flow for newbies i m trying out backstage with two goals first is to get techdocs working for me and second is to later leverage the service catalog features to use techdocs i had to read a lot of documentation to understand how it fits together the new user experience is pretty rough right now smile nbsp nbsp one solution is a simple as possible walkthrough of an opinionated setup through which the user picks up some key concepts possible improvements different guides for different goals we may want separate getting started guides one for those who have already set up and understand the software catalog another for those who are coming at it from a techdocs is my main goal perspective which gently introduces software catalog and scaffolder concepts in the process scaffolder confusion we may want to remove or improve the documentation template scaffolder portion of the getting started guide as a documentation only component is kind of an edge case and requires a functioning scaffolder and github integration with write access instead consider a fork this existing public repo as a way to guide adopters through concepts like the annotation mkdocs yml etc | 0 |
110,058 | 13,895,749,316 | IssuesEvent | 2020-10-19 16:14:12 | archesproject/arches | https://api.github.com/repos/archesproject/arches | reopened | Designer View map component no longer tracks map center long/lat | Audience: Administrator Audience: Developer Subject: Card Designer | **Describe the bug**
<!--- By fully explaining what you are encountering, you can help us understand and reproduce the issue. -->
<!--- Often times, a screenshot or animated GIF can help show what you are encountering. -->
The map component displayed in the Designer view no longer tracks panning/zooming map behavior.
**To Reproduce**
Steps to reproduce the behavior:
1. Navigate to the Designer view
2. Create a Resource Model with a node of datatype `geojson-feature-collection`.
3. Navigate to the Card subview
4. Pan && Zoom the map. Notice how the values in Map Center Longitude / Map Center Latitude / Default Zoom do not change
**Expected behavior**
The fields should update on user interaction
| 1.0 | Designer View map component no longer tracks map center long/lat - **Describe the bug**
<!--- By fully explaining what you are encountering, you can help us understand and reproduce the issue. -->
<!--- Often times, a screenshot or animated GIF can help show what you are encountering. -->
The map component displayed in the Designer view no longer tracks panning/zooming map behavior.
**To Reproduce**
Steps to reproduce the behavior:
1. Navigate to the Designer view
2. Create a Resource Model with a node of datatype `geojson-feature-collection`.
3. Navigate to the Card subview
4. Pan && Zoom the map. Notice how the values in Map Center Longitude / Map Center Latitude / Default Zoom do not change
**Expected behavior**
The fields should update on user interaction
| non_test | designer view map component no longer tracks map center long lat describe the bug the map component displayed in the designer view no longer tracks panning zooming map behavior to reproduce steps to reproduce the behavior navigate to the designer view create a resource model with a node of datatype geojson feature collection navigate to the card subview pan zoom the map notice how the values in map center longitude map center latitude default zoom do not change expected behavior the fields should update on user interaction | 0 |
305,743 | 9,376,464,798 | IssuesEvent | 2019-04-04 08:01:12 | vanilla-framework/vanilla-framework | https://api.github.com/repos/vanilla-framework/vanilla-framework | opened | Docs table of contents has no styling | Priority: High | The table of contents on the right side to the documentation has no styling.
Develop branch:

Live:

| 1.0 | Docs table of contents has no styling - The table of contents on the right side to the documentation has no styling.
Develop branch:

Live:

| non_test | docs table of contents has no styling the table of contents on the right side to the documentation has no styling develop branch live | 0 |
9,838 | 6,469,560,176 | IssuesEvent | 2017-08-17 06:26:24 | coreos/bugs | https://api.github.com/repos/coreos/bugs | opened | Add device-independent Ignition option for specifying partition start/size | area/usability component/ignition kind/enhancement team/tools | # Issue Report #
## Feature Request ##
### Environment ###
Bare metal
### Desired Feature ###
Provide a device-independent way to specify the `start` and `size` of a partition.
### Other Information ###
These values are currently expressed in device-specific logical sectors, which are 4 KiB on 4K native drives and 512 bytes otherwise (ignoring special cases like DVD-RAM disks). This could be awkward if a fleet of bare-metal machines contains data drives which are a mix of 512 or 512e disks and 4Kn disks.
### Reproduction ###
Ignition config:
```json
{
"ignition": {"version": "2.1.0"},
"storage": {
"disks": [{
"device": "/dev/vda",
"partitions": [{
"size": 8192,
"label": "TEST"
}]
}]
}
}
```
Execution environment:
```sh
touch image
truncate image -s 8G
./coreos_production_pxe.sh -i config.ign -global virtio-blk-device.logical_block_size=4096 -drive if=virtio,format=raw,file=image -append coreos.first_boot=1
sudo sgdisk -p /dev/vda
```
This will report a 32 MiB volume, or a 4 MiB volume if the `logical_block_size` parameter is removed.
cc @dgonyeo | True | Add device-independent Ignition option for specifying partition start/size - # Issue Report #
## Feature Request ##
### Environment ###
Bare metal
### Desired Feature ###
Provide a device-independent way to specify the `start` and `size` of a partition.
### Other Information ###
These values are currently expressed in device-specific logical sectors, which are 4 KiB on 4K native drives and 512 bytes otherwise (ignoring special cases like DVD-RAM disks). This could be awkward if a fleet of bare-metal machines contains data drives which are a mix of 512 or 512e disks and 4Kn disks.
### Reproduction ###
Ignition config:
```json
{
"ignition": {"version": "2.1.0"},
"storage": {
"disks": [{
"device": "/dev/vda",
"partitions": [{
"size": 8192,
"label": "TEST"
}]
}]
}
}
```
Execution environment:
```sh
touch image
truncate image -s 8G
./coreos_production_pxe.sh -i config.ign -global virtio-blk-device.logical_block_size=4096 -drive if=virtio,format=raw,file=image -append coreos.first_boot=1
sudo sgdisk -p /dev/vda
```
This will report a 32 MiB volume, or a 4 MiB volume if the `logical_block_size` parameter is removed.
cc @dgonyeo | non_test | add device independent ignition option for specifying partition start size issue report feature request environment bare metal desired feature provide a device independent way to specify the start and size of a partition other information these values are currently expressed in device specific logical sectors which are kib on native drives and bytes otherwise ignoring special cases like dvd ram disks this could be awkward if a fleet of bare metal machines contains data drives which are a mix of or disks and disks reproduction ignition config json ignition version storage disks device dev vda partitions size label test execution environment sh touch image truncate image s coreos production pxe sh i config ign global virtio blk device logical block size drive if virtio format raw file image append coreos first boot sudo sgdisk p dev vda this will report a mib volume or a mib volume if the logical block size parameter is removed cc dgonyeo | 0 |
129,813 | 10,587,200,458 | IssuesEvent | 2019-10-08 21:29:40 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | opened | HistoryTemplateSearchInputMappingsTests failure on master with RejectedExecutionException | :Core/Features/Watcher >test-failure | The failure looks like:
```
2> Oct 08, 2019 2:38:57 PM com.carrotsearch.randomizedtesting.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException
2> WARNING: Uncaught exception in thread: Thread[elasticsearch[node_sm1][snapshot][T#2],5,TGRP-HistoryTemplateSearchInputMappingsTests]
2> java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@71681ccd[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@305516f6[Wrapped task = org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule@346a8424]] rejected from java.util.concurrent.ScheduledThreadPoolExecutor@5829a5fb[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
2> at __randomizedtesting.SeedInfo.seed([B75BDA1771E40CE4]:0)
2> at java.base/java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2055)
2> at java.base/java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:825)
2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor.delayedExecute(ScheduledThreadPoolExecutor.java:340)
2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor.schedule(ScheduledThreadPoolExecutor.java:562)
2> at org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule.scheduleNextRun(SchedulerEngine.java:222)
2> at org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule.<init>(SchedulerEngine.java:196)
2> at org.elasticsearch.xpack.core.scheduler.SchedulerEngine.add(SchedulerEngine.java:147)
2> at org.elasticsearch.xpack.slm.SnapshotRetentionService.rescheduleRetentionJob(SnapshotRetentionService.java:88)
2> at org.elasticsearch.xpack.slm.SnapshotRetentionService.onMaster(SnapshotRetentionService.java:73)
2> at org.elasticsearch.cluster.service.ClusterApplierService$OnMasterRunnable.run(ClusterApplierService.java:644)
2> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:699)
2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
2> at java.base/java.lang.Thread.run(Thread.java:834)
2> com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught exception in thread: Thread[id=168, name=elasticsearch[node_sm1][snapshot][T#2], state=RUNNABLE, group=TGRP-HistoryTemplateSearchInputMappingsTests]
Caused by:
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@71681ccd[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@305516f6[Wrapped task = org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule@346a8424]] rejected from java.util.concurrent.ScheduledThreadPoolExecutor@5829a5fb[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
```
I was not able to reproduce this with
```
REPRODUCE WITH: ./gradlew ':x-pack:plugin:watcher:test' --tests "org.elasticsearch.xpack.watcher.history.HistoryTemplateSearchInputMappingsTests" -Dtests.seed=B75BDA1771E40CE4 -Dtests.security.manager=true -Dtests.locale=en-US -Dtests.timezone=Etc/UTC -Dcompiler.java=12 -Druntime.java=11
```
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+pull-request-2/8367/console
https://gradle-enterprise.elastic.co/s/bsbxzot5q5dfy | 1.0 | HistoryTemplateSearchInputMappingsTests failure on master with RejectedExecutionException - The failure looks like:
```
2> Oct 08, 2019 2:38:57 PM com.carrotsearch.randomizedtesting.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException
2> WARNING: Uncaught exception in thread: Thread[elasticsearch[node_sm1][snapshot][T#2],5,TGRP-HistoryTemplateSearchInputMappingsTests]
2> java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@71681ccd[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@305516f6[Wrapped task = org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule@346a8424]] rejected from java.util.concurrent.ScheduledThreadPoolExecutor@5829a5fb[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
2> at __randomizedtesting.SeedInfo.seed([B75BDA1771E40CE4]:0)
2> at java.base/java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2055)
2> at java.base/java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:825)
2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor.delayedExecute(ScheduledThreadPoolExecutor.java:340)
2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor.schedule(ScheduledThreadPoolExecutor.java:562)
2> at org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule.scheduleNextRun(SchedulerEngine.java:222)
2> at org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule.<init>(SchedulerEngine.java:196)
2> at org.elasticsearch.xpack.core.scheduler.SchedulerEngine.add(SchedulerEngine.java:147)
2> at org.elasticsearch.xpack.slm.SnapshotRetentionService.rescheduleRetentionJob(SnapshotRetentionService.java:88)
2> at org.elasticsearch.xpack.slm.SnapshotRetentionService.onMaster(SnapshotRetentionService.java:73)
2> at org.elasticsearch.cluster.service.ClusterApplierService$OnMasterRunnable.run(ClusterApplierService.java:644)
2> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:699)
2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
2> at java.base/java.lang.Thread.run(Thread.java:834)
2> com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught exception in thread: Thread[id=168, name=elasticsearch[node_sm1][snapshot][T#2], state=RUNNABLE, group=TGRP-HistoryTemplateSearchInputMappingsTests]
Caused by:
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@71681ccd[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@305516f6[Wrapped task = org.elasticsearch.xpack.core.scheduler.SchedulerEngine$ActiveSchedule@346a8424]] rejected from java.util.concurrent.ScheduledThreadPoolExecutor@5829a5fb[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
```
I was not able to reproduce this with
```
REPRODUCE WITH: ./gradlew ':x-pack:plugin:watcher:test' --tests "org.elasticsearch.xpack.watcher.history.HistoryTemplateSearchInputMappingsTests" -Dtests.seed=B75BDA1771E40CE4 -Dtests.security.manager=true -Dtests.locale=en-US -Dtests.timezone=Etc/UTC -Dcompiler.java=12 -Druntime.java=11
```
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+pull-request-2/8367/console
https://gradle-enterprise.elastic.co/s/bsbxzot5q5dfy | test | historytemplatesearchinputmappingstests failure on master with rejectedexecutionexception the failure looks like oct pm com carrotsearch randomizedtesting randomizedrunner queueuncaughtexceptionshandler uncaughtexception warning uncaught exception in thread thread tgrp historytemplatesearchinputmappingstests java util concurrent rejectedexecutionexception task java util concurrent scheduledthreadpoolexecutor scheduledfuturetask rejected from java util concurrent scheduledthreadpoolexecutor at randomizedtesting seedinfo seed at java base java util concurrent threadpoolexecutor abortpolicy rejectedexecution threadpoolexecutor java at java base java util concurrent threadpoolexecutor reject threadpoolexecutor java at java base java util concurrent scheduledthreadpoolexecutor delayedexecute scheduledthreadpoolexecutor java at java base java util concurrent scheduledthreadpoolexecutor schedule scheduledthreadpoolexecutor java at org elasticsearch xpack core scheduler schedulerengine activeschedule schedulenextrun schedulerengine java at org elasticsearch xpack core scheduler schedulerengine activeschedule schedulerengine java at org elasticsearch xpack core scheduler schedulerengine add schedulerengine java at org elasticsearch xpack slm snapshotretentionservice rescheduleretentionjob snapshotretentionservice java at org elasticsearch xpack slm snapshotretentionservice onmaster snapshotretentionservice java at org elasticsearch cluster service clusterapplierservice onmasterrunnable run clusterapplierservice java at org elasticsearch common util concurrent threadcontext contextpreservingrunnable run threadcontext java at java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java base java lang thread run thread java com carrotsearch randomizedtesting uncaughtexceptionerror captured an uncaught exception in thread thread state runnable group tgrp historytemplatesearchinputmappingstests caused by java util concurrent rejectedexecutionexception task java util concurrent scheduledthreadpoolexecutor scheduledfuturetask rejected from java util concurrent scheduledthreadpoolexecutor i was not able to reproduce this with reproduce with gradlew x pack plugin watcher test tests org elasticsearch xpack watcher history historytemplatesearchinputmappingstests dtests seed dtests security manager true dtests locale en us dtests timezone etc utc dcompiler java druntime java | 1 |
254,334 | 27,372,249,785 | IssuesEvent | 2023-02-28 01:04:36 | pazhanivel07/frameworks_av_A10_r33 | https://api.github.com/repos/pazhanivel07/frameworks_av_A10_r33 | closed | CVE-2020-0134 (Medium) detected in avandroid-10.0.0_r33 - autoclosed | security vulnerability | ## CVE-2020-0134 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>avandroid-10.0.0_r33</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/av>https://android.googlesource.com/platform/frameworks/av</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/frameworks_av_A10_r33/commit/2beb40f4ae9d8dc613cfe3bfd1d381e704ce907d">2beb40f4ae9d8dc613cfe3bfd1d381e704ce907d</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drm/libmediadrm/IDrm.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In BnDrm::onTransact of IDrm.cpp, there is a possible information disclosure due to uninitialized data. This could lead to local information disclosure with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-146052771
<p>Publish Date: 2020-06-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-0134>CVE-2020-0134</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://android.googlesource.com/platform/frameworks/base/+/refs/tags/android-10.0.0_r37">https://android.googlesource.com/platform/frameworks/base/+/refs/tags/android-10.0.0_r37</a></p>
<p>Release Date: 2020-06-11</p>
<p>Fix Resolution: android-10.0.0_r37</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-0134 (Medium) detected in avandroid-10.0.0_r33 - autoclosed - ## CVE-2020-0134 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>avandroid-10.0.0_r33</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/av>https://android.googlesource.com/platform/frameworks/av</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/frameworks_av_A10_r33/commit/2beb40f4ae9d8dc613cfe3bfd1d381e704ce907d">2beb40f4ae9d8dc613cfe3bfd1d381e704ce907d</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drm/libmediadrm/IDrm.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In BnDrm::onTransact of IDrm.cpp, there is a possible information disclosure due to uninitialized data. This could lead to local information disclosure with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-146052771
<p>Publish Date: 2020-06-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-0134>CVE-2020-0134</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://android.googlesource.com/platform/frameworks/base/+/refs/tags/android-10.0.0_r37">https://android.googlesource.com/platform/frameworks/base/+/refs/tags/android-10.0.0_r37</a></p>
<p>Release Date: 2020-06-11</p>
<p>Fix Resolution: android-10.0.0_r37</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve medium detected in avandroid autoclosed cve medium severity vulnerability vulnerable library avandroid library home page a href found in head commit a href found in base branch master vulnerable source files drm libmediadrm idrm cpp vulnerability details in bndrm ontransact of idrm cpp there is a possible information disclosure due to uninitialized data this could lead to local information disclosure with no additional execution privileges needed user interaction is not needed for exploitation product androidversions android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android step up your open source security game with mend | 0 |
94,647 | 19,572,553,430 | IssuesEvent | 2022-01-04 11:44:59 | Regalis11/Barotrauma | https://api.github.com/repos/Regalis11/Barotrauma | closed | Portable Pump Attach | Code Design | - [x ] I have searched the issue tracker to check if the issue has already been reported.
**Description**
it appears that only the mechanic can properly reattach portable pumps.
if done by another, it fails and gives error message when trying to reattach:
too many of this item type placed on this submarine x/0
**Steps To Reproduce**
create 2 portable pumps;
attach 2 portable pumps;
detach 1 portable pump;
try to reattach 1 portable pump with non-mechanic;
maybe, an error message: too many of this item type placed on this submarine 1/0;
**Version**
v0.15.13.0.
windows
**Additional information**
done some draining and wiring, humpback submarine, inserted fulgurium batteries, tried all other crew professions, only mechanic works properly; singleplayer campaign; docked to station.
| 1.0 | Portable Pump Attach - - [x ] I have searched the issue tracker to check if the issue has already been reported.
**Description**
it appears that only the mechanic can properly reattach portable pumps.
if done by another, it fails and gives error message when trying to reattach:
too many of this item type placed on this submarine x/0
**Steps To Reproduce**
create 2 portable pumps;
attach 2 portable pumps;
detach 1 portable pump;
try to reattach 1 portable pump with non-mechanic;
maybe, an error message: too many of this item type placed on this submarine 1/0;
**Version**
v0.15.13.0.
windows
**Additional information**
done some draining and wiring, humpback submarine, inserted fulgurium batteries, tried all other crew professions, only mechanic works properly; singleplayer campaign; docked to station.
| non_test | portable pump attach i have searched the issue tracker to check if the issue has already been reported description it appears that only the mechanic can properly reattach portable pumps if done by another it fails and gives error message when trying to reattach too many of this item type placed on this submarine x steps to reproduce create portable pumps attach portable pumps detach portable pump try to reattach portable pump with non mechanic maybe an error message too many of this item type placed on this submarine version windows additional information done some draining and wiring humpback submarine inserted fulgurium batteries tried all other crew professions only mechanic works properly singleplayer campaign docked to station | 0 |
733,730 | 25,319,387,491 | IssuesEvent | 2022-11-18 01:34:35 | googleapis/google-cloud-go | https://api.github.com/repos/googleapis/google-cloud-go | closed | pubsub: TestFlowControllerCancel failed | type: bug api: pubsub priority: p1 flakybot: issue | Note: #6844 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 0eb700d17c4cac56f59038f0f3ae5a65257a3d38
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e13d2186-e41d-47e2-bec7-d8d909554bb8), [Sponge](http://sponge2/e13d2186-e41d-47e2-bec7-d8d909554bb8)
status: failed | 1.0 | pubsub: TestFlowControllerCancel failed - Note: #6844 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 0eb700d17c4cac56f59038f0f3ae5a65257a3d38
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e13d2186-e41d-47e2-bec7-d8d909554bb8), [Sponge](http://sponge2/e13d2186-e41d-47e2-bec7-d8d909554bb8)
status: failed | non_test | pubsub testflowcontrollercancel failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed | 0 |
118,968 | 10,020,834,785 | IssuesEvent | 2019-07-16 13:29:01 | OpenLiberty/open-liberty | https://api.github.com/repos/OpenLiberty/open-liberty | opened | Create EJB Custom Bindings in server.xml tests | in:EJB Container team:Blizzard test delivery | Create new EJB Custom Bindings in server.xml tests. | 1.0 | Create EJB Custom Bindings in server.xml tests - Create new EJB Custom Bindings in server.xml tests. | test | create ejb custom bindings in server xml tests create new ejb custom bindings in server xml tests | 1 |
261,344 | 22,739,971,216 | IssuesEvent | 2022-07-07 02:08:13 | dotnet/source-build | https://api.github.com/repos/dotnet/source-build | closed | Microsoft-to-source-build SDK comparison test should output a new baseline to build artifacts | area-ci-testing | After running the diff for .NET 7.0, the difference between the new baseline and the old baseline are very large. It's impractical to update the baseline manually, so a new baseline should be written to the build artifacts to facilitate easier updates.
See https://github.com/dotnet/installer/pull/13990 | 1.0 | Microsoft-to-source-build SDK comparison test should output a new baseline to build artifacts - After running the diff for .NET 7.0, the difference between the new baseline and the old baseline are very large. It's impractical to update the baseline manually, so a new baseline should be written to the build artifacts to facilitate easier updates.
See https://github.com/dotnet/installer/pull/13990 | test | microsoft to source build sdk comparison test should output a new baseline to build artifacts after running the diff for net the difference between the new baseline and the old baseline are very large it s impractical to update the baseline manually so a new baseline should be written to the build artifacts to facilitate easier updates see | 1 |
39,034 | 5,214,787,945 | IssuesEvent | 2017-01-26 01:16:41 | palantir/atlasdb | https://api.github.com/repos/palantir/atlasdb | closed | Sweep background tests causing failures in internal product testing | component: testing priority: P2 | gerrit 280256
#872 #411 have resulted in problems for internal products. The code they had to add to overcome this can be found in the above gerrit change. We should revert or ignore the changes as necessary and revert this.
| 1.0 | Sweep background tests causing failures in internal product testing - gerrit 280256
#872 #411 have resulted in problems for internal products. The code they had to add to overcome this can be found in the above gerrit change. We should revert or ignore the changes as necessary and revert this.
| test | sweep background tests causing failures in internal product testing gerrit have resulted in problems for internal products the code they had to add to overcome this can be found in the above gerrit change we should revert or ignore the changes as necessary and revert this | 1 |
399,307 | 27,236,161,737 | IssuesEvent | 2023-02-21 16:28:11 | mindsdb/mindsdb | https://api.github.com/repos/mindsdb/mindsdb | closed | [Docs] Add a community tutorial link to the `Using MindsDB via Mongo API -> Machine Learning Examples -> Regression` page | help wanted good first issue documentation first-timers-only | ## Instructions :page_facing_up:
Here are the step-by-step instructions:
1. Go to the `/docs/using-mongo-api/regression.mdx` file.
2. Go to the end of this file and add another item to the list, as follows:
```
- [Tutorial to Predict the Energy Usage using MindsDB and MongoDB](https://dev.to/dohrisalim/tutorial-to-predict-the-energy-usage-using-mindsdb-and-mongodb-g60)
by [Salim Dohri](https://github.com/dohrisalim)
```
3. Save the changes and create a PR.
## Hackathon Issue :loudspeaker:
MindsDB has organized a hackathon to let in more contributors to the in-database ML world!
Each hackathon issue is worth a certain amount of points that will bring you prizes by the end of the MindsDB Hackathon.
Stay tuned for the detailed rules of the MindsDB Hackathon!
## The https://github.com/mindsdb/mindsdb/labels/first-timers-only Label
We are happy to welcome you on board! Please take a look at the rules below for first-time contributors.
1. You can solve only one issue labeled as https://github.com/mindsdb/mindsdb/labels/first-timers-only. After that, please look at other issues labeled as https://github.com/mindsdb/mindsdb/labels/good%20first%20issue, https://github.com/mindsdb/mindsdb/labels/help%20wanted, or https://github.com/mindsdb/mindsdb/labels/integration.
2. After you create your first PR in the MindsDB repository, please sign our CLA to become a MindsDB contributor. You can do that by leaving a comment that contains the following: `I have read the CLA Document and I hereby sign the CLA`
Thank you for contributing to MindsDB! | 1.0 | [Docs] Add a community tutorial link to the `Using MindsDB via Mongo API -> Machine Learning Examples -> Regression` page - ## Instructions :page_facing_up:
Here are the step-by-step instructions:
1. Go to the `/docs/using-mongo-api/regression.mdx` file.
2. Go to the end of this file and add another item to the list, as follows:
```
- [Tutorial to Predict the Energy Usage using MindsDB and MongoDB](https://dev.to/dohrisalim/tutorial-to-predict-the-energy-usage-using-mindsdb-and-mongodb-g60)
by [Salim Dohri](https://github.com/dohrisalim)
```
3. Save the changes and create a PR.
## Hackathon Issue :loudspeaker:
MindsDB has organized a hackathon to let in more contributors to the in-database ML world!
Each hackathon issue is worth a certain amount of points that will bring you prizes by the end of the MindsDB Hackathon.
Stay tuned for the detailed rules of the MindsDB Hackathon!
## The https://github.com/mindsdb/mindsdb/labels/first-timers-only Label
We are happy to welcome you on board! Please take a look at the rules below for first-time contributors.
1. You can solve only one issue labeled as https://github.com/mindsdb/mindsdb/labels/first-timers-only. After that, please look at other issues labeled as https://github.com/mindsdb/mindsdb/labels/good%20first%20issue, https://github.com/mindsdb/mindsdb/labels/help%20wanted, or https://github.com/mindsdb/mindsdb/labels/integration.
2. After you create your first PR in the MindsDB repository, please sign our CLA to become a MindsDB contributor. You can do that by leaving a comment that contains the following: `I have read the CLA Document and I hereby sign the CLA`
Thank you for contributing to MindsDB! | non_test | add a community tutorial link to the using mindsdb via mongo api machine learning examples regression page instructions page facing up here are the step by step instructions go to the docs using mongo api regression mdx file go to the end of this file and add another item to the list as follows by save the changes and create a pr hackathon issue loudspeaker mindsdb has organized a hackathon to let in more contributors to the in database ml world each hackathon issue is worth a certain amount of points that will bring you prizes by the end of the mindsdb hackathon stay tuned for the detailed rules of the mindsdb hackathon the label we are happy to welcome you on board please take a look at the rules below for first time contributors you can solve only one issue labeled as after that please look at other issues labeled as or after you create your first pr in the mindsdb repository please sign our cla to become a mindsdb contributor you can do that by leaving a comment that contains the following i have read the cla document and i hereby sign the cla thank you for contributing to mindsdb | 0 |
255,797 | 19,339,948,814 | IssuesEvent | 2021-12-15 02:28:38 | godotengine/godot | https://api.github.com/repos/godotengine/godot | closed | Node.replace_by() has strange behavior with instanced scenes | topic:core documentation | **Godot version:**
3.1
**Issue description:**
Using [`Node.replace_by`](https://docs.godotengine.org/en/3.1/classes/class_node.html#class-node-method-replace-by) to replace an instanced scene with another one created from script appears to merge the two subscenes instead of actually replacing.
**Steps to reproduce:**
- Create three scenes: `main`, `other`, `yetanother`, each with a simple `Node2D` as base node. Then add different sprites to `other` and `yetanother` to distinguish between them.
- In the `main` scene, instantiate the `other` scene (say, as `Other`) as a child with the editor.
- Using script attached to `main` scene's base node, instantiate a `yetanother` scene with `load('res://yetanother.scn')` and call `.instance()` on it (call the resulting variable `yetanother`).
- Add `$Other.replace_by(yetanother)` to the following line.
Instead of replacing the initially instanced scene node with the new one, this results in merging the two scenes together (which can be confirmed by looking at the Remote tab's scene tree during execution).
As an additional note, `print($Other)` will print `[Object:null]` then (because the node also took the new node's name), which imho is a bit counter-intuitive, though this is easily fixable with `yetanother.name = 'Other'`. I believe the replaced node should take the initial node's name by default, but that may be intended for reasons I'm missing.
**Minimal reproduction project:**
[replace_by_bug.zip](https://github.com/godotengine/godot/files/3154742/replace_by_bug.zip)
| 1.0 | Node.replace_by() has strange behavior with instanced scenes - **Godot version:**
3.1
**Issue description:**
Using [`Node.replace_by`](https://docs.godotengine.org/en/3.1/classes/class_node.html#class-node-method-replace-by) to replace an instanced scene with another one created from script appears to merge the two subscenes instead of actually replacing.
**Steps to reproduce:**
- Create three scenes: `main`, `other`, `yetanother`, each with a simple `Node2D` as base node. Then add different sprites to `other` and `yetanother` to distinguish between them.
- In the `main` scene, instantiate the `other` scene (say, as `Other`) as a child with the editor.
- Using script attached to `main` scene's base node, instantiate a `yetanother` scene with `load('res://yetanother.scn')` and call `.instance()` on it (call the resulting variable `yetanother`).
- Add `$Other.replace_by(yetanother)` to the following line.
Instead of replacing the initially instanced scene node with the new one, this results in merging the two scenes together (which can be confirmed by looking at the Remote tab's scene tree during execution).
As an additional note, `print($Other)` will print `[Object:null]` then (because the node also took the new node's name), which imho is a bit counter-intuitive, though this is easily fixable with `yetanother.name = 'Other'`. I believe the replaced node should take the initial node's name by default, but that may be intended for reasons I'm missing.
**Minimal reproduction project:**
[replace_by_bug.zip](https://github.com/godotengine/godot/files/3154742/replace_by_bug.zip)
| non_test | node replace by has strange behavior with instanced scenes godot version issue description using to replace an instanced scene with another one created from script appears to merge the two subscenes instead of actually replacing steps to reproduce create three scenes main other yetanother each with a simple as base node then add different sprites to other and yetanother to distinguish between them in the main scene instantiate the other scene say as other as a child with the editor using script attached to main scene s base node instantiate a yetanother scene with load res yetanother scn and call instance on it call the resulting variable yetanother add other replace by yetanother to the following line instead of replacing the initially instanced scene node with the new one this results in merging the two scenes together which can be confirmed by looking at the remote tab s scene tree during execution as an additional note print other will print then because the node also took the new node s name which imho is a bit counter intuitive though this is easily fixable with yetanother name other i believe the replaced node should take the initial node s name by default but that may be intended for reasons i m missing minimal reproduction project | 0 |
327,941 | 28,093,212,488 | IssuesEvent | 2023-03-30 14:16:14 | SPW-DIG/metawal-core-geonetwork | https://api.github.com/repos/SPW-DIG/metawal-core-geonetwork | closed | Mapping MW-GP - Infos des services liés | Env test - OK Env valid - OK Env prod - OK MW-GP | Suite à discussion avec l'équipe géoportail, on souhaite récupérer plus d'informations au niveau des services liés à une donnée.
L'idée est d'abandonner [le catalogue des services dans sa version actuelle](https://geoportail.wallonie.be/catalogue-donnees-et-services?&search-tab=2) et basculer vers l'intégration de certaines informations des services dans la fiche de donnée, par exemple dans un onglet API/services.
(Un catalogue spécifique au service pourrait subsister, par exemple sous forme de tableau créé avec les webcomponents)
Les infos qu'on souhaite afficher au niveau de l'onglet API/services d'une donnée sont :
- Nom du service
- Descriptif
- Adresse de connexion
- Conditions d'accès et d'utilisation
- Rapport de qualité / disponibilité
+ éventuellement la liste des donnée servies
Une approche serait d'ajouter les infos manquantes à celles déjà fournies par l'élément related.service._source
Est-ce une bonne approche ?
| 1.0 | Mapping MW-GP - Infos des services liés - Suite à discussion avec l'équipe géoportail, on souhaite récupérer plus d'informations au niveau des services liés à une donnée.
L'idée est d'abandonner [le catalogue des services dans sa version actuelle](https://geoportail.wallonie.be/catalogue-donnees-et-services?&search-tab=2) et basculer vers l'intégration de certaines informations des services dans la fiche de donnée, par exemple dans un onglet API/services.
(Un catalogue spécifique au service pourrait subsister, par exemple sous forme de tableau créé avec les webcomponents)
Les infos qu'on souhaite afficher au niveau de l'onglet API/services d'une donnée sont :
- Nom du service
- Descriptif
- Adresse de connexion
- Conditions d'accès et d'utilisation
- Rapport de qualité / disponibilité
+ éventuellement la liste des donnée servies
Une approche serait d'ajouter les infos manquantes à celles déjà fournies par l'élément related.service._source
Est-ce une bonne approche ?
| test | mapping mw gp infos des services liés suite à discussion avec l équipe géoportail on souhaite récupérer plus d informations au niveau des services liés à une donnée l idée est d abandonner et basculer vers l intégration de certaines informations des services dans la fiche de donnée par exemple dans un onglet api services un catalogue spécifique au service pourrait subsister par exemple sous forme de tableau créé avec les webcomponents les infos qu on souhaite afficher au niveau de l onglet api services d une donnée sont nom du service descriptif adresse de connexion conditions d accès et d utilisation rapport de qualité disponibilité éventuellement la liste des donnée servies une approche serait d ajouter les infos manquantes à celles déjà fournies par l élément related service source est ce une bonne approche | 1 |
322,782 | 9,828,658,208 | IssuesEvent | 2019-06-15 13:43:58 | wevote/WeVoteCordova | https://api.github.com/repos/wevote/WeVoteCordova | closed | Header Styles: Following Facebook sign in, page displays under header | Priority: 2 Verify Fix | Following Facebook sign in, page displays under header.
Please test all header changes in the browser-based WebApp as well.

| 1.0 | Header Styles: Following Facebook sign in, page displays under header - Following Facebook sign in, page displays under header.
Please test all header changes in the browser-based WebApp as well.

| non_test | header styles following facebook sign in page displays under header following facebook sign in page displays under header please test all header changes in the browser based webapp as well | 0 |
12,749 | 3,288,167,369 | IssuesEvent | 2015-10-29 14:10:11 | bedita/bedita | https://api.github.com/repos/bedita/bedita | closed | Frontend language requires double-load to change | Priority - High Status - Test Topic - Frontend Type - Bug | After #517, clicking on a `/lang/xyz/*` link isn't enough to switch language, and a page reload is required. | 1.0 | Frontend language requires double-load to change - After #517, clicking on a `/lang/xyz/*` link isn't enough to switch language, and a page reload is required. | test | frontend language requires double load to change after clicking on a lang xyz link isn t enough to switch language and a page reload is required | 1 |
219,178 | 7,333,767,127 | IssuesEvent | 2018-03-05 20:28:45 | joshherkness/AR-Top | https://api.github.com/repos/joshherkness/AR-Top | opened | Test read session endpoint | Effort: 2 Priority: Low | /sessions/<id>, GET
- returns ({success="Successfully read session", session=<read session json>}, 200, json_tag) if the session with the given exists.
- returns ({error="Session does not exist"}, 404, json_tag) if a session with the given id does not exist. | 1.0 | Test read session endpoint - /sessions/<id>, GET
- returns ({success="Successfully read session", session=<read session json>}, 200, json_tag) if the session with the given exists.
- returns ({error="Session does not exist"}, 404, json_tag) if a session with the given id does not exist. | non_test | test read session endpoint sessions get returns success successfully read session session json tag if the session with the given exists returns error session does not exist json tag if a session with the given id does not exist | 0 |
4,818 | 4,651,207,816 | IssuesEvent | 2016-10-03 09:12:25 | syl20bnr/spacemacs | https://api.github.com/repos/syl20bnr/spacemacs | closed | Reduce gc-cons-threshold for responsiveness | - Mailling list - Core Enhancement ☺ Fixed in develop Performance | It's worth to have a look at this the article in this thread and its discussion: https://www.reddit.com/r/emacs/comments/41m7x3/why_are_you_changing_gcconsthreshold/. In summary, higher `gc-cons-threshold` speed up Emacs, but lower `gc-cons-threshold` gives more responsiveness. Given Spacemacs sometimes causes random freeze, I think it's worth to try setting `gc-cons-threshold` back to its default value after Spacemacs done initializing. | True | Reduce gc-cons-threshold for responsiveness - It's worth to have a look at this the article in this thread and its discussion: https://www.reddit.com/r/emacs/comments/41m7x3/why_are_you_changing_gcconsthreshold/. In summary, higher `gc-cons-threshold` speed up Emacs, but lower `gc-cons-threshold` gives more responsiveness. Given Spacemacs sometimes causes random freeze, I think it's worth to try setting `gc-cons-threshold` back to its default value after Spacemacs done initializing. | non_test | reduce gc cons threshold for responsiveness it s worth to have a look at this the article in this thread and its discussion in summary higher gc cons threshold speed up emacs but lower gc cons threshold gives more responsiveness given spacemacs sometimes causes random freeze i think it s worth to try setting gc cons threshold back to its default value after spacemacs done initializing | 0 |
276,150 | 23,969,939,660 | IssuesEvent | 2022-09-13 06:49:02 | milvus-io/milvus | https://api.github.com/repos/milvus-io/milvus | opened | [Bug]: [Benchmark][cluster]After Milvus upgrade, kafka's node hangs, causes search and query failures | kind/bug priority/critical-urgent test/benchmark test/100million | ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Environment
```markdown
- Milvus version:v2.1.0->2.1.0-20220909-36e333d
- Deployment mode(standalone or cluster):cluster
- SDK version(e.g. pymilvus v2.0.0rc2):2.1.0dev9
- OS(Ubuntu or CentOS):
- CPU/Memory:
- GPU:
- Others:
```
### Current Behavior
server-instance
fouram-tag-no-clean-dp5qg-1
server-configmap
server-cluster-8c64m-kafka-redur0
client-configmap
client-random-locust-100m-ddl-r8-w2-48h
update
server-instance
fouram-tag-no-clean-dp5qg-1
server-configmap
server-cluster-8c64m-kafka-redur0
client-configmap
client-random-locust-100m-ddl-r8-w2-con-12h
sever:
```
fouram-tag-no-clean-dp5qg-1-etcd-0 1/1 Running 0 17h 10.104.9.175 4am-node14 <none> <none>
fouram-tag-no-clean-dp5qg-1-etcd-1 1/1 Running 0 17h 10.104.1.24 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-etcd-2 1/1 Running 0 17h 10.104.5.84 4am-node12 <none> <none>
fouram-tag-no-clean-dp5qg-1-kafka-0 1/1 Running 3 (3d18h ago) 3d18h 10.104.9.251 4am-node14 <none> <none>
fouram-tag-no-clean-dp5qg-1-kafka-1 0/1 Running 2 (3d18h ago) 3d18h 10.104.5.76 4am-node12 <none> <none>
fouram-tag-no-clean-dp5qg-1-kafka-2 1/1 Running 3 (3d18h ago) 3d18h 10.104.6.176 4am-node13 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-datacoord-6bc7f56fbb-nb8kg 1/1 Running 0 17h 10.104.1.21 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-datanode-5889ff8dbb-55hzm 1/1 Running 42 (9m24s ago) 17h 10.104.4.239 4am-node11 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-indexcoord-7cb5f46f98-97pnk 1/1 Running 0 17h 10.104.4.241 4am-node11 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-indexnode-6d7cff4c77-dfp44 1/1 Running 0 17h 10.104.4.240 4am-node11 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-proxy-668984d458-v4c8v 1/1 Running 0 17h 10.104.6.207 4am-node13 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-querycoord-6ccb9b8c97-8tfcz 1/1 Running 0 17h 10.104.1.20 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-querynode-79cfddd4f9-xr9j7 1/1 Running 0 17h 10.104.6.208 4am-node13 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-rootcoord-55bfb66dd9-4lb42 1/1 Running 0 17h 10.104.1.23 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-0 1/1 Running 0 3d18h 10.104.9.252 4am-node14 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-1 1/1 Running 0 3d18h 10.104.1.73 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-2 1/1 Running 0 3d18h 10.104.5.80 4am-node12 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-3 1/1 Running 0 3d18h 10.104.6.178 4am-node13 <none> <none>
```
client log:
<img width="1386" alt="Screenshot 2022-09-13 14 46 18" src="https://user-images.githubusercontent.com/34296482/189829674-3ef983c2-19d5-4be0-9eb6-f6c3bf08811c.png">
### Expected Behavior
_No response_
### Steps To Reproduce
_No response_
### Milvus Log
_No response_
### Anything else?
_No response_ | 2.0 | [Bug]: [Benchmark][cluster]After Milvus upgrade, kafka's node hangs, causes search and query failures - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Environment
```markdown
- Milvus version:v2.1.0->2.1.0-20220909-36e333d
- Deployment mode(standalone or cluster):cluster
- SDK version(e.g. pymilvus v2.0.0rc2):2.1.0dev9
- OS(Ubuntu or CentOS):
- CPU/Memory:
- GPU:
- Others:
```
### Current Behavior
server-instance
fouram-tag-no-clean-dp5qg-1
server-configmap
server-cluster-8c64m-kafka-redur0
client-configmap
client-random-locust-100m-ddl-r8-w2-48h
update
server-instance
fouram-tag-no-clean-dp5qg-1
server-configmap
server-cluster-8c64m-kafka-redur0
client-configmap
client-random-locust-100m-ddl-r8-w2-con-12h
sever:
```
fouram-tag-no-clean-dp5qg-1-etcd-0 1/1 Running 0 17h 10.104.9.175 4am-node14 <none> <none>
fouram-tag-no-clean-dp5qg-1-etcd-1 1/1 Running 0 17h 10.104.1.24 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-etcd-2 1/1 Running 0 17h 10.104.5.84 4am-node12 <none> <none>
fouram-tag-no-clean-dp5qg-1-kafka-0 1/1 Running 3 (3d18h ago) 3d18h 10.104.9.251 4am-node14 <none> <none>
fouram-tag-no-clean-dp5qg-1-kafka-1 0/1 Running 2 (3d18h ago) 3d18h 10.104.5.76 4am-node12 <none> <none>
fouram-tag-no-clean-dp5qg-1-kafka-2 1/1 Running 3 (3d18h ago) 3d18h 10.104.6.176 4am-node13 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-datacoord-6bc7f56fbb-nb8kg 1/1 Running 0 17h 10.104.1.21 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-datanode-5889ff8dbb-55hzm 1/1 Running 42 (9m24s ago) 17h 10.104.4.239 4am-node11 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-indexcoord-7cb5f46f98-97pnk 1/1 Running 0 17h 10.104.4.241 4am-node11 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-indexnode-6d7cff4c77-dfp44 1/1 Running 0 17h 10.104.4.240 4am-node11 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-proxy-668984d458-v4c8v 1/1 Running 0 17h 10.104.6.207 4am-node13 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-querycoord-6ccb9b8c97-8tfcz 1/1 Running 0 17h 10.104.1.20 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-querynode-79cfddd4f9-xr9j7 1/1 Running 0 17h 10.104.6.208 4am-node13 <none> <none>
fouram-tag-no-clean-dp5qg-1-milvus-rootcoord-55bfb66dd9-4lb42 1/1 Running 0 17h 10.104.1.23 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-0 1/1 Running 0 3d18h 10.104.9.252 4am-node14 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-1 1/1 Running 0 3d18h 10.104.1.73 4am-node10 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-2 1/1 Running 0 3d18h 10.104.5.80 4am-node12 <none> <none>
fouram-tag-no-clean-dp5qg-1-minio-3 1/1 Running 0 3d18h 10.104.6.178 4am-node13 <none> <none>
```
client log:
<img width="1386" alt="Screenshot 2022-09-13 14 46 18" src="https://user-images.githubusercontent.com/34296482/189829674-3ef983c2-19d5-4be0-9eb6-f6c3bf08811c.png">
### Expected Behavior
_No response_
### Steps To Reproduce
_No response_
### Milvus Log
_No response_
### Anything else?
_No response_ | test | after milvus upgrade kafka s node hangs causes search and query failures is there an existing issue for this i have searched the existing issues environment markdown milvus version deployment mode standalone or cluster cluster sdk version e g pymilvus os ubuntu or centos cpu memory gpu others current behavior server instance fouram tag no clean server configmap server cluster kafka client configmap client random locust ddl update server instance fouram tag no clean server configmap server cluster kafka client configmap client random locust ddl con sever fouram tag no clean etcd running fouram tag no clean etcd running fouram tag no clean etcd running fouram tag no clean kafka running ago fouram tag no clean kafka running ago fouram tag no clean kafka running ago fouram tag no clean milvus datacoord running fouram tag no clean milvus datanode running ago fouram tag no clean milvus indexcoord running fouram tag no clean milvus indexnode running fouram tag no clean milvus proxy running fouram tag no clean milvus querycoord running fouram tag no clean milvus querynode running fouram tag no clean milvus rootcoord running fouram tag no clean minio running fouram tag no clean minio running fouram tag no clean minio running fouram tag no clean minio running client log img width alt src expected behavior no response steps to reproduce no response milvus log no response anything else no response | 1 |
176,074 | 13,626,117,962 | IssuesEvent | 2020-09-24 10:30:58 | blockframes/blockframes | https://api.github.com/repos/blockframes/blockframes | closed | e2e tests for movie tunnel - 10 Images | Back Test - e2e test | # UI
on Zeplin :
https://app.zeplin.io/project/5ca741b3ed21e79faa266664/screen/5e0b5ce0ce5cf519870bb60e
# E2E Tests to implement
## Page 10
- [ ] Uploads an image to banner (if possible with cypress)
- [ ] Uploads an image to poster (if possible with cypress)
- [ ] Clicks on "Next" button
- [ ] Redirects to movie tunnel page 11 Files & links | 2.0 | e2e tests for movie tunnel - 10 Images - # UI
on Zeplin :
https://app.zeplin.io/project/5ca741b3ed21e79faa266664/screen/5e0b5ce0ce5cf519870bb60e
# E2E Tests to implement
## Page 10
- [ ] Uploads an image to banner (if possible with cypress)
- [ ] Uploads an image to poster (if possible with cypress)
- [ ] Clicks on "Next" button
- [ ] Redirects to movie tunnel page 11 Files & links | test | tests for movie tunnel images ui on zeplin tests to implement page uploads an image to banner if possible with cypress uploads an image to poster if possible with cypress clicks on next button redirects to movie tunnel page files links | 1 |
76,534 | 14,633,511,287 | IssuesEvent | 2020-12-24 02:10:44 | Pokecube-Development/Pokecube-Issues-and-Wiki | https://api.github.com/repos/Pokecube-Development/Pokecube-Issues-and-Wiki | closed | Legendary capture issue | 1.14.x 1.15.2 Bug - Code Fixed-Pending Check |
#### Issue Description:
You can capture legendarys and it will say they are wild in the pokeball
#### What happens:
When I was trying to capture kyouger , he came back to me in the pokeball (it was an ultracube) as a wild Pokemon
#### What you expected to happen:
Him to come to me as a claimed Pokemon
#### Steps to reproduce:
1.stand next to a. Building or block
2.throw pokeball at legendary
3.when the Pokemon dissapers it will be in your invatory , throw it out and it will say it's wild
- Pokecube AIO:
- Minecraft: 1.15.2
- Forge: 31.1.18
| 1.0 | Legendary capture issue -
#### Issue Description:
You can capture legendarys and it will say they are wild in the pokeball
#### What happens:
When I was trying to capture kyouger , he came back to me in the pokeball (it was an ultracube) as a wild Pokemon
#### What you expected to happen:
Him to come to me as a claimed Pokemon
#### Steps to reproduce:
1.stand next to a. Building or block
2.throw pokeball at legendary
3.when the Pokemon dissapers it will be in your invatory , throw it out and it will say it's wild
- Pokecube AIO:
- Minecraft: 1.15.2
- Forge: 31.1.18
| non_test | legendary capture issue issue description you can capture legendarys and it will say they are wild in the pokeball what happens when i was trying to capture kyouger he came back to me in the pokeball it was an ultracube as a wild pokemon what you expected to happen him to come to me as a claimed pokemon steps to reproduce stand next to a building or block throw pokeball at legendary when the pokemon dissapers it will be in your invatory throw it out and it will say it s wild pokecube aio minecraft forge | 0 |
320,518 | 27,438,828,179 | IssuesEvent | 2023-03-02 09:34:07 | WordPress/gutenberg | https://api.github.com/repos/WordPress/gutenberg | closed | [Flaky Test] should cut partial selection and merge like a normal `delete` - not forward | [Type] Flaky Test | <!-- __META_DATA__:{} -->
**Flaky test detected. This is an auto-generated issue by GitHub Actions. Please do NOT edit this manually.**
## Test title
should cut partial selection and merge like a normal `delete` - not forward
## Test path
`/test/e2e/specs/editor/various/copy-cut-paste.spec.js`
## Errors
<!-- __TEST_RESULTS_LIST__ -->
<!-- __TEST_RESULT__ --><details>
<summary>
<time datetime="2023-02-27T18:35:23.808Z"><code>[2023-02-27T18:35:23.808Z]</code></time> Test passed after 2 failed attempts on <a href="https://github.com/WordPress/gutenberg/actions/runs/4285127412"><code>try/something</code></a>.
</summary>
```
Error: Snapshot comparison failed:
<!-- wp:heading -->
<h2 class="wp-block-heading">He</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ading</h2>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>Paragra</p>
<!-- /wp:paragraph -->
<!-- wp:heading -->
<h2 class="wp-block-heading">Heph</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ph</h2>
<!-- /wp:heading -->
Expected: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-expected.txt
Received: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-actual.txt
at /home/runner/work/gutenberg/gutenberg/test/e2e/specs/editor/various/copy-cut-paste.spec.js:372:49
Error: Snapshot comparison failed:
<!-- wp:heading -->
<h2 class="wp-block-heading">He</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ading</h2>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>Paragra</p>
<!-- /wp:paragraph -->
<!-- wp:heading -->
<h2 class="wp-block-heading">Heph</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ph</h2>
<!-- /wp:heading -->
Expected: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium-retry1/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-expected.txt
Received: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium-retry1/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-actual.txt
at /home/runner/work/gutenberg/gutenberg/test/e2e/specs/editor/various/copy-cut-paste.spec.js:372:49
```
</details><!-- /__TEST_RESULT__ -->
<!-- /__TEST_RESULTS_LIST__ -->
| 1.0 | [Flaky Test] should cut partial selection and merge like a normal `delete` - not forward - <!-- __META_DATA__:{} -->
**Flaky test detected. This is an auto-generated issue by GitHub Actions. Please do NOT edit this manually.**
## Test title
should cut partial selection and merge like a normal `delete` - not forward
## Test path
`/test/e2e/specs/editor/various/copy-cut-paste.spec.js`
## Errors
<!-- __TEST_RESULTS_LIST__ -->
<!-- __TEST_RESULT__ --><details>
<summary>
<time datetime="2023-02-27T18:35:23.808Z"><code>[2023-02-27T18:35:23.808Z]</code></time> Test passed after 2 failed attempts on <a href="https://github.com/WordPress/gutenberg/actions/runs/4285127412"><code>try/something</code></a>.
</summary>
```
Error: Snapshot comparison failed:
<!-- wp:heading -->
<h2 class="wp-block-heading">He</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ading</h2>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>Paragra</p>
<!-- /wp:paragraph -->
<!-- wp:heading -->
<h2 class="wp-block-heading">Heph</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ph</h2>
<!-- /wp:heading -->
Expected: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-expected.txt
Received: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-actual.txt
at /home/runner/work/gutenberg/gutenberg/test/e2e/specs/editor/various/copy-cut-paste.spec.js:372:49
Error: Snapshot comparison failed:
<!-- wp:heading -->
<h2 class="wp-block-heading">He</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ading</h2>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>Paragra</p>
<!-- /wp:paragraph -->
<!-- wp:heading -->
<h2 class="wp-block-heading">Heph</h2>
<!-- /wp:heading -->
<!-- wp:heading -->
<h2 class="wp-block-heading">ph</h2>
<!-- /wp:heading -->
Expected: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium-retry1/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-expected.txt
Received: /home/runner/work/gutenberg/gutenberg/artifacts/test-results/editor-various-copy-cut-paste-Copy-cut-paste-s-9fc9d-n-and-merge-like-a-normal-delete---not-forward--chromium-retry1/Copy-cut-paste-should-cut-partial-selection-and-merge-like-a-normal-delete---not-forward-2-actual.txt
at /home/runner/work/gutenberg/gutenberg/test/e2e/specs/editor/various/copy-cut-paste.spec.js:372:49
```
</details><!-- /__TEST_RESULT__ -->
<!-- /__TEST_RESULTS_LIST__ -->
| test | should cut partial selection and merge like a normal delete not forward flaky test detected this is an auto generated issue by github actions please do not edit this manually test title should cut partial selection and merge like a normal delete not forward test path test specs editor various copy cut paste spec js errors test passed after failed attempts on a href error snapshot comparison failed he ading paragra heph ph expected home runner work gutenberg gutenberg artifacts test results editor various copy cut paste copy cut paste s n and merge like a normal delete not forward chromium copy cut paste should cut partial selection and merge like a normal delete not forward expected txt received home runner work gutenberg gutenberg artifacts test results editor various copy cut paste copy cut paste s n and merge like a normal delete not forward chromium copy cut paste should cut partial selection and merge like a normal delete not forward actual txt at home runner work gutenberg gutenberg test specs editor various copy cut paste spec js error snapshot comparison failed he ading paragra heph ph expected home runner work gutenberg gutenberg artifacts test results editor various copy cut paste copy cut paste s n and merge like a normal delete not forward chromium copy cut paste should cut partial selection and merge like a normal delete not forward expected txt received home runner work gutenberg gutenberg artifacts test results editor various copy cut paste copy cut paste s n and merge like a normal delete not forward chromium copy cut paste should cut partial selection and merge like a normal delete not forward actual txt at home runner work gutenberg gutenberg test specs editor various copy cut paste spec js | 1 |
132,981 | 10,775,947,028 | IssuesEvent | 2019-11-03 17:31:02 | wp-graphql/wp-graphql | https://api.github.com/repos/wp-graphql/wp-graphql | closed | Provide common functions for unit tests | Needs Discussion Tests help wanted | I've found myself writing some duplicate code in my unit tests, so it would be good to provide some common classes we can utilize in the unit tests.
For example, this bit of code (or a close variation of it) exists in several tests within the setUp method already:
```php
$this->current_time = strtotime( 'now' );
$this->current_date = date( 'Y-m-d H:i:s', $this->current_time );
$this->current_date_gmt = gmdate( 'Y-m-d H:i:s', $this->current_time );
$this->admin = $this->factory->user->create( [
    'role' => 'administrator',
] );
```
So, it might be good to provide a general class that extends WP_UnitTestCase that provides some of this boilerplate up front, and unit tests can extend that class and make use of some of that boilerplate.
Would be good to research how some other projects handle their "common" test code.
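The general class suggested above could carry that boilerplate. The following is only an untested sketch: the class name `WPGraphQL_UnitTestCase` and the choice to run everything from `setUp()` are assumptions for illustration, not existing wp-graphql code.

```php
<?php
// Hypothetical shared base class (name is illustrative, not part of wp-graphql).
// Centralizes the setUp() boilerplate quoted above so individual test cases
// can extend this instead of WP_UnitTestCase directly.
abstract class WPGraphQL_UnitTestCase extends WP_UnitTestCase {

    public $current_time;
    public $current_date;
    public $current_date_gmt;
    public $admin;

    public function setUp() {
        parent::setUp();

        // Common fixtures shared by many existing tests.
        $this->current_time     = strtotime( 'now' );
        $this->current_date     = date( 'Y-m-d H:i:s', $this->current_time );
        $this->current_date_gmt = gmdate( 'Y-m-d H:i:s', $this->current_time );
        $this->admin            = $this->factory->user->create( [
            'role' => 'administrator',
        ] );
    }
}
```

A concrete test suite would then declare `class PostObjectQueriesTest extends WPGraphQL_UnitTestCase`, call `parent::setUp()` in its own `setUp()`, and use `$this->admin` and the date fields directly.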
| 1.0 | Provide common functions for unit tests - I've found myself writing some duplicate code in my unit tests, so it would be good to provide some common classes we can utilize in the unit tests.
For example, this bit of code (or a close variation of it) exists in several tests within the setUp method already:
```php
$this->current_time = strtotime( 'now' );
$this->current_date = date( 'Y-m-d H:i:s', $this->current_time );
$this->current_date_gmt = gmdate( 'Y-m-d H:i:s', $this->current_time );
$this->admin = $this->factory->user->create( [
    'role' => 'administrator',
] );
```
So, it might be good to provide a general class that extends WP_UnitTestCase that provides some of this boilerplate up front, and unit tests can extend that class and make use of some of that boilerplate.
Would be good to research how some other projects handle their "common" test code.
| test | provide common functions for unit tests i ve found myself writing some duplicate code in my unit tests so it would be good to provide some common classes we can utilize in the unit tests for example this bit of code or a close variation of it exists in several tests within the setup method already this current time strtotime now this current date date y m d h i s this current time this current date gmt gmdate y m d h i s this current time this admin this factory user create role administrator so it might be good to provide a general class that extends wp unittestcase that provides some of this boilerplate up front and unit tests can extend that class and make use of some of that boilerplate would be good to research how some other projects handle their common test code | 1 |
25,420 | 4,157,202,708 | IssuesEvent | 2016-06-16 20:31:37 | Clojure-Intro-Course/clojure-intro-class | https://api.github.com/repos/Clojure-Intro-Course/clojure-intro-class | closed | Detecting the only argument | enhancement tests | In function +, the first argument [1 2] must be a number
but is a vector.
should be "the argument". | 1.0 | Detecting the only argument - In function +, the first argument [1 2] must be a number
but is a vector.
should be "the argument". | test | detecting the only argument in function the first argument must be a number but is a vector should be the argument | 1 |
81,406 | 7,780,598,051 | IssuesEvent | 2018-06-05 20:34:26 | dotnet/roslyn | https://api.github.com/repos/dotnet/roslyn | opened | Pending test work items for Flow analysis | Area-Analyzers Area-Compilers New Feature - Flow Analysis New Feature - IOperation Test | **C# tests:**
- [ ] Branch operation: Port applicable `BranchFlow_38 - BranchFlow_48` tests from VB.
- [ ] Fixed statement tests improvements. For further details, see comments in `IOperationTests_IFixedStatement.cs` with this issue ID.
- [ ] Add flow tests for Variable declaration operation in Using/For/Fixed declarations.
**VB tests:**
- [ ] Branch operation: Port applicable `BranchFlow_02 - BranchFlow_05` and `BranchFlow_07 - BranchFlow_11` tests from C#.
- [ ] Block operation: Port applicable `BlockFlow` tests from `IOperationTests_IBlock.cs`.
- [ ] Conditional access operation: Port applicable `ConditionalAccessFlow` tests from C#.
- [ ] Conditional operation: Port applicable `IfFlow` tests from C#.
- [ ] Lock operation: Port applicable `LockFlow` tests from C#.
- [ ] Return operation: Port applicable `ReturnFlow_03 - ReturnFlow_16` tests from C#.
- [ ] Throw operation: Port applicable `ThrowFlow` tests from `IOperationTests_IThrowOperation.cs`.
- [ ] Try/Catch/Finally operation: Port applicable flow tests from `IOperationTests_TryCatch.cs`.
- [ ] Using operation: Port applicable `UsingFlow_01 - UsingFlow_13` tests from C#.
- [ ] Add flow tests for Variable declaration operation in Using/For declarations.
| 1.0 | Pending test work items for Flow analysis - **C# tests:**
- [ ] Branch operation: Port applicable `BranchFlow_38 - BranchFlow_48` tests from VB.
- [ ] Fixed statement tests improvements. For further details, see comments in `IOperationTests_IFixedStatement.cs` with this issue ID.
- [ ] Add flow tests for Variable declaration operation in Using/For/Fixed declarations.
**VB tests:**
- [ ] Branch operation: Port applicable `BranchFlow_02 - BranchFlow_05` and `BranchFlow_07 - BranchFlow_11` tests from C#.
- [ ] Block operation: Port applicable `BlockFlow` tests from `IOperationTests_IBlock.cs`.
- [ ] Conditional access operation: Port applicable `ConditionalAccessFlow` tests from C#.
- [ ] Conditional operation: Port applicable `IfFlow` tests from C#.
- [ ] Lock operation: Port applicable `LockFlow` tests from C#.
- [ ] Return operation: Port applicable `ReturnFlow_03 - ReturnFlow_16` tests from C#.
- [ ] Throw operation: Port applicable `ThrowFlow` tests from `IOperationTests_IThrowOperation.cs`.
- [ ] Try/Catch/Finally operation: Port applicable flow tests from `IOperationTests_TryCatch.cs`.
- [ ] Using operation: Port applicable `UsingFlow_01 - UsingFlow_13` tests from C#.
- [ ] Add flow tests for Variable declaration operation in Using/For declarations.
| test | pending test work items for flow analysis c tests branch operation port applicable branchflow branchflow tests from vb fixed statement tests improvements for further details see comments in ioperationtests ifixedstatement cs with this issue id add flow tests for variable declaration operation in using for fixed declarations vb tests branch operation port applicable branchflow branchflow and branchflow branchflow tests from c block operation port applicable blockflow tests from ioperationtests iblock cs conditional access operation port applicable conditionalaccessflow tests from c conditional operation port applicable ifflow tests from c lock operation port applicable lockflow tests from c return operation port applicable returnflow returnflow tests from c throw operation port applicable throwflow tests from ioperationtests ithrowoperation cs try catch finally operation port applicable flow tests from ioperationtests trycatch cs using operation port applicable usingflow usingflow tests from c add flow tests for variable declaration operation in using for declarations | 1 |
74,597 | 25,200,904,022 | IssuesEvent | 2022-11-13 04:39:04 | vector-im/element-android | https://api.github.com/repos/vector-im/element-android | opened | WYSIWYG composer sometimes does not show edited text | T-Defect | ### Steps to reproduce
1. select a previously written message in the room
2. select "edit message"
3. the edited text is not shown in the WYSIWYG composer
This reproduces only sometimes (from time to time).

### Outcome
#### What did you expect?
If the user selects "edit message", the message text must always be copied into the composer
#### What happened instead?
The text is not always copied into the composer
### Your phone model
OnePlus 8T
### Operating system version
Android 12
### Application version and app store
element android 1.5.7
### Homeserver
synapse 1.50.2+buster1
### Will you send logs?
Yes
### Are you willing to provide a PR?
No | 1.0 | WYSIWYG composer sometimes does not show edited text - ### Steps to reproduce
1. select a previously written message in the room
2. select "edit message"
3. the edited text is not shown in the WYSIWYG composer
This reproduces only sometimes (from time to time).

### Outcome
#### What did you expect?
If the user selects "edit message", the message text must always be copied into the composer
#### What happened instead?
The text is not always copied into the composer
### Your phone model
OnePlus 8T
### Operating system version
Android 12
### Application version and app store
element android 1.5.7
### Homeserver
synapse 1.50.2+buster1
### Will you send logs?
Yes
### Are you willing to provide a PR?
No | non_test | wysiwyg composer some times not show edited text steps to reproduce select previous writed message in room select edit message in wysiwyg composer not see edited text this is reproduce some times time to time outcome what did you expect if user select edit message test message always must copy in compose what happened instead text not alwasy copy to compose your phone model onepluse operating system version android application version and app store element andrid homeserver synapse will you send logs yes are you willing to provide a pr no | 0 |
47,583 | 5,903,603,117 | IssuesEvent | 2017-05-19 07:21:46 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | closed | SimpleLocalTransportTests#testConcurrentSendRespondAndDisconnect failure | :Network test | Fails because of a suite timeout and doesn't reproduce.
Build url: https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+5.3+multijob-unix-compatibility/os=fedora/104
```
> Assumption #1: only tcp transport has a handshake method
2> WARNING: Suite execution timed out: org.elasticsearch.transport.local.SimpleLocalTransportTests
2> ==== jstack at approximately timeout time ====
1> [2017-04-06T13:59:06,671][INFO ][o.e.t.l.SimpleLocalTransportTests] [testNotifyOnShutdown]: before test
2> "elasticsearch[TS_B_9][local_transport][T#1]" ID=1084 WAITING on java.util.concurrent.LinkedTransferQueue@4aaecf9d
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@4aaecf9d
1> [2017-04-06T13:59:06,680][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[71]}, bound_addresses {local[71]}
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
1> [2017-04-06T13:59:06,681][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[72]}, bound_addresses {local[72]}
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
1> [2017-04-06T13:59:06,683][INFO ][o.e.t.l.SimpleLocalTransportTests] Stop ServiceB now
1> [2017-04-06T13:59:06,685][INFO ][o.e.t.l.SimpleLocalTransportTests] [testNotifyOnShutdown]: after test
1> [2017-04-06T13:59:06,686][INFO ][o.e.t.l.SimpleLocalTransportTests] [testResponseHeadersArePreserved]: before test
1> [2017-04-06T13:59:06,698][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[73]}, bound_addresses {local[73]}
1> [2017-04-06T13:59:06,700][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[74]}, bound_addresses {local[74]}
1> [2017-04-06T13:59:06,724][INFO ][o.e.t.l.SimpleLocalTransportTests] [testResponseHeadersArePreserved]: after test
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
1> [2017-04-06T13:59:06,726][INFO ][o.e.t.l.SimpleLocalTransportTests] [testSendRandomRequests]: before test
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#13]" ID=1083 WAITING on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#12]" ID=1080 WAITING on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#4]" ID=1079 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#3]" ID=1076 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
1> [2017-04-06T13:59:06,729][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[75]}, bound_addresses {local[75]}
1> [2017-04-06T13:59:06,730][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[76]}, bound_addresses {local[76]}
1> [2017-04-06T13:59:06,731][INFO ][o.e.t.t.MockTransportService] publish_address {local[77]}, bound_addresses {local[77]}
1> [2017-04-06T13:59:07,040][INFO ][o.e.t.l.SimpleLocalTransportTests] [testSendRandomRequests]: after test
1> [2017-04-06T13:59:07,042][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTimeoutSendExceptionWithDelayedResponse]: before test
1> [2017-04-06T13:59:07,044][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[78]}, bound_addresses {local[78]}
1> [2017-04-06T13:59:07,045][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[79]}, bound_addresses {local[79]}
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
1> [2017-04-06T13:59:07,161][WARN ][o.e.t.t.MockTransportService] [TS_B] Received response for a request that has timed out, sent [113ms] ago, timed out [13ms] ago, action [sayHelloTimeoutDelayedResponse], node [{TS_A}{TS_A}{1NyEEx-aT3qaawS4qVn9dg}{local}{local[78]}], id [1]
1> [2017-04-06T13:59:07,162][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTimeoutSendExceptionWithDelayedResponse]: after test
1> [2017-04-06T13:59:07,163][INFO ][o.e.t.l.SimpleLocalTransportTests] [testVersionFrom0to0]: before test
1> [2017-04-06T13:59:07,165][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[80]}, bound_addresses {local[80]}
1> [2017-04-06T13:59:07,166][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[81]}, bound_addresses {local[81]}
1> [2017-04-06T13:59:07,167][INFO ][o.e.t.l.SimpleLocalTransportTests] [testVersionFrom0to0]: after test
1> [2017-04-06T13:59:07,168][INFO ][o.e.t.l.SimpleLocalTransportTests] [testBlockingIncomingRequests]: before test
1> [2017-04-06T13:59:07,170][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[82]}, bound_addresses {local[82]}
1> [2017-04-06T13:59:07,171][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[83]}, bound_addresses {local[83]}
1> [2017-04-06T13:59:07,172][INFO ][o.e.t.t.MockTransportService] [TS_TEST] publish_address {local[84]}, bound_addresses {local[84]}
1> [2017-04-06T13:59:07,173][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[85]}, bound_addresses {local[85]}
1> [2017-04-06T13:59:07,175][INFO ][o.e.t.l.SimpleLocalTransportTests] [testBlockingIncomingRequests]: after test
1> [2017-04-06T13:59:07,176][INFO ][o.e.t.l.SimpleLocalTransportTests] [testErrorMessage]: before test
1> [2017-04-06T13:59:07,177][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[86]}, bound_addresses {local[86]}
1> [2017-04-06T13:59:07,178][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[87]}, bound_addresses {local[87]}
1> [2017-04-06T13:59:07,182][INFO ][o.e.t.l.SimpleLocalTransportTests] [testErrorMessage]: after test
1> [2017-04-06T13:59:07,183][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTracerLog]: before test
1> [2017-04-06T13:59:07,184][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[88]}, bound_addresses {local[88]}
1> [2017-04-06T13:59:07,185][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[89]}, bound_addresses {local[89]}
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#2]" ID=1075 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#1]" ID=1073 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
1> [2017-04-06T13:59:07,192][INFO ][o.e.c.s.ClusterSettings ] updating [transport.tracer.include] from [[]] to [["test"]]
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
1> [2017-04-06T13:59:07,192][INFO ][o.e.c.s.ClusterSettings ] updating [transport.tracer.exclude] from [["internal:discovery/zen/fd*","cluster:monitor/nodes/liveness"]] to [["DOESN'T_MATCH"]]
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#9]" ID=1070 WAITING on org.elasticsearch.common.util.concurrent.BaseFuture$Sync@166fd7a5
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.BaseFuture$Sync@166fd7a5
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:248)
2> at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:91)
2> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:42)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase$13.doRun(AbstractSimpleTransportTestCase.java:596)
2> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:638)
2> at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> Locked synchronizers:
2> - java.util.concurrent.ThreadPoolExecutor$Worker@3f16dced
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#4]" ID=1065 WAITING on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][[timer]]" ID=1061 TIMED_WAITING
2> at java.lang.Thread.sleep(Native Method)
2> at org.elasticsearch.threadpool.ThreadPool$CachedTimeThread.run(ThreadPool.java:536)
2> "TEST-SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect-seed#[290BC1B07436B39B]" ID=858 WAITING on java.util.concurrent.CountDownLatch$Sync@2a8130d2
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.CountDownLatch$Sync@2a8130d2
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at java.util.concurrent.CountDownLatch.await(CountDownLatch.java:231)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase.testConcurrentSendRespondAndDisconnect(AbstractSimpleTransportTestCase.java:635)
2> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2> at java.lang.reflect.Method.invoke(Method.java:498)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
1> [2017-04-06T13:59:07,196][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTracerLog]: after test
1> [2017-04-06T13:59:07,198][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTcpHandshakeTimeout]: before test
1> [2017-04-06T13:59:07,199][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[90]}, bound_addresses {local[90]}
1> [2017-04-06T13:59:07,200][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[91]}, bound_addresses {local[91]}
1> [2017-04-06T13:59:07,202][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTcpHandshakeTimeout]: after test
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
2> at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:811)
IGNOR/A 0.01s J1 | SimpleLocalTransportTests.testTcpHandshakeTimeout
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:462)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
> Assumption #1: only tcp transport does a handshake
1> [2017-04-06T13:59:07,204][INFO ][o.e.t.l.SimpleLocalTransportTests] [testThreadContext]: before test
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at java.lang.Thread.run(Thread.java:745)
2> "SUITE-SimpleLocalTransportTests-seed#[290BC1B07436B39B]" ID=857 RUNNABLE
2> at sun.management.ThreadImpl.dumpThreads0(Native Method)
2> at sun.management.ThreadImpl.dumpAllThreads(ThreadImpl.java:454)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.formatThreadStacksFull(ThreadLeakControl.java:675)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.access$1000(ThreadLeakControl.java:64)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$2.evaluate(ThreadLeakControl.java:415)
2> - locked java.lang.Object@6c514907
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSuite(RandomizedRunner.java:678)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.access$200(RandomizedRunner.java:140)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$2.run(RandomizedRunner.java:598)
2> "JUnit4-serializer-daemon" ID=9 TIMED_WAITING
2> at java.lang.Thread.sleep(Native Method)
2> at com.carrotsearch.ant.tasks.junit4.events.Serializer$1.run(Serializer.java:50)
2> "Signal Dispatcher" ID=4 RUNNABLE
2> "Finalizer" ID=3 WAITING on java.lang.ref.ReferenceQueue$Lock@655179d5
2> at java.lang.Object.wait(Native Method)
2> - waiting on java.lang.ref.ReferenceQueue$Lock@655179d5
2> at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
2> at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
2> at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:209)
2> "Reference Handler" ID=2 WAITING on java.lang.ref.Reference$Lock@d6b5825
2> at java.lang.Object.wait(Native Method)
2> - waiting on java.lang.ref.Reference$Lock@d6b5825
2> at java.lang.Object.wait(Object.java:502)
2> at java.lang.ref.Reference.tryHandlePending(Reference.java:191)
2> at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)
2> "main" ID=1 WAITING on com.carrotsearch.randomizedtesting.RandomizedRunner$2@b93c5a6
2> at java.lang.Object.wait(Native Method)
2> - waiting on com.carrotsearch.randomizedtesting.RandomizedRunner$2@b93c5a6
2> at java.lang.Thread.join(Thread.java:1249)
2> at java.lang.Thread.join(Thread.java:1323)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSuite(RandomizedRunner.java:608)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.run(RandomizedRunner.java:457)
2> at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.execute(SlaveMain.java:243)
2> at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.main(SlaveMain.java:354)
1> [2017-04-06T13:59:07,207][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[92]}, bound_addresses {local[92]}
2> at com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe.main(SlaveMainSafe.java:10)
1> [2017-04-06T13:59:07,207][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[93]}, bound_addresses {local[93]}
1> [2017-04-06T13:59:07,215][INFO ][o.e.t.l.SimpleLocalTransportTests] [testThreadContext]: after test
1> [2017-04-06T13:59:07,217][INFO ][o.e.t.l.SimpleLocalTransportTests] [testConcurrentSendRespondAndDisconnect]: before test
1> [2017-04-06T13:59:07,219][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[94]}, bound_addresses {local[94]}
2> ^^==============================================
1> [2017-04-06T13:59:07,219][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[95]}, bound_addresses {local[95]}
1> [2017-04-06T13:59:07,230][INFO ][o.e.t.t.MockTransportService] [TS_B_0] publish_address {local[96]}, bound_addresses {local[96]}
1> [2017-04-06T13:59:07,231][INFO ][o.e.t.t.MockTransportService] [TS_B_3] publish_address {local[97]}, bound_addresses {local[97]}
1> [2017-04-06T13:59:07,231][INFO ][o.e.t.t.MockTransportService] [TS_B_6] publish_address {local[98]}, bound_addresses {local[98]}
1> [2017-04-06T13:59:07,232][INFO ][o.e.t.t.MockTransportService] [TS_B_9] publish_address {local[99]}, bound_addresses {local[99]}
2> REPRODUCE WITH: gradle :core:test -Dtests.seed=290BC1B07436B39B -Dtests.class=org.elasticsearch.transport.local.SimpleLocalTransportTests -Dtests.method="testConcurrentSendRespondAndDisconnect" -Dtests.security.manager=true -Dtests.locale=zh-SG -Dtests.timezone=Asia/Aqtau
ERROR 1199s J1 | SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect <<< FAILURES!
> Throwable #1: java.lang.Exception: Test abandoned because suite timeout was reached.
> at __randomizedtesting.SeedInfo.seed([290BC1B07436B39B]:0)
  2> Apr 06, 2017 2:19:06 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
2> WARNING: Will linger awaiting termination of 11 leaked thread(s).
  2> Apr 06, 2017 2:19:11 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
2> SEVERE: 11 threads leaked from SUITE scope at org.elasticsearch.transport.local.SimpleLocalTransportTests:
2> 1) Thread[id=858, name=TEST-SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect-seed#[290BC1B07436B39B], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
1> [2017-04-06T14:19:11,129][ERROR][o.e.t.l.SimpleLocalTransportTests] caught exception while sending to node {TS_B}{TS_B}{GbRouU_HT0GtKrANTCtUyA}{local}{local[95]}
1> java.lang.IllegalStateException: Future got interrupted
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at java.util.concurrent.CountDownLatch.await(CountDownLatch.java:231)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase.testConcurrentSendRespondAndDisconnect(AbstractSimpleTransportTestCase.java:635)
2> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
1> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:44) ~[main/:?]
1> at org.elasticsearch.transport.AbstractSimpleTransportTestCase$13.doRun(AbstractSimpleTransportTestCase.java:596) ~[framework-5.3.1-SNAPSHOT.jar:?]
2> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2> at java.lang.reflect.Method.invoke(Method.java:498)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
1> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:638) ~[main/:?]
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
2> at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:811)
1> at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[main/:?]
1> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_121]
1> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_121]
1> at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_121]
1> Caused by: java.lang.InterruptedException
1> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998) ~[?:1.8.0_121]
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:462)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
1> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304) ~[?:1.8.0_121]
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
1> at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:248) ~[main/:?]
1> at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:91) ~[main/:?]
1> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:42) ~[main/:?]
1> ... 6 more
1> [2017-04-06T14:19:11,131][INFO ][o.e.t.l.SimpleLocalTransportTests] [testConcurrentSendRespondAndDisconnect]: after test
2> at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at java.lang.Thread.run(Thread.java:745)
2> 2) Thread[id=1076, name=elasticsearch[TS_A][local_transport][T#3], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 3) Thread[id=1075, name=elasticsearch[TS_A][local_transport][T#2], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 4) Thread[id=1080, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#12], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 5) Thread[id=1083, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#13], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 6) Thread[id=1070, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#9], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:248)
ERROR 0.00s J1 | SimpleLocalTransportTests (suite) <<< FAILURES!
> Throwable #1: java.lang.Exception: Suite timeout exceeded (>= 1200000 msec).
> at __randomizedtesting.SeedInfo.seed([290BC1B07436B39B]:0)
Completed [868/902] on J1 in 1205.08s, 27 tests, 2 errors, 6 skipped <<< FAILURES!
2> at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:91)
2> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:42)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase$13.doRun(AbstractSimpleTransportTestCase.java:596)
2> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:638)
2> at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 7) Thread[id=1084, name=elasticsearch[TS_B_9][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 8) Thread[id=1079, name=elasticsearch[TS_A][local_transport][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 9) Thread[id=1061, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][[timer]], state=TIMED_WAITING, group=TGRP-SimpleLocalTransportTests]
2> at java.lang.Thread.sleep(Native Method)
2> at org.elasticsearch.threadpool.ThreadPool$CachedTimeThread.run(ThreadPool.java:536)
2> 10) Thread[id=1065, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
Suite: org.elasticsearch.cluster.routing.allocation.AllocateUnassignedDecisionTests
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 11) Thread[id=1073, name=elasticsearch[TS_A][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
Completed [869/902] on J1 in 0.03s, 6 tests
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
  2> Apr 06, 2017 2:19:11 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
2> INFO: Starting to interrupt leaked threads:
2> 1) Thread[id=858, name=TEST-SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect-seed#[290BC1B07436B39B], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 2) Thread[id=1076, name=elasticsearch[TS_A][local_transport][T#3], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 3) Thread[id=1075, name=elasticsearch[TS_A][local_transport][T#2], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 4) Thread[id=1080, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#12], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 5) Thread[id=1083, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#13], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 6) Thread[id=1070, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#9], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 7) Thread[id=1084, name=elasticsearch[TS_B_9][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 8) Thread[id=1079, name=elasticsearch[TS_A][local_transport][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 9) Thread[id=1061, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][[timer]], state=TIMED_WAITING, group=TGRP-SimpleLocalTransportTests]
2> 10) Thread[id=1065, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 11) Thread[id=1073, name=elasticsearch[TS_A][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> NOTE: leaving temporary files on disk at: /var/lib/jenkins/workspace/elastic+elasticsearch+5.3+multijob-unix-compatibility/os/fedora/core/build/testrun/test/J1/temp/org.elasticsearch.transport.local.SimpleLocalTransportTests_290BC1B07436B39B-001
2> Apr 06, 2017 9:19:11 AM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
2> INFO: All leaked threads terminated.
2> REPRODUCE WITH: gradle :core:test -Dtests.seed=290BC1B07436B39B -Dtests.class=org.elasticsearch.transport.local.SimpleLocalTransportTests -Dtests.security.manager=true -Dtests.locale=en-US -Dtests.timezone=Etc/UTC
2> NOTE: test params are: codec=Asserting(Lucene62), sim=RandomSimilarity(queryNorm=true,coord=yes): {}, locale=zh-SG, timezone=Asia/Aqtau
2> NOTE: Linux 4.9.14-200.fc25.x86_64 amd64/Oracle Corporation 1.8.0_121 (64-bit)/cpus=4,threads=1,free=395954336,total=531628032
2> NOTE: All tests run in this JVM: [WriteableIngestDocumentTests, MatchPhrasePrefixQueryBuilderTests, HasParentQueryBuilderTests, OperationRoutingTests,
```

SimpleLocalTransportTests#testConcurrentSendRespondAndDisconnect failure - Fails because of a suite timeout and doesn't reproduce.
Build url: https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+5.3+multijob-unix-compatibility/os=fedora/104
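The stack trace below shows the test thread parked in `CountDownLatch.await()` at `AbstractSimpleTransportTestCase.java:635` until the 1200s suite timeout fired and the runner interrupted it ("Future got interrupted"). As a minimal sketch of that blocking pattern (not the actual test code), a latch that is never counted down hangs a plain `await()` forever, whereas the timed overload returns `false` instead of relying on the suite timeout to break the deadlock:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class LatchTimeoutSketch {
    public static void main(String[] args) throws InterruptedException {
        // A latch that is never counted down, mimicking responses that never arrive.
        CountDownLatch latch = new CountDownLatch(1);
        // latch.await() would block indefinitely here; the timed variant
        // returns false once the deadline passes instead of hanging.
        boolean completed = latch.await(100, TimeUnit.MILLISECONDS);
        System.out.println("completed=" + completed);
    }
}
```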
```
> Assumption #1: only tcp transport has a handshake method
2> WARNING: Suite execution timed out: org.elasticsearch.transport.local.SimpleLocalTransportTests
2> ==== jstack at approximately timeout time ====
1> [2017-04-06T13:59:06,671][INFO ][o.e.t.l.SimpleLocalTransportTests] [testNotifyOnShutdown]: before test
2> "elasticsearch[TS_B_9][local_transport][T#1]" ID=1084 WAITING on java.util.concurrent.LinkedTransferQueue@4aaecf9d
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@4aaecf9d
1> [2017-04-06T13:59:06,680][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[71]}, bound_addresses {local[71]}
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
1> [2017-04-06T13:59:06,681][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[72]}, bound_addresses {local[72]}
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
1> [2017-04-06T13:59:06,683][INFO ][o.e.t.l.SimpleLocalTransportTests] Stop ServiceB now
1> [2017-04-06T13:59:06,685][INFO ][o.e.t.l.SimpleLocalTransportTests] [testNotifyOnShutdown]: after test
1> [2017-04-06T13:59:06,686][INFO ][o.e.t.l.SimpleLocalTransportTests] [testResponseHeadersArePreserved]: before test
1> [2017-04-06T13:59:06,698][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[73]}, bound_addresses {local[73]}
1> [2017-04-06T13:59:06,700][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[74]}, bound_addresses {local[74]}
1> [2017-04-06T13:59:06,724][INFO ][o.e.t.l.SimpleLocalTransportTests] [testResponseHeadersArePreserved]: after test
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
1> [2017-04-06T13:59:06,726][INFO ][o.e.t.l.SimpleLocalTransportTests] [testSendRandomRequests]: before test
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#13]" ID=1083 WAITING on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#12]" ID=1080 WAITING on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#4]" ID=1079 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#3]" ID=1076 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
1> [2017-04-06T13:59:06,729][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[75]}, bound_addresses {local[75]}
1> [2017-04-06T13:59:06,730][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[76]}, bound_addresses {local[76]}
1> [2017-04-06T13:59:06,731][INFO ][o.e.t.t.MockTransportService] publish_address {local[77]}, bound_addresses {local[77]}
1> [2017-04-06T13:59:07,040][INFO ][o.e.t.l.SimpleLocalTransportTests] [testSendRandomRequests]: after test
1> [2017-04-06T13:59:07,042][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTimeoutSendExceptionWithDelayedResponse]: before test
1> [2017-04-06T13:59:07,044][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[78]}, bound_addresses {local[78]}
1> [2017-04-06T13:59:07,045][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[79]}, bound_addresses {local[79]}
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
1> [2017-04-06T13:59:07,161][WARN ][o.e.t.t.MockTransportService] [TS_B] Received response for a request that has timed out, sent [113ms] ago, timed out [13ms] ago, action [sayHelloTimeoutDelayedResponse], node [{TS_A}{TS_A}{1NyEEx-aT3qaawS4qVn9dg}{local}{local[78]}], id [1]
1> [2017-04-06T13:59:07,162][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTimeoutSendExceptionWithDelayedResponse]: after test
1> [2017-04-06T13:59:07,163][INFO ][o.e.t.l.SimpleLocalTransportTests] [testVersionFrom0to0]: before test
1> [2017-04-06T13:59:07,165][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[80]}, bound_addresses {local[80]}
1> [2017-04-06T13:59:07,166][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[81]}, bound_addresses {local[81]}
1> [2017-04-06T13:59:07,167][INFO ][o.e.t.l.SimpleLocalTransportTests] [testVersionFrom0to0]: after test
1> [2017-04-06T13:59:07,168][INFO ][o.e.t.l.SimpleLocalTransportTests] [testBlockingIncomingRequests]: before test
1> [2017-04-06T13:59:07,170][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[82]}, bound_addresses {local[82]}
1> [2017-04-06T13:59:07,171][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[83]}, bound_addresses {local[83]}
1> [2017-04-06T13:59:07,172][INFO ][o.e.t.t.MockTransportService] [TS_TEST] publish_address {local[84]}, bound_addresses {local[84]}
1> [2017-04-06T13:59:07,173][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[85]}, bound_addresses {local[85]}
1> [2017-04-06T13:59:07,175][INFO ][o.e.t.l.SimpleLocalTransportTests] [testBlockingIncomingRequests]: after test
1> [2017-04-06T13:59:07,176][INFO ][o.e.t.l.SimpleLocalTransportTests] [testErrorMessage]: before test
1> [2017-04-06T13:59:07,177][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[86]}, bound_addresses {local[86]}
1> [2017-04-06T13:59:07,178][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[87]}, bound_addresses {local[87]}
1> [2017-04-06T13:59:07,182][INFO ][o.e.t.l.SimpleLocalTransportTests] [testErrorMessage]: after test
1> [2017-04-06T13:59:07,183][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTracerLog]: before test
1> [2017-04-06T13:59:07,184][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[88]}, bound_addresses {local[88]}
1> [2017-04-06T13:59:07,185][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[89]}, bound_addresses {local[89]}
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#2]" ID=1075 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[TS_A][local_transport][T#1]" ID=1073 WAITING on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.LinkedTransferQueue@7bae44fc
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
1> [2017-04-06T13:59:07,192][INFO ][o.e.c.s.ClusterSettings ] updating [transport.tracer.include] from [[]] to [["test"]]
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
1> [2017-04-06T13:59:07,192][INFO ][o.e.c.s.ClusterSettings ] updating [transport.tracer.exclude] from [["internal:discovery/zen/fd*","cluster:monitor/nodes/liveness"]] to [["DOESN'T_MATCH"]]
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#9]" ID=1070 WAITING on org.elasticsearch.common.util.concurrent.BaseFuture$Sync@166fd7a5
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.BaseFuture$Sync@166fd7a5
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:248)
2> at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:91)
2> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:42)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase$13.doRun(AbstractSimpleTransportTestCase.java:596)
2> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:638)
2> at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> Locked synchronizers:
2> - java.util.concurrent.ThreadPoolExecutor$Worker@3f16dced
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#4]" ID=1065 WAITING on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on org.elasticsearch.common.util.concurrent.EsExecutors$ExecutorScalingQueue@6e85027f
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> "elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][[timer]]" ID=1061 TIMED_WAITING
2> at java.lang.Thread.sleep(Native Method)
2> at org.elasticsearch.threadpool.ThreadPool$CachedTimeThread.run(ThreadPool.java:536)
2> "TEST-SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect-seed#[290BC1B07436B39B]" ID=858 WAITING on java.util.concurrent.CountDownLatch$Sync@2a8130d2
2> at sun.misc.Unsafe.park(Native Method)
2> - waiting on java.util.concurrent.CountDownLatch$Sync@2a8130d2
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at java.util.concurrent.CountDownLatch.await(CountDownLatch.java:231)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase.testConcurrentSendRespondAndDisconnect(AbstractSimpleTransportTestCase.java:635)
2> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2> at java.lang.reflect.Method.invoke(Method.java:498)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
1> [2017-04-06T13:59:07,196][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTracerLog]: after test
1> [2017-04-06T13:59:07,198][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTcpHandshakeTimeout]: before test
1> [2017-04-06T13:59:07,199][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[90]}, bound_addresses {local[90]}
1> [2017-04-06T13:59:07,200][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[91]}, bound_addresses {local[91]}
1> [2017-04-06T13:59:07,202][INFO ][o.e.t.l.SimpleLocalTransportTests] [testTcpHandshakeTimeout]: after test
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
2> at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:811)
IGNOR/A 0.01s J1 | SimpleLocalTransportTests.testTcpHandshakeTimeout
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:462)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
> Assumption #1: only tcp transport does a handshake
1> [2017-04-06T13:59:07,204][INFO ][o.e.t.l.SimpleLocalTransportTests] [testThreadContext]: before test
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at java.lang.Thread.run(Thread.java:745)
2> "SUITE-SimpleLocalTransportTests-seed#[290BC1B07436B39B]" ID=857 RUNNABLE
2> at sun.management.ThreadImpl.dumpThreads0(Native Method)
2> at sun.management.ThreadImpl.dumpAllThreads(ThreadImpl.java:454)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.formatThreadStacksFull(ThreadLeakControl.java:675)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.access$1000(ThreadLeakControl.java:64)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$2.evaluate(ThreadLeakControl.java:415)
2> - locked java.lang.Object@6c514907
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSuite(RandomizedRunner.java:678)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.access$200(RandomizedRunner.java:140)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$2.run(RandomizedRunner.java:598)
2> "JUnit4-serializer-daemon" ID=9 TIMED_WAITING
2> at java.lang.Thread.sleep(Native Method)
2> at com.carrotsearch.ant.tasks.junit4.events.Serializer$1.run(Serializer.java:50)
2> "Signal Dispatcher" ID=4 RUNNABLE
2> "Finalizer" ID=3 WAITING on java.lang.ref.ReferenceQueue$Lock@655179d5
2> at java.lang.Object.wait(Native Method)
2> - waiting on java.lang.ref.ReferenceQueue$Lock@655179d5
2> at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
2> at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
2> at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:209)
2> "Reference Handler" ID=2 WAITING on java.lang.ref.Reference$Lock@d6b5825
2> at java.lang.Object.wait(Native Method)
2> - waiting on java.lang.ref.Reference$Lock@d6b5825
2> at java.lang.Object.wait(Object.java:502)
2> at java.lang.ref.Reference.tryHandlePending(Reference.java:191)
2> at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)
2> "main" ID=1 WAITING on com.carrotsearch.randomizedtesting.RandomizedRunner$2@b93c5a6
2> at java.lang.Object.wait(Native Method)
2> - waiting on com.carrotsearch.randomizedtesting.RandomizedRunner$2@b93c5a6
2> at java.lang.Thread.join(Thread.java:1249)
2> at java.lang.Thread.join(Thread.java:1323)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSuite(RandomizedRunner.java:608)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.run(RandomizedRunner.java:457)
2> at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.execute(SlaveMain.java:243)
2> at com.carrotsearch.ant.tasks.junit4.slave.SlaveMain.main(SlaveMain.java:354)
1> [2017-04-06T13:59:07,207][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[92]}, bound_addresses {local[92]}
2> at com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe.main(SlaveMainSafe.java:10)
1> [2017-04-06T13:59:07,207][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[93]}, bound_addresses {local[93]}
1> [2017-04-06T13:59:07,215][INFO ][o.e.t.l.SimpleLocalTransportTests] [testThreadContext]: after test
1> [2017-04-06T13:59:07,217][INFO ][o.e.t.l.SimpleLocalTransportTests] [testConcurrentSendRespondAndDisconnect]: before test
1> [2017-04-06T13:59:07,219][INFO ][o.e.t.t.MockTransportService] [TS_A] publish_address {local[94]}, bound_addresses {local[94]}
2> ^^==============================================
1> [2017-04-06T13:59:07,219][INFO ][o.e.t.t.MockTransportService] [TS_B] publish_address {local[95]}, bound_addresses {local[95]}
1> [2017-04-06T13:59:07,230][INFO ][o.e.t.t.MockTransportService] [TS_B_0] publish_address {local[96]}, bound_addresses {local[96]}
1> [2017-04-06T13:59:07,231][INFO ][o.e.t.t.MockTransportService] [TS_B_3] publish_address {local[97]}, bound_addresses {local[97]}
1> [2017-04-06T13:59:07,231][INFO ][o.e.t.t.MockTransportService] [TS_B_6] publish_address {local[98]}, bound_addresses {local[98]}
1> [2017-04-06T13:59:07,232][INFO ][o.e.t.t.MockTransportService] [TS_B_9] publish_address {local[99]}, bound_addresses {local[99]}
2> REPRODUCE WITH: gradle :core:test -Dtests.seed=290BC1B07436B39B -Dtests.class=org.elasticsearch.transport.local.SimpleLocalTransportTests -Dtests.method="testConcurrentSendRespondAndDisconnect" -Dtests.security.manager=true -Dtests.locale=zh-SG -Dtests.timezone=Asia/Aqtau
ERROR 1199s J1 | SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect <<< FAILURES!
> Throwable #1: java.lang.Exception: Test abandoned because suite timeout was reached.
> at __randomizedtesting.SeedInfo.seed([290BC1B07436B39B]:0)
  2> Apr 06, 2017 2:19:06 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
2> WARNING: Will linger awaiting termination of 11 leaked thread(s).
  2> Apr 06, 2017 2:19:11 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
2> SEVERE: 11 threads leaked from SUITE scope at org.elasticsearch.transport.local.SimpleLocalTransportTests:
2> 1) Thread[id=858, name=TEST-SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect-seed#[290BC1B07436B39B], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
1> [2017-04-06T14:19:11,129][ERROR][o.e.t.l.SimpleLocalTransportTests] caught exception while sending to node {TS_B}{TS_B}{GbRouU_HT0GtKrANTCtUyA}{local}{local[95]}
1> java.lang.IllegalStateException: Future got interrupted
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at java.util.concurrent.CountDownLatch.await(CountDownLatch.java:231)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase.testConcurrentSendRespondAndDisconnect(AbstractSimpleTransportTestCase.java:635)
2> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
1> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:44) ~[main/:?]
1> at org.elasticsearch.transport.AbstractSimpleTransportTestCase$13.doRun(AbstractSimpleTransportTestCase.java:596) ~[framework-5.3.1-SNAPSHOT.jar:?]
2> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2> at java.lang.reflect.Method.invoke(Method.java:498)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
1> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:638) ~[main/:?]
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
2> at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:811)
1> at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[main/:?]
1> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_121]
1> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_121]
1> at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_121]
1> Caused by: java.lang.InterruptedException
1> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998) ~[?:1.8.0_121]
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:462)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
2> at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
2> at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
1> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304) ~[?:1.8.0_121]
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
1> at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:248) ~[main/:?]
1> at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:91) ~[main/:?]
1> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:42) ~[main/:?]
1> ... 6 more
1> [2017-04-06T14:19:11,131][INFO ][o.e.t.l.SimpleLocalTransportTests] [testConcurrentSendRespondAndDisconnect]: after test
2> at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
2> at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
2> at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
2> at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
2> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
2> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
2> at java.lang.Thread.run(Thread.java:745)
2> 2) Thread[id=1076, name=elasticsearch[TS_A][local_transport][T#3], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 3) Thread[id=1075, name=elasticsearch[TS_A][local_transport][T#2], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 4) Thread[id=1080, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#12], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 5) Thread[id=1083, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#13], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 6) Thread[id=1070, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#9], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
2> at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
2> at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:248)
ERROR 0.00s J1 | SimpleLocalTransportTests (suite) <<< FAILURES!
> Throwable #1: java.lang.Exception: Suite timeout exceeded (>= 1200000 msec).
> at __randomizedtesting.SeedInfo.seed([290BC1B07436B39B]:0)
Completed [868/902] on J1 in 1205.08s, 27 tests, 2 errors, 6 skipped <<< FAILURES!
2> at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:91)
2> at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:42)
2> at org.elasticsearch.transport.AbstractSimpleTransportTestCase$13.doRun(AbstractSimpleTransportTestCase.java:596)
2> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:638)
2> at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 7) Thread[id=1084, name=elasticsearch[TS_B_9][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 8) Thread[id=1079, name=elasticsearch[TS_A][local_transport][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 9) Thread[id=1061, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][[timer]], state=TIMED_WAITING, group=TGRP-SimpleLocalTransportTests]
2> at java.lang.Thread.sleep(Native Method)
2> at org.elasticsearch.threadpool.ThreadPool$CachedTimeThread.run(ThreadPool.java:536)
2> 10) Thread[id=1065, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
Suite: org.elasticsearch.cluster.routing.allocation.AllocateUnassignedDecisionTests
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
2> 11) Thread[id=1073, name=elasticsearch[TS_A][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> at sun.misc.Unsafe.park(Native Method)
Completed [869/902] on J1 in 0.03s, 6 tests
2> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
2> at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:737)
2> at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:647)
2> at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1269)
2> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
2> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
2> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2> at java.lang.Thread.run(Thread.java:745)
  2> Apr 06, 2017 2:19:11 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
2> INFO: Starting to interrupt leaked threads:
2> 1) Thread[id=858, name=TEST-SimpleLocalTransportTests.testConcurrentSendRespondAndDisconnect-seed#[290BC1B07436B39B], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 2) Thread[id=1076, name=elasticsearch[TS_A][local_transport][T#3], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 3) Thread[id=1075, name=elasticsearch[TS_A][local_transport][T#2], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 4) Thread[id=1080, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#12], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 5) Thread[id=1083, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#13], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 6) Thread[id=1070, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#9], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 7) Thread[id=1084, name=elasticsearch[TS_B_9][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 8) Thread[id=1079, name=elasticsearch[TS_A][local_transport][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 9) Thread[id=1061, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][[timer]], state=TIMED_WAITING, group=TGRP-SimpleLocalTransportTests]
2> 10) Thread[id=1065, name=elasticsearch[org.elasticsearch.transport.local.SimpleLocalTransportTests][generic][T#4], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> 11) Thread[id=1073, name=elasticsearch[TS_A][local_transport][T#1], state=WAITING, group=TGRP-SimpleLocalTransportTests]
2> NOTE: leaving temporary files on disk at: /var/lib/jenkins/workspace/elastic+elasticsearch+5.3+multijob-unix-compatibility/os/fedora/core/build/testrun/test/J1/temp/org.elasticsearch.transport.local.SimpleLocalTransportTests_290BC1B07436B39B-001
2> Apr 06, 2017 9:19:11 AM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
2> INFO: All leaked threads terminated.
2> REPRODUCE WITH: gradle :core:test -Dtests.seed=290BC1B07436B39B -Dtests.class=org.elasticsearch.transport.local.SimpleLocalTransportTests -Dtests.security.manager=true -Dtests.locale=en-US -Dtests.timezone=Etc/UTC
2> NOTE: test params are: codec=Asserting(Lucene62), sim=RandomSimilarity(queryNorm=true,coord=yes): {}, locale=zh-SG, timezone=Asia/Aqtau
2> NOTE: Linux 4.9.14-200.fc25.x86_64 amd64/Oracle Corporation 1.8.0_121 (64-bit)/cpus=4,threads=1,free=395954336,total=531628032
2> NOTE: All tests run in this JVM: [WriteableIngestDocumentTests, MatchPhrasePrefixQueryBuilderTests, HasParentQueryBuilderTests, OperationRoutingTests,
```
294,734 | 25,399,443,396 | IssuesEvent | 2022-11-22 10:57:47 | zonemaster/zonemaster | https://api.github.com/repos/zonemaster/zonemaster | closed | Nameserver11 will not capture misbehaving name server | T-Bug A-TestCase | Test case Nameserver11 was updated by #993. Implementation of that is done in PR zonemaster/zonemaster-engine#1034.
The updated test case will, however, not capture all misbehaving name servers. In the example below the name server accepts default EDNS (no options) but returns FORMERR when an EDNS option is included. That misbehavior is not captured with the updated specification.
The current specification (before the update) and implementation does not capture it either:
```
$ zonemaster-cli bemacom.se --show-testcase --test Nameserver --level INFO
Seconds Level Testcase Message
======= ========= ============== =======
(...)
0.49 INFO NAMESERVER02 The following nameservers support EDNS0 : dns1.elbrev.com/92.33.14.98;dns2.elbrev.com/194.17.14.66.
(...)
0.61 WARNING NAMESERVER11 Nameserver dns2.elbrev.com/194.17.14.66 does not support EDNS0 (replies with FORMERR).
0.62 WARNING NAMESERVER11 Nameserver dns1.elbrev.com/92.33.14.98 does not support EDNS0 (replies with FORMERR).
```
```
; <<>> DiG 9.18.7 <<>> bemacom.se @92.33.14.98 soa +nocookie +mult +norec
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 22667
;; flags: qr aa; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 2
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4000
;; QUESTION SECTION:
;bemacom.se. IN SOA
;; ANSWER SECTION:
bemacom.se. 7200 IN SOA dns1.elbrev.com. info.elbrev.com. (
2020020326 ; serial
14400 ; refresh (4 hours)
3600 ; retry (1 hour)
2000000 ; expire (3 weeks 2 days 3 hours 33 minutes 20 seconds)
7200 ; minimum (2 hours)
)
```
```
; <<>> DiG 9.18.7 <<>> bemacom.se @92.33.14.98 soa +nocookie +mult +norec +ednsopt=99:dead
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: FORMERR, id: 53870
;; flags: qr; QUERY: 1, ANSWER: 0, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 1232
; OPT=99: de ad ("..")
;; QUESTION SECTION:
;bemacom.se. IN SOA
;; Query time: 6 msec
;; SERVER: 92.33.14.98#53(92.33.14.98) (UDP)
;; WHEN: Fri Oct 21 09:47:04 UTC 2022
;; MSG SIZE rcvd: 45
```
| 1.0 | Nameserver11 will not capture misbehaving name server - Test case Nameserver11 was updated by #993. Implementation of that is done in PR zonemaster/zonemaster-engine#1034.
The updated test case will, however, not capture all misbehaving name servers. In the example below the name server accepts default EDNS (no options) but returns FORMERR when an EDNS option is included. That misbehavior is not captured with the updated specification.
The current specification (before the update) and implementation does not capture it either:
```
$ zonemaster-cli bemacom.se --show-testcase --test Nameserver --level INFO
Seconds Level Testcase Message
======= ========= ============== =======
(...)
0.49 INFO NAMESERVER02 The following nameservers support EDNS0 : dns1.elbrev.com/92.33.14.98;dns2.elbrev.com/194.17.14.66.
(...)
0.61 WARNING NAMESERVER11 Nameserver dns2.elbrev.com/194.17.14.66 does not support EDNS0 (replies with FORMERR).
0.62 WARNING NAMESERVER11 Nameserver dns1.elbrev.com/92.33.14.98 does not support EDNS0 (replies with FORMERR).
```
```
; <<>> DiG 9.18.7 <<>> bemacom.se @92.33.14.98 soa +nocookie +mult +norec
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 22667
;; flags: qr aa; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 2
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4000
;; QUESTION SECTION:
;bemacom.se. IN SOA
;; ANSWER SECTION:
bemacom.se. 7200 IN SOA dns1.elbrev.com. info.elbrev.com. (
2020020326 ; serial
14400 ; refresh (4 hours)
3600 ; retry (1 hour)
2000000 ; expire (3 weeks 2 days 3 hours 33 minutes 20 seconds)
7200 ; minimum (2 hours)
)
```
```
; <<>> DiG 9.18.7 <<>> bemacom.se @92.33.14.98 soa +nocookie +mult +norec +ednsopt=99:dead
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: FORMERR, id: 53870
;; flags: qr; QUERY: 1, ANSWER: 0, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 1232
; OPT=99: de ad ("..")
;; QUESTION SECTION:
;bemacom.se. IN SOA
;; Query time: 6 msec
;; SERVER: 92.33.14.98#53(92.33.14.98) (UDP)
;; WHEN: Fri Oct 21 09:47:04 UTC 2022
;; MSG SIZE rcvd: 45
```
| test | will not capture misbehaving name server test case was updated by implementation of that is done in pr zonemaster zonemaster engine the updated test case will however not capture all misbehaving name servers in the example below the name server accepts default edns no options but returns formerr when edns option is included that misbehavior is not captured with the updated specification the current specification before the update and implementation does not capture it either zonemaster cli bemacom se show testcase test nameserver level info seconds level testcase message info the following nameservers support elbrev com elbrev com warning nameserver elbrev com does not support replies with formerr warning nameserver elbrev com does not support replies with formerr dig bemacom se soa nocookie mult norec global options cmd got answer header opcode query status noerror id flags qr aa query answer authority additional opt pseudosection edns version flags udp question section bemacom se in soa answer section bemacom se in soa elbrev com info elbrev com serial refresh hours retry hour expire weeks days hours minutes seconds minimum hours dig bemacom se soa nocookie mult norec ednsopt dead global options cmd got answer header opcode query status formerr id flags qr query answer authority additional opt pseudosection edns version flags udp opt de ad question section bemacom se in soa query time msec server udp when fri oct utc msg size rcvd | 1
261,647 | 8,244,538,725 | IssuesEvent | 2018-09-11 06:44:42 | uwhumansvszombies/uwhvz | https://api.github.com/repos/uwhumansvszombies/uwhvz | opened | Upgrade & Fix Wagtail | backend medium priority | #100 🎉 ... except we have to upgrade and fix Wagtail for the logic-breaking #96 PR that just launched. Whoops. | 1.0 | Upgrade & Fix Wagtail - #100 🎉 ... except we have to upgrade and fix Wagtail for the logic-breaking #96 PR that just launched. Whoops. | non_test | upgrade fix wagtail 🎉 except we have to upgrade and fix wagtail for the logic breaking pr that just launched whoops | 0 |
14,699 | 3,418,741,386 | IssuesEvent | 2015-12-08 04:34:31 | aidanlane/snapapps | https://api.github.com/repos/aidanlane/snapapps | closed | Key released header block | enhancement jmss testing day | A user asked for a block that allowed them to check when a key was released, rather than when it was pressed. | 1.0 | Key released header block - A user asked for a block that allowed them to check when a key was released, rather than when it was pressed. | test | key released header block a user asked for a block that allowed them to check when a key was released rather than when it was pressed | 1 |
11,565 | 31,047,300,149 | IssuesEvent | 2023-08-11 01:48:09 | Vector35/binaryninja-api | https://api.github.com/repos/Vector35/binaryninja-api | closed | Code review needed before mips64 support into DEV. | Type: Enhancement Component: Architecture Arch: MIPS | We briefly discussed a code review and also some upcoming changes that could resolve the ELF mips id issue.
https://github.com/Vector35/arch-mips/pull/9
The only unit test this one should violate is the one listing all architectures, as it adds the new mips64. | 1.0 | Code review needed before mips64 support into DEV. - We briefly discussed a code review and also some upcoming changes that could resolve the ELF mips id issue.
https://github.com/Vector35/arch-mips/pull/9
The only unit test this one should violate is the one listing all architectures, as it adds the new mips64. | non_test | code review needed before support into dev we briefly discussed a code review and also some upcoming changes that could resolve the elf mips id issue the only unit test this one should violate is the one listing all architectures as it adds the new | 0 |
246,933 | 20,926,171,362 | IssuesEvent | 2022-03-24 23:18:59 | COS-301/graduates | https://api.github.com/repos/COS-301/graduates | closed | Research testing plans | priority:medium type:test role:tester | Task: research how our team can test any code that is created | 2.0 | Research testing plans - Task: research how our team can test any code that is created | test | research testing plans task research how our team can test any code that is created | 1
17,291 | 2,998,142,019 | IssuesEvent | 2015-07-23 12:30:23 | contao/core-bundle | https://api.github.com/repos/contao/core-bundle | closed | TL_CSS needs to be prefixed with “web/” for combined CSS files | defect | If you add a stylesheet via `$GLOBALS['TL_CSS']`:
```php
// Works always
$GLOBALS['TL_CSS'][] = 'bundles/mybundle/file.css';
// Doesn’t work if not in debug mode
$GLOBALS['TL_CSS'][] = 'bundles/mybundle/file.css|static';
// Always doesn’t work
$GLOBALS['TL_CSS'][] = 'web/bundles/mybundle/file.css';
// Works if not in debug mode
$GLOBALS['TL_CSS'][] = 'web/bundles/mybundle/file.css|static';
``` | 1.0 | TL_CSS needs to be prefixed with “web/” for combined CSS files - If you add a stylesheet via `$GLOBALS['TL_CSS']`:
```php
// Works always
$GLOBALS['TL_CSS'][] = 'bundles/mybundle/file.css';
// Doesn’t work if not in debug mode
$GLOBALS['TL_CSS'][] = 'bundles/mybundle/file.css|static';
// Always doesn’t work
$GLOBALS['TL_CSS'][] = 'web/bundles/mybundle/file.css';
// Works if not in debug mode
$GLOBALS['TL_CSS'][] = 'web/bundles/mybundle/file.css|static';
``` | non_test | tl css needs to be prefixed with “web ” for combined css files if you add a stylesheet via globals php works always globals bundles mybundle file css doesn’t work if not in debug mode globals bundles mybundle file css static always doesn’t work globals web bundles mybundle file css works if not in debug mode globals web bundles mybundle file css static | 0 |
293,784 | 25,321,480,896 | IssuesEvent | 2022-11-18 04:34:38 | dotnet/sdk | https://api.github.com/repos/dotnet/sdk | closed | dotnet test ignores -property command line parameters | Area-DotNet Test untriaged | ### Describe the bug.
In a command like `dotnet test something.csproj -property:foo=bar`, the property is ignored.
### To Reproduce
In the attached solution, the following all correctly fail with a warning treated as an error:
```
dotnet build -property:TreatWarningsAsErrors=true
dotnet test -property:TreatWarningsAsErrors=true
dotnet build TestProject2.sln -property:TreatWarningsAsErrors=true
dotnet build TestProject2.csproj -property:TreatWarningsAsErrors=true
```
However, the following succeed and merely produce a warning, ignoring the property:
```
dotnet test TestProject2.sln -property:TreatWarningsAsErrors=true
dotnet test TestProject2.csproj -property:TreatWarningsAsErrors=true
```
### Further technical details
```
.NET SDK:
Version: 7.0.100
Commit: e12b7af219
Runtime Environment:
OS Name: Mac OS X
OS Version: 12.6
OS Platform: Darwin
RID: osx.12-x64
Base Path: /usr/local/share/dotnet/sdk/7.0.100/
Host:
Version: 7.0.0
Architecture: x64
Commit: d099f075e4
.NET SDKs installed:
7.0.100 [/usr/local/share/dotnet/sdk]
.NET runtimes installed:
Microsoft.AspNetCore.App 7.0.0 [/usr/local/share/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.NETCore.App 7.0.0 [/usr/local/share/dotnet/shared/Microsoft.NETCore.App]
Other architectures found:
None
Environment variables:
Not set
global.json file:
Not found
Learn more:
https://aka.ms/dotnet/info
Download .NET:
https://aka.ms/dotnet/download
```
[TestProject2.zip](https://github.com/dotnet/sdk/files/10037706/TestProject2.zip)
| 1.0 | dotnet test ignores -property command line parameters - ### Describe the bug.
In a command like `dotnet test something.csproj -property:foo=bar`, the property is ignored.
### To Reproduce
In the attached solution, the following all correctly fail with a warning treated as an error:
```
dotnet build -property:TreatWarningsAsErrors=true
dotnet test -property:TreatWarningsAsErrors=true
dotnet build TestProject2.sln -property:TreatWarningsAsErrors=true
dotnet build TestProject2.csproj -property:TreatWarningsAsErrors=true
```
However, the following succeed and merely produce a warning, ignoring the property:
```
dotnet test TestProject2.sln -property:TreatWarningsAsErrors=true
dotnet test TestProject2.csproj -property:TreatWarningsAsErrors=true
```
### Further technical details
```
.NET SDK:
Version: 7.0.100
Commit: e12b7af219
Runtime Environment:
OS Name: Mac OS X
OS Version: 12.6
OS Platform: Darwin
RID: osx.12-x64
Base Path: /usr/local/share/dotnet/sdk/7.0.100/
Host:
Version: 7.0.0
Architecture: x64
Commit: d099f075e4
.NET SDKs installed:
7.0.100 [/usr/local/share/dotnet/sdk]
.NET runtimes installed:
Microsoft.AspNetCore.App 7.0.0 [/usr/local/share/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.NETCore.App 7.0.0 [/usr/local/share/dotnet/shared/Microsoft.NETCore.App]
Other architectures found:
None
Environment variables:
Not set
global.json file:
Not found
Learn more:
https://aka.ms/dotnet/info
Download .NET:
https://aka.ms/dotnet/download
```
[TestProject2.zip](https://github.com/dotnet/sdk/files/10037706/TestProject2.zip)
| test | dotnet test ignores property command line parameters describe the bug in a command like dotnet test something csproj property foo bar the property is ignored to reproduce in the attached solution the following all correctly fail with a warning treated as error dotnet build property treatwarningsaserrors true dotnet test property treatwarningsaserrors true dotnet build sln property treatwarningsaserrors true dotnet build csproj property treatwarningsaserrors true however the following succeed and merely produce a warning ignoring the property dotnet test sln property treatwarningsaserrors true dotnet test csproj property treatwarningsaserrors true further technical details net sdk version commit runtime environment os name mac os x os version os platform darwin rid osx base path usr local share dotnet sdk host version architecture commit net sdks installed net runtimes installed microsoft aspnetcore app microsoft netcore app other architectures found none environment variables not set global json file not found learn more download net | 1 |
303,395 | 23,018,483,623 | IssuesEvent | 2022-07-22 00:55:45 | stevenloboorg1/43 | https://api.github.com/repos/stevenloboorg1/43 | opened | Repository protection settings not added | documentation | Repository protection settings not added since repo is private | 1.0 | Repository protection settings not added - Repository protection settings not added since repo is private | non_test | repository protection settings not added repository protection settings not added since repo is private | 0 |
93,686 | 8,441,483,016 | IssuesEvent | 2018-10-18 10:21:05 | humera987/FXLabs-Test-Automation | https://api.github.com/repos/humera987/FXLabs-Test-Automation | opened | FX_Testing : ApiV1AlertsGetAnonymousInvalid | FX_Testing | Project : FX_Testing
Job : UAT
Env : UAT
Region : US_WEST_3
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 18 Oct 2018 10:21:05 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/alerts
Request :
Response :
{
"timestamp" : "2018-10-18T10:21:05.242+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/alerts"
}
Logs :
Assertion [@StatusCode == 401 OR @StatusCode == 403] resolved-to [404 == 401 OR 404 == 403] result [Failed]
--- FX Bot --- | 1.0 | FX_Testing : ApiV1AlertsGetAnonymousInvalid - Project : FX_Testing
Job : UAT
Env : UAT
Region : US_WEST_3
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 18 Oct 2018 10:21:05 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/alerts
Request :
Response :
{
"timestamp" : "2018-10-18T10:21:05.242+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/alerts"
}
Logs :
Assertion [@StatusCode == 401 OR @StatusCode == 403] resolved-to [404 == 401 OR 404 == 403] result [Failed]
--- FX Bot --- | test | fx testing project fx testing job uat env uat region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options content type transfer encoding date endpoint request response timestamp status error not found message no message available path api api alerts logs assertion resolved to result fx bot | 1 |
330,137 | 10,035,331,072 | IssuesEvent | 2019-07-18 08:05:03 | googleapis/google-cloud-java | https://api.github.com/repos/googleapis/google-cloud-java | closed | Synthesis failed for compute | api: compute autosynth failure priority: p1 type: bug | Hello! Autosynth couldn't regenerate compute. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-compute'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/google-cloud-clients/google-cloud-compute/synth.py.
synthtool > Cloning discovery-artifact-manager.
synthtool > Running generator for gapic/google/compute/artman_compute.yaml.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
0.22.0: Pulling from googleapis/artman
Digest: sha256:72f6287a42490bfe1609aed491f29411af21df3f744199fe8bb8d276c1fdf419
Status: Image is up to date for googleapis/artman:0.22.0
synthtool > Failed executing docker run --name artman-docker --rm -i -e HOST_USER_ID=1000 -e HOST_GROUP_ID=1000 -e RUNNING_IN_ARTMAN_DOCKER=True -v /home/kbuilder/.cache/synthtool/discovery-artifact-manager:/home/kbuilder/.cache/synthtool/discovery-artifact-manager -v /home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles:/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles -w /home/kbuilder/.cache/synthtool/discovery-artifact-manager googleapis/artman:0.22.0 /bin/bash -c artman --local --config gapic/google/compute/artman_compute.yaml generate java_discogapic:
/artman/artman/config/loader.py:99: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
json_string = json.dumps(yaml.load(f))
/artman/artman/config/loader.py:126: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
artman_config_json_string = json.dumps(yaml.load(f))
artman> Final args:
artman> api_name: compute
artman> api_version: v1
artman> artifact_type: DISCOGAPIC
artman> aspect: ALL
artman> discovery_doc: discoveries/compute.v1.json
artman> gapic_code_dir: /home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles/java/gapic-google-cloud-compute-v1
artman> gapic_yaml: /home/kbuilder/.cache/synthtool/discovery-artifact-manager/gapic/google/compute/v1/compute_gapic.yaml
artman> generator_args: null
artman> import_proto_path:
artman> - /home/kbuilder/.cache/synthtool/discovery-artifact-manager
artman> language: java
artman> organization_name: google-cloud
artman> output_dir: /home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles
artman> proto_deps: []
artman> root_dir: /home/kbuilder/.cache/synthtool/discovery-artifact-manager
artman> service_yaml: ''
artman> src_proto_path: []
artman> toolkit_path: /toolkit
artman>
artman> Creating DiscoGapicClientPipeline.
artman.output >
Exception in thread "main" java.lang.NullPointerException
at com.google.api.codegen.config.DiscoGapicInterfaceContext.createWithInterface(DiscoGapicInterfaceContext.java:85)
at com.google.api.codegen.discogapic.transformer.java.JavaDiscoGapicSurfaceTransformer.newInterfaceContext(JavaDiscoGapicSurfaceTransformer.java:97)
at com.google.api.codegen.discogapic.transformer.java.JavaDiscoGapicRequestToViewTransformer.transform(JavaDiscoGapicRequestToViewTransformer.java:123)
at com.google.api.codegen.discogapic.transformer.java.JavaDiscoGapicRequestToViewTransformer.transform(JavaDiscoGapicRequestToViewTransformer.java:68)
at com.google.api.codegen.discogapic.DiscoGapicGenerator.generate(DiscoGapicGenerator.java:66)
at com.google.api.codegen.discogapic.DiscoGapicGeneratorApp.run(DiscoGapicGeneratorApp.java:185)
at com.google.api.codegen.GeneratorMain.discoGapicMain(GeneratorMain.java:475)
at com.google.api.codegen.GeneratorMain.main(GeneratorMain.java:184)
artman> Traceback (most recent call last):
File "/artman/artman/cli/main.py", line 71, in main
engine.run()
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 159, in run
for _state in self.run_iter():
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 223, in run_iter
failure.Failure.reraise_if_any(it)
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 292, in reraise_if_any
failures[0].reraise()
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 299, in reraise
six.reraise(*self._exc_info)
File "/usr/local/lib/python3.5/dist-packages/six.py", line 693, in reraise
raise value
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/executor.py", line 82, in _execute_task
result = task.execute(**arguments)
File "/artman/artman/tasks/gapic_tasks.py", line 159, in execute
task_utils.gapic_gen_task(toolkit_path, ['LEGACY_DISCOGAPIC_AND_PACKAGE'] + args))
File "/artman/artman/tasks/task_base.py", line 64, in exec_command
raise e
File "/artman/artman/tasks/task_base.py", line 56, in exec_command
output = subprocess.check_output(args, stderr=subprocess.STDOUT)
File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
**kwargs).stdout
File "/usr/lib/python3.5/subprocess.py", line 708, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['java', '-cp', '/toolkit/build/libs/gapic-generator-latest-fatjar.jar', 'com.google.api.codegen.GeneratorMain', 'LEGACY_DISCOGAPIC_AND_PACKAGE', '--discovery_doc=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/discoveries/compute.v1.json', '--package_yaml2=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles/java_google-cloud-compute-v1_package2.yaml', '--output=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles/java/gapic-google-cloud-compute-v1', '--language=java', '--gapic_yaml=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/gapic/google/compute/v1/compute_gapic.yaml']' returned non-zero exit status 1
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/google-cloud-clients/google-cloud-compute/synth.py", line 32, in <module>
artman_output_name='')
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/discogapic_generator.py", line 52, in java_library
return self._generate_code(service, version, "java", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/discogapic_generator.py", line 98, in _generate_code
gapic_language_arg,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/artman.py", line 127, in run
shell.run(cmd, cwd=root_dir)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['docker', 'run', '--name', 'artman-docker', '--rm', '-i', '-e', 'HOST_USER_ID=1000', '-e', 'HOST_GROUP_ID=1000', '-e', 'RUNNING_IN_ARTMAN_DOCKER=True', '-v', '/home/kbuilder/.cache/synthtool/discovery-artifact-manager:/home/kbuilder/.cache/synthtool/discovery-artifact-manager', '-v', '/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles:/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles', '-w', PosixPath('/home/kbuilder/.cache/synthtool/discovery-artifact-manager'), 'googleapis/artman:0.22.0', '/bin/bash', '-c', 'artman --local --config gapic/google/compute/artman_compute.yaml generate java_discogapic']' returned non-zero exit status 32.
synthtool > Cleaned up 0 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/62e8be32-56bf-4013-b924-7b4e1a78effd).
| 1.0 | Synthesis failed for compute - Hello! Autosynth couldn't regenerate compute. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-compute'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/google-cloud-clients/google-cloud-compute/synth.py.
synthtool > Cloning discovery-artifact-manager.
synthtool > Running generator for gapic/google/compute/artman_compute.yaml.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
0.22.0: Pulling from googleapis/artman
Digest: sha256:72f6287a42490bfe1609aed491f29411af21df3f744199fe8bb8d276c1fdf419
Status: Image is up to date for googleapis/artman:0.22.0
synthtool > Failed executing docker run --name artman-docker --rm -i -e HOST_USER_ID=1000 -e HOST_GROUP_ID=1000 -e RUNNING_IN_ARTMAN_DOCKER=True -v /home/kbuilder/.cache/synthtool/discovery-artifact-manager:/home/kbuilder/.cache/synthtool/discovery-artifact-manager -v /home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles:/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles -w /home/kbuilder/.cache/synthtool/discovery-artifact-manager googleapis/artman:0.22.0 /bin/bash -c artman --local --config gapic/google/compute/artman_compute.yaml generate java_discogapic:
/artman/artman/config/loader.py:99: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
json_string = json.dumps(yaml.load(f))
/artman/artman/config/loader.py:126: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
artman_config_json_string = json.dumps(yaml.load(f))
artman> Final args:
artman> api_name: compute
artman> api_version: v1
artman> artifact_type: DISCOGAPIC
artman> aspect: ALL
artman> discovery_doc: discoveries/compute.v1.json
artman> gapic_code_dir: /home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles/java/gapic-google-cloud-compute-v1
artman> gapic_yaml: /home/kbuilder/.cache/synthtool/discovery-artifact-manager/gapic/google/compute/v1/compute_gapic.yaml
artman> generator_args: null
artman> import_proto_path:
artman> - /home/kbuilder/.cache/synthtool/discovery-artifact-manager
artman> language: java
artman> organization_name: google-cloud
artman> output_dir: /home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles
artman> proto_deps: []
artman> root_dir: /home/kbuilder/.cache/synthtool/discovery-artifact-manager
artman> service_yaml: ''
artman> src_proto_path: []
artman> toolkit_path: /toolkit
artman>
artman> Creating DiscoGapicClientPipeline.
artman.output >
Exception in thread "main" java.lang.NullPointerException
at com.google.api.codegen.config.DiscoGapicInterfaceContext.createWithInterface(DiscoGapicInterfaceContext.java:85)
at com.google.api.codegen.discogapic.transformer.java.JavaDiscoGapicSurfaceTransformer.newInterfaceContext(JavaDiscoGapicSurfaceTransformer.java:97)
at com.google.api.codegen.discogapic.transformer.java.JavaDiscoGapicRequestToViewTransformer.transform(JavaDiscoGapicRequestToViewTransformer.java:123)
at com.google.api.codegen.discogapic.transformer.java.JavaDiscoGapicRequestToViewTransformer.transform(JavaDiscoGapicRequestToViewTransformer.java:68)
at com.google.api.codegen.discogapic.DiscoGapicGenerator.generate(DiscoGapicGenerator.java:66)
at com.google.api.codegen.discogapic.DiscoGapicGeneratorApp.run(DiscoGapicGeneratorApp.java:185)
at com.google.api.codegen.GeneratorMain.discoGapicMain(GeneratorMain.java:475)
at com.google.api.codegen.GeneratorMain.main(GeneratorMain.java:184)
artman> Traceback (most recent call last):
File "/artman/artman/cli/main.py", line 71, in main
engine.run()
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 159, in run
for _state in self.run_iter():
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 223, in run_iter
failure.Failure.reraise_if_any(it)
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 292, in reraise_if_any
failures[0].reraise()
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 299, in reraise
six.reraise(*self._exc_info)
File "/usr/local/lib/python3.5/dist-packages/six.py", line 693, in reraise
raise value
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/executor.py", line 82, in _execute_task
result = task.execute(**arguments)
File "/artman/artman/tasks/gapic_tasks.py", line 159, in execute
task_utils.gapic_gen_task(toolkit_path, ['LEGACY_DISCOGAPIC_AND_PACKAGE'] + args))
File "/artman/artman/tasks/task_base.py", line 64, in exec_command
raise e
File "/artman/artman/tasks/task_base.py", line 56, in exec_command
output = subprocess.check_output(args, stderr=subprocess.STDOUT)
File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
**kwargs).stdout
File "/usr/lib/python3.5/subprocess.py", line 708, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['java', '-cp', '/toolkit/build/libs/gapic-generator-latest-fatjar.jar', 'com.google.api.codegen.GeneratorMain', 'LEGACY_DISCOGAPIC_AND_PACKAGE', '--discovery_doc=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/discoveries/compute.v1.json', '--package_yaml2=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles/java_google-cloud-compute-v1_package2.yaml', '--output=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles/java/gapic-google-cloud-compute-v1', '--language=java', '--gapic_yaml=/home/kbuilder/.cache/synthtool/discovery-artifact-manager/gapic/google/compute/v1/compute_gapic.yaml']' returned non-zero exit status 1
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/google-cloud-clients/google-cloud-compute/synth.py", line 32, in <module>
artman_output_name='')
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/discogapic_generator.py", line 52, in java_library
return self._generate_code(service, version, "java", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/discogapic_generator.py", line 98, in _generate_code
gapic_language_arg,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/artman.py", line 127, in run
shell.run(cmd, cwd=root_dir)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['docker', 'run', '--name', 'artman-docker', '--rm', '-i', '-e', 'HOST_USER_ID=1000', '-e', 'HOST_GROUP_ID=1000', '-e', 'RUNNING_IN_ARTMAN_DOCKER=True', '-v', '/home/kbuilder/.cache/synthtool/discovery-artifact-manager:/home/kbuilder/.cache/synthtool/discovery-artifact-manager', '-v', '/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles:/home/kbuilder/.cache/synthtool/discovery-artifact-manager/artman-genfiles', '-w', PosixPath('/home/kbuilder/.cache/synthtool/discovery-artifact-manager'), 'googleapis/artman:0.22.0', '/bin/bash', '-c', 'artman --local --config gapic/google/compute/artman_compute.yaml generate java_discogapic']' returned non-zero exit status 32.
synthtool > Cleaned up 0 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/62e8be32-56bf-4013-b924-7b4e1a78effd).
| non_test | synthesis failed for compute hello autosynth couldn t regenerate compute broken heart here s the output from running synth py cloning into working repo switched to branch autosynth compute running synthtool synthtool executing tmpfs src git autosynth working repo google cloud clients google cloud compute synth py synthtool cloning discovery artifact manager synthtool running generator for gapic google compute artman compute yaml synthtool ensuring dependencies synthtool pulling artman image pulling from googleapis artman digest status image is up to date for googleapis artman synthtool failed executing docker run name artman docker rm i e host user id e host group id e running in artman docker true v home kbuilder cache synthtool discovery artifact manager home kbuilder cache synthtool discovery artifact manager v home kbuilder cache synthtool discovery artifact manager artman genfiles home kbuilder cache synthtool discovery artifact manager artman genfiles w home kbuilder cache synthtool discovery artifact manager googleapis artman bin bash c artman local config gapic google compute artman compute yaml generate java discogapic artman artman config loader py yamlloadwarning calling yaml load without loader is deprecated as the default loader is unsafe please read for full details json string json dumps yaml load f artman artman config loader py yamlloadwarning calling yaml load without loader is deprecated as the default loader is unsafe please read for full details artman config json string json dumps yaml load f artman final args artman api name compute artman api version artman artifact type discogapic artman aspect all artman discovery doc discoveries compute json artman gapic code dir home kbuilder cache synthtool discovery artifact manager artman genfiles java gapic google cloud compute artman gapic yaml home kbuilder cache synthtool discovery artifact manager gapic google compute compute gapic yaml artman generator args null artman import proto 
path artman home kbuilder cache synthtool discovery artifact manager artman language java artman organization name google cloud artman output dir home kbuilder cache synthtool discovery artifact manager artman genfiles artman proto deps artman root dir home kbuilder cache synthtool discovery artifact manager artman service yaml artman src proto path artman toolkit path toolkit artman artman creating discogapicclientpipeline artman output exception in thread main java lang nullpointerexception at com google api codegen config discogapicinterfacecontext createwithinterface discogapicinterfacecontext java at com google api codegen discogapic transformer java javadiscogapicsurfacetransformer newinterfacecontext javadiscogapicsurfacetransformer java at com google api codegen discogapic transformer java javadiscogapicrequesttoviewtransformer transform javadiscogapicrequesttoviewtransformer java at com google api codegen discogapic transformer java javadiscogapicrequesttoviewtransformer transform javadiscogapicrequesttoviewtransformer java at com google api codegen discogapic discogapicgenerator generate discogapicgenerator java at com google api codegen discogapic discogapicgeneratorapp run discogapicgeneratorapp java at com google api codegen generatormain discogapicmain generatormain java at com google api codegen generatormain main generatormain java artman traceback most recent call last file artman artman cli main py line in main engine run file usr local lib dist packages taskflow engines action engine engine py line in run for state in self run iter file usr local lib dist packages taskflow engines action engine engine py line in run iter failure failure reraise if any it file usr local lib dist packages taskflow types failure py line in reraise if any failures reraise file usr local lib dist packages taskflow types failure py line in reraise six reraise self exc info file usr local lib dist packages six py line in reraise raise value file usr local lib dist 
packages taskflow engines action engine executor py line in execute task result task execute arguments file artman artman tasks gapic tasks py line in execute task utils gapic gen task toolkit path args file artman artman tasks task base py line in exec command raise e file artman artman tasks task base py line in exec command output subprocess check output args stderr subprocess stdout file usr lib subprocess py line in check output kwargs stdout file usr lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo google cloud clients google cloud compute synth py line in artman output name file tmpfs src git autosynth env lib site packages synthtool gcp discogapic generator py line in java library return self generate code service version java kwargs file tmpfs src git autosynth env lib site packages synthtool gcp discogapic generator py line in generate code gapic language arg file tmpfs src git autosynth env lib site packages synthtool gcp artman py line 
in run shell run cmd cwd root dir file tmpfs src git autosynth env lib site packages synthtool shell py line in run raise exc file tmpfs src git autosynth env lib site packages synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status synthtool cleaned up temporary directories synthtool wrote metadata to synth metadata synthesis failed google internal developers can see the full log | 0 |
270,944 | 23,548,835,665 | IssuesEvent | 2022-08-21 14:24:56 | nodejs/node | https://api.github.com/repos/nodejs/node | closed | python tools/test.py -J --mode=release parallel/test-* | test tools | `python tools/test.py -J --mode=release parallel/test-*` doesn't seem to work anymore?
```
node$ python tools/test.py -J --mode=release parallel/test-*
File "tools/test.py", line 230
print(f" failed {len([i for i in outputs if i.UnexpectedOutput()]) + 1} out of {self.measure_flakiness + 1}")
``` | 1.0 | python tools/test.py -J --mode=release parallel/test-* - `python tools/test.py -J --mode=release parallel/test-*` doesn't seem to work anymore?
```
node$ python tools/test.py -J --mode=release parallel/test-*
File "tools/test.py", line 230
print(f" failed {len([i for i in outputs if i.UnexpectedOutput()]) + 1} out of {self.measure_flakiness + 1}")
``` | test | python tools test py j mode release parallel test python tools test py j mode release parallel test doesn t seem to work anymore node python tools test py j mode release parallel test file tools test py line print f failed len out of self measure flakiness | 1 |
254,548 | 21,792,654,712 | IssuesEvent | 2022-05-15 06:13:25 | eklem/otp-encryption-decryption-lib | https://api.github.com/repos/eklem/otp-encryption-decryption-lib | closed | setup exports + build/bundle + tests | enhancement tests | * [x] export functions
* [x] Build CJS, ESM and UMD
* [x] Test each function | 1.0 | setup exports + build/bundle + tests - * [x] export functions
* [x] Build CJS, ESM and UMD
* [x] Test each function | test | setup exports build bundle tests export functions build cjs esm and umd test each function | 1 |
148,742 | 23,371,926,172 | IssuesEvent | 2022-08-10 20:44:45 | gravitational/teleport | https://api.github.com/repos/gravitational/teleport | closed | Filter and sort servers and applications by labels | feature-request design c-ar | ### What
Come up with a more scalable UI that supports filtering and sorting servers and apps by labels.
| 1.0 | Filter and sort servers and applications by labels - ### What
Come up with a more scalable UI that supports filtering and sorting servers and apps by labels.
| non_test | filter and sort servers and applications by labels what come up with a more scalable ui that supports filtering and sorting servers and apps by labels | 0 |
161,497 | 20,154,068,514 | IssuesEvent | 2022-02-09 14:59:46 | kapseliboi/crowdfunding-frontend | https://api.github.com/repos/kapseliboi/crowdfunding-frontend | opened | CVE-2021-3918 (High) detected in json-schema-0.2.3.tgz | security vulnerability | ## CVE-2021-3918 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-schema-0.2.3.tgz</b></p></summary>
<p>JSON Schema validation and specifications</p>
<p>Library home page: <a href="https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz">https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz</a></p>
<p>
Dependency Hierarchy:
- nodemon-1.11.0.tgz (Root Library)
- chokidar-1.7.0.tgz
- fsevents-1.1.2.tgz
- node-pre-gyp-0.6.36.tgz
- request-2.81.0.tgz
- http-signature-1.1.1.tgz
- jsprim-1.4.0.tgz
- :x: **json-schema-0.2.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/crowdfunding-frontend/commit/3c5fb7d3f2932ae2023353faafd51bf99b953a5f">3c5fb7d3f2932ae2023353faafd51bf99b953a5f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
json-schema is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution')
<p>Publish Date: 2021-11-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3918>CVE-2021-3918</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-3918">https://nvd.nist.gov/vuln/detail/CVE-2021-3918</a></p>
<p>Release Date: 2021-11-13</p>
<p>Fix Resolution (json-schema): 0.4.0</p>
<p>Direct dependency fix Resolution (nodemon): 1.11.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-3918 (High) detected in json-schema-0.2.3.tgz - ## CVE-2021-3918 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-schema-0.2.3.tgz</b></p></summary>
<p>JSON Schema validation and specifications</p>
<p>Library home page: <a href="https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz">https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz</a></p>
<p>
Dependency Hierarchy:
- nodemon-1.11.0.tgz (Root Library)
- chokidar-1.7.0.tgz
- fsevents-1.1.2.tgz
- node-pre-gyp-0.6.36.tgz
- request-2.81.0.tgz
- http-signature-1.1.1.tgz
- jsprim-1.4.0.tgz
- :x: **json-schema-0.2.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/crowdfunding-frontend/commit/3c5fb7d3f2932ae2023353faafd51bf99b953a5f">3c5fb7d3f2932ae2023353faafd51bf99b953a5f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
json-schema is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution')
<p>Publish Date: 2021-11-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3918>CVE-2021-3918</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-3918">https://nvd.nist.gov/vuln/detail/CVE-2021-3918</a></p>
<p>Release Date: 2021-11-13</p>
<p>Fix Resolution (json-schema): 0.4.0</p>
<p>Direct dependency fix Resolution (nodemon): 1.11.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve high detected in json schema tgz cve high severity vulnerability vulnerable library json schema tgz json schema validation and specifications library home page a href dependency hierarchy nodemon tgz root library chokidar tgz fsevents tgz node pre gyp tgz request tgz http signature tgz jsprim tgz x json schema tgz vulnerable library found in head commit a href found in base branch master vulnerability details json schema is vulnerable to improperly controlled modification of object prototype attributes prototype pollution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution json schema direct dependency fix resolution nodemon step up your open source security game with whitesource | 0 |
705,100 | 24,221,624,909 | IssuesEvent | 2022-09-26 11:22:36 | ZupIT/beagle-web-angular | https://api.github.com/repos/ZupIT/beagle-web-angular | closed | beagle view-engine (script de geração de código) não funciona em conjunto com "aliases" | enhancement wontfix low priority | O script para o build AOT não consegue resolver imports que utilizam aliases. É importante suportarmos aliases, pois essa é uma facilidade utilizada em diversos projetos. | 1.0 | beagle view-engine (script de geração de código) não funciona em conjunto com "aliases" - O script para o build AOT não consegue resolver imports que utilizam aliases. É importante suportarmos aliases, pois essa é uma facilidade utilizada em diversos projetos. | non_test | beagle view engine script de geração de código não funciona em conjunto com aliases o script para o build aot não consegue resolver imports que utilizam aliases é importante suportarmos aliases pois essa é uma facilidade utilizada em diversos projetos | 0 |
261,806 | 27,823,886,945 | IssuesEvent | 2023-03-19 14:47:50 | UrielProd/WebGoat | https://api.github.com/repos/UrielProd/WebGoat | opened | spring-boot-starter-validation-2.1.12.RELEASE.jar: 13 vulnerabilities (highest severity is: 9.8) | Mend: dependency security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-validation-2.1.12.RELEASE.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (spring-boot-starter-validation version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-1471](https://www.mend.io/vulnerability-database/CVE-2022-1471) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | snakeyaml-1.23.jar | Transitive | N/A* | ❌ |
| [CVE-2022-27772](https://www.mend.io/vulnerability-database/CVE-2022-27772) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.8 | spring-boot-2.1.12.RELEASE.jar | Transitive | 2.2.11.RELEASE | ✅ |
| [CVE-2017-18640](https://www.mend.io/vulnerability-database/CVE-2017-18640) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | snakeyaml-1.23.jar | Transitive | 2.3.0.RELEASE | ✅ |
| [CVE-2022-25857](https://www.mend.io/vulnerability-database/CVE-2022-25857) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2021-42550](https://www.mend.io/vulnerability-database/CVE-2021-42550) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.6 | detected in multiple dependencies | Transitive | 2.5.8 | ✅ |
| [CVE-2022-41854](https://www.mend.io/vulnerability-database/CVE-2022-41854) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38752](https://www.mend.io/vulnerability-database/CVE-2022-38752) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38751](https://www.mend.io/vulnerability-database/CVE-2022-38751) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38749](https://www.mend.io/vulnerability-database/CVE-2022-38749) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38750](https://www.mend.io/vulnerability-database/CVE-2022-38750) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2020-10693](https://www.mend.io/vulnerability-database/CVE-2020-10693) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | hibernate-validator-6.0.18.Final.jar | Transitive | 2.1.15.RELEASE | ✅ |
| [CVE-2022-22970](https://www.mend.io/vulnerability-database/CVE-2022-22970) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | spring-core-5.1.13.RELEASE.jar | Transitive | 2.4.0 | ✅ |
| [CVE-2021-22096](https://www.mend.io/vulnerability-database/CVE-2021-22096) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.3 | spring-core-5.1.13.RELEASE.jar | Transitive | 2.4.0 | ✅ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-1471</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
SnakeYaml's Constructor() class does not restrict types which can be instantiated during deserialization. Deserializing yaml content provided by an attacker can lead to remote code execution. We recommend using SnakeYaml's SafeConsturctor when parsing untrusted content to restrict deserialization.
<p>Publish Date: 2022-12-01
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-1471>CVE-2022-1471</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/561/cve-2022-1471-vulnerability-in#comment-64634374">https://bitbucket.org/snakeyaml/snakeyaml/issues/561/cve-2022-1471-vulnerability-in#comment-64634374</a></p>
<p>Release Date: 2022-12-01</p>
<p>Fix Resolution: org.yaml:snakeyaml:2.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-27772</summary>
### Vulnerable Library - <b>spring-boot-2.1.12.RELEASE.jar</b></p>
<p>Spring Boot</p>
<p>Library home page: <a href="https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot">https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/boot/spring-boot/2.1.12.RELEASE/spring-boot-2.1.12.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **spring-boot-2.1.12.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
** UNSUPPORTED WHEN ASSIGNED ** spring-boot versions prior to version v2.2.11.RELEASE was vulnerable to temporary directory hijacking. This vulnerability impacted the org.springframework.boot.web.server.AbstractConfigurableWebServerFactory.createTempDir method. NOTE: This vulnerability only affects products and/or versions that are no longer supported by the maintainer.
<p>Publish Date: 2022-03-30
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-27772>CVE-2022-27772</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85">https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85</a></p>
<p>Release Date: 2022-03-30</p>
<p>Fix Resolution (org.springframework.boot:spring-boot): 2.2.11.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.2.11.RELEASE</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-18640</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The Alias feature in SnakeYAML before 1.26 allows entity expansion during a load operation, a related issue to CVE-2003-1564.
<p>Publish Date: 2019-12-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-18640>CVE-2017-18640</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-18640">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-18640</a></p>
<p>Release Date: 2019-12-12</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.26</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.3.0.RELEASE</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-25857</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package org.yaml:snakeyaml from 0 and before 1.31 are vulnerable to Denial of Service (DoS) due missing to nested depth limitation for collections.
<p>Publish Date: 2022-08-30
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25857>CVE-2022-25857</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857</a></p>
<p>Release Date: 2022-08-30</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-42550</summary>
### Vulnerable Libraries - <b>logback-classic-1.2.3.jar</b>, <b>logback-core-1.2.3.jar</b></p>
<p>
### <b>logback-classic-1.2.3.jar</b></p>
<p>logback-classic module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- spring-boot-starter-logging-2.1.12.RELEASE.jar
- :x: **logback-classic-1.2.3.jar** (Vulnerable Library)
### <b>logback-core-1.2.3.jar</b></p>
<p>logback-core module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- spring-boot-starter-logging-2.1.12.RELEASE.jar
- logback-classic-1.2.3.jar
- :x: **logback-core-1.2.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In logback version 1.2.7 and prior versions, an attacker with the privileges required to edit configuration files could craft a malicious configuration allowing arbitrary code loaded from LDAP servers to be executed.
<p>Publish Date: 2021-12-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-42550>CVE-2021-42550</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550</a></p>
<p>Release Date: 2021-12-16</p>
<p>Fix Resolution (ch.qos.logback:logback-classic): 1.2.8</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.5.8</p>
<p>Fix Resolution (ch.qos.logback:logback-core): 1.2.8</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.5.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-41854</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Those using SnakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow. This effect may support a denial of service attack.
<p>Publish Date: 2022-11-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41854>CVE-2022-41854</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/531/">https://bitbucket.org/snakeyaml/snakeyaml/issues/531/</a></p>
<p>Release Date: 2022-11-11</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.32</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38752</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using SnakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38752>CVE-2022-38752</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-9w3m-gqgf-c4p9">https://github.com/advisories/GHSA-9w3m-gqgf-c4p9</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.32</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38751</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using SnakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38751>CVE-2022-38751</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47039">https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47039</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38749</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using SnakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38749>CVE-2022-38749</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027">https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38750</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using SnakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38750>CVE-2022-38750</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47027">https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47027</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-10693</summary>
### Vulnerable Library - <b>hibernate-validator-6.0.18.Final.jar</b></p>
<p>Hibernate's Bean Validation (JSR-380) reference implementation.</p>
<p>Library home page: <a href="http://hibernate.org/validator">http://hibernate.org/validator</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/validator/hibernate-validator/6.0.18.Final/hibernate-validator-6.0.18.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- :x: **hibernate-validator-6.0.18.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in Hibernate Validator version 6.1.2.Final. A bug in the message interpolation processor enables invalid EL expressions to be evaluated as if they were valid. This flaw allows attackers to bypass input sanitation (escaping, stripping) controls that developers may have put in place when handling user-controlled data in error messages.
<p>Publish Date: 2020-05-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-10693>CVE-2020-10693</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://in.relation.to/2020/05/07/hibernate-validator-615-6020-released/">https://in.relation.to/2020/05/07/hibernate-validator-615-6020-released/</a></p>
<p>Release Date: 2020-05-06</p>
<p>Fix Resolution (org.hibernate.validator:hibernate-validator): 6.0.20.Final</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.1.15.RELEASE</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-22970</summary>
### Vulnerable Library - <b>spring-core-5.1.13.RELEASE.jar</b></p>
<p>Spring Core</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-core/5.1.13.RELEASE/spring-core-5.1.13.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **spring-core-5.1.13.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Spring Framework versions prior to 5.3.20 and 5.2.22, and in older unsupported versions, applications that handle file uploads are vulnerable to a DoS attack if they rely on data binding to set a MultipartFile or javax.servlet.Part to a field in a model object.
<p>Publish Date: 2022-05-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-22970>CVE-2022-22970</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22970">https://tanzu.vmware.com/security/cve-2022-22970</a></p>
<p>Release Date: 2022-05-12</p>
<p>Fix Resolution (org.springframework:spring-core): 5.2.22.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.4.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-22096</summary>
### Vulnerable Library - <b>spring-core-5.1.13.RELEASE.jar</b></p>
<p>Spring Core</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-core/5.1.13.RELEASE/spring-core-5.1.13.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **spring-core-5.1.13.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Spring Framework versions 5.3.0 - 5.3.10, 5.2.0 - 5.2.17, and older unsupported versions, it is possible for a user to provide malicious input to cause the insertion of additional log entries.
<p>Publish Date: 2021-10-28
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-22096>CVE-2021-22096</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2021-22096">https://tanzu.vmware.com/security/cve-2021-22096</a></p>
<p>Release Date: 2021-10-28</p>
<p>Fix Resolution (org.springframework:spring-core): 5.2.18.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.4.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | True | spring-boot-starter-validation-2.1.12.RELEASE.jar: 13 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-validation-2.1.12.RELEASE.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (spring-boot-starter-validation version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-1471](https://www.mend.io/vulnerability-database/CVE-2022-1471) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | snakeyaml-1.23.jar | Transitive | N/A* | ❌ |
| [CVE-2022-27772](https://www.mend.io/vulnerability-database/CVE-2022-27772) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.8 | spring-boot-2.1.12.RELEASE.jar | Transitive | 2.2.11.RELEASE | ✅ |
| [CVE-2017-18640](https://www.mend.io/vulnerability-database/CVE-2017-18640) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | snakeyaml-1.23.jar | Transitive | 2.3.0.RELEASE | ✅ |
| [CVE-2022-25857](https://www.mend.io/vulnerability-database/CVE-2022-25857) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2021-42550](https://www.mend.io/vulnerability-database/CVE-2021-42550) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.6 | detected in multiple dependencies | Transitive | 2.5.8 | ✅ |
| [CVE-2022-41854](https://www.mend.io/vulnerability-database/CVE-2022-41854) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38752](https://www.mend.io/vulnerability-database/CVE-2022-38752) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38751](https://www.mend.io/vulnerability-database/CVE-2022-38751) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38749](https://www.mend.io/vulnerability-database/CVE-2022-38749) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2022-38750](https://www.mend.io/vulnerability-database/CVE-2022-38750) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.5 | snakeyaml-1.23.jar | Transitive | 3.0.0 | ✅ |
| [CVE-2020-10693](https://www.mend.io/vulnerability-database/CVE-2020-10693) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | hibernate-validator-6.0.18.Final.jar | Transitive | 2.1.15.RELEASE | ✅ |
| [CVE-2022-22970](https://www.mend.io/vulnerability-database/CVE-2022-22970) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | spring-core-5.1.13.RELEASE.jar | Transitive | 2.4.0 | ✅ |
| [CVE-2021-22096](https://www.mend.io/vulnerability-database/CVE-2021-22096) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.3 | spring-core-5.1.13.RELEASE.jar | Transitive | 2.4.0 | ✅ |
<p>*For some transitive vulnerabilities, there is no version of the direct dependency with a fix. Check the "Details" section below to see if there is a version of the transitive dependency in which the vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-1471</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
SnakeYAML's Constructor() class does not restrict which types can be instantiated during deserialization. Deserializing YAML content provided by an attacker can lead to remote code execution. We recommend using SnakeYAML's SafeConstructor when parsing untrusted content to restrict deserialization.
<p>Publish Date: 2022-12-01
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-1471>CVE-2022-1471</a></p>
</p>
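The SafeConstructor recommendation above can be illustrated with a minimal sketch (written against the 1.x API matching the snakeyaml-1.23.jar flagged here; in SnakeYAML 2.x, `SafeConstructor` instead takes a `LoaderOptions` argument):

```java
import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.constructor.SafeConstructor;

public class SafeYamlLoading {
    public static void main(String[] args) {
        // Unsafe pattern: new Yaml() resolves global tags, so
        // attacker-controlled YAML can name arbitrary classes on the
        // classpath and have them instantiated during load().
        //
        // Safer pattern: SafeConstructor restricts deserialization to
        // standard Java types (maps, lists, strings, numbers, booleans).
        Yaml yaml = new Yaml(new SafeConstructor());
        Object data = yaml.load("key: value");
        System.out.println(data);
    }
}
```

Note that SafeConstructor mitigates the deserialization issue but does not change the parser's resource-exhaustion behavior covered by the DoS CVEs above, so upgrading remains the full fix.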
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/561/cve-2022-1471-vulnerability-in#comment-64634374">https://bitbucket.org/snakeyaml/snakeyaml/issues/561/cve-2022-1471-vulnerability-in#comment-64634374</a></p>
<p>Release Date: 2022-12-01</p>
<p>Fix Resolution: org.yaml:snakeyaml:2.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-27772</summary>
### Vulnerable Library - <b>spring-boot-2.1.12.RELEASE.jar</b></p>
<p>Spring Boot</p>
<p>Library home page: <a href="https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot">https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/boot/spring-boot/2.1.12.RELEASE/spring-boot-2.1.12.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **spring-boot-2.1.12.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
** UNSUPPORTED WHEN ASSIGNED ** spring-boot versions prior to v2.2.11.RELEASE were vulnerable to temporary directory hijacking. This vulnerability impacted the org.springframework.boot.web.server.AbstractConfigurableWebServerFactory.createTempDir method. NOTE: This vulnerability only affects products and/or versions that are no longer supported by the maintainer.
<p>Publish Date: 2022-03-30
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-27772>CVE-2022-27772</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85">https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85</a></p>
<p>Release Date: 2022-03-30</p>
<p>Fix Resolution (org.springframework.boot:spring-boot): 2.2.11.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.2.11.RELEASE</p>
</p>
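For projects that inherit managed versions from `spring-boot-starter-parent`, the direct-dependency fix above amounts to bumping the parent version in `pom.xml`. A sketch, assuming the parent-POM setup (projects importing the `spring-boot-dependencies` BOM would bump that import instead):

```xml
<!-- Raising the parent version updates spring-boot and all
     starter-managed transitive versions in one place. -->
<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <version>2.2.11.RELEASE</version>
</parent>
```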
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-18640</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The Alias feature in SnakeYAML before 1.26 allows entity expansion during a load operation, a related issue to CVE-2003-1564.
<p>Publish Date: 2019-12-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-18640>CVE-2017-18640</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-18640">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-18640</a></p>
<p>Release Date: 2019-12-12</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.26</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.3.0.RELEASE</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
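
When upgrading the Spring Boot starter itself is not immediately feasible, the transitive snakeyaml version can be pinned explicitly. A minimal sketch, assuming the project manages its own dependency versions (version taken from the fix resolution above):

```xml
<!-- Pin the transitive snakeyaml version without changing the Spring Boot starter -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.yaml</groupId>
      <artifactId>snakeyaml</artifactId>
      <version>1.26</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With `spring-boot-starter-parent`, overriding the `snakeyaml.version` property is the more idiomatic route, since the parent POM already manages that artifact.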
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-25857</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package org.yaml:snakeyaml from 0 and before 1.31 is vulnerable to Denial of Service (DoS) due to a missing nested depth limitation for collections.
<p>Publish Date: 2022-08-30
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25857>CVE-2022-25857</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857</a></p>
<p>Release Date: 2022-08-30</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-42550</summary>
### Vulnerable Libraries - <b>logback-classic-1.2.3.jar</b>, <b>logback-core-1.2.3.jar</b></p>
<p>
### <b>logback-classic-1.2.3.jar</b></p>
<p>logback-classic module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- spring-boot-starter-logging-2.1.12.RELEASE.jar
- :x: **logback-classic-1.2.3.jar** (Vulnerable Library)
### <b>logback-core-1.2.3.jar</b></p>
<p>logback-core module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- spring-boot-starter-logging-2.1.12.RELEASE.jar
- logback-classic-1.2.3.jar
- :x: **logback-core-1.2.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configuration files could craft a malicious configuration allowing arbitrary code loaded from LDAP servers to be executed.
<p>Publish Date: 2021-12-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-42550>CVE-2021-42550</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550</a></p>
<p>Release Date: 2021-12-16</p>
<p>Fix Resolution (ch.qos.logback:logback-classic): 1.2.8</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.5.8</p><p>Fix Resolution (ch.qos.logback:logback-core): 1.2.8</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.5.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
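
Both logback artifacts above move to 1.2.8 together. When the project inherits from `spring-boot-starter-parent`, a property override is the usual mechanism. A minimal sketch, assuming the Spring Boot parent POM's `logback.version` property is in effect (property name assumed from Spring Boot's dependency management, not from this report):

```xml
<properties>
  <!-- Override Spring Boot's managed logback version; keeps classic and core aligned -->
  <logback.version>1.2.8</logback.version>
</properties>
```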
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-41854</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Those using Snakeyaml to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stack overflow. This effect may support a denial of service attack.
<p>Publish Date: 2022-11-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41854>CVE-2022-41854</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/531/">https://bitbucket.org/snakeyaml/snakeyaml/issues/531/</a></p>
<p>Release Date: 2022-11-11</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.32</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38752</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stack-overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38752>CVE-2022-38752</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-9w3m-gqgf-c4p9">https://github.com/advisories/GHSA-9w3m-gqgf-c4p9</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.32</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38751</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stackoverflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38751>CVE-2022-38751</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47039">https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47039</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38749</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stackoverflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38749>CVE-2022-38749</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027">https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-38750</summary>
### Vulnerable Library - <b>snakeyaml-1.23.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.23/snakeyaml-1.23.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **snakeyaml-1.23.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stackoverflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38750>CVE-2022-38750</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47027">https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47027</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.31</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-10693</summary>
### Vulnerable Library - <b>hibernate-validator-6.0.18.Final.jar</b></p>
<p>Hibernate's Bean Validation (JSR-380) reference implementation.</p>
<p>Library home page: <a href="http://hibernate.org/validator">http://hibernate.org/validator</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/validator/hibernate-validator/6.0.18.Final/hibernate-validator-6.0.18.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- :x: **hibernate-validator-6.0.18.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in Hibernate Validator version 6.1.2.Final. A bug in the message interpolation processor enables invalid EL expressions to be evaluated as if they were valid. This flaw allows attackers to bypass input sanitation (escaping, stripping) controls that developers may have put in place when handling user-controlled data in error messages.
<p>Publish Date: 2020-05-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-10693>CVE-2020-10693</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://in.relation.to/2020/05/07/hibernate-validator-615-6020-released/">https://in.relation.to/2020/05/07/hibernate-validator-615-6020-released/</a></p>
<p>Release Date: 2020-05-06</p>
<p>Fix Resolution (org.hibernate.validator:hibernate-validator): 6.0.20.Final</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.1.15.RELEASE</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-22970</summary>
### Vulnerable Library - <b>spring-core-5.1.13.RELEASE.jar</b></p>
<p>Spring Core</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-core/5.1.13.RELEASE/spring-core-5.1.13.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **spring-core-5.1.13.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Spring Framework versions prior to 5.3.20+, 5.2.22+, and older unsupported versions, applications that handle file uploads are vulnerable to a DoS attack if they rely on data binding to set a MultipartFile or javax.servlet.Part to a field in a model object.
<p>Publish Date: 2022-05-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-22970>CVE-2022-22970</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22970">https://tanzu.vmware.com/security/cve-2022-22970</a></p>
<p>Release Date: 2022-05-12</p>
<p>Fix Resolution (org.springframework:spring-core): 5.2.22.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.4.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-22096</summary>
### Vulnerable Library - <b>spring-core-5.1.13.RELEASE.jar</b></p>
<p>Spring Core</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-core/5.1.13.RELEASE/spring-core-5.1.13.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.1.12.RELEASE.jar (Root Library)
- spring-boot-starter-2.1.12.RELEASE.jar
- :x: **spring-core-5.1.13.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UrielProd/WebGoat/commit/3785d355b5ceb4898fb011d8d0db28ef0e335159">3785d355b5ceb4898fb011d8d0db28ef0e335159</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Spring Framework versions 5.3.0 - 5.3.10, 5.2.0 - 5.2.17, and older unsupported versions, it is possible for a user to provide malicious input to cause the insertion of additional log entries.
<p>Publish Date: 2021-10-28
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-22096>CVE-2021-22096</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2021-22096">https://tanzu.vmware.com/security/cve-2021-22096</a></p>
<p>Release Date: 2021-10-28</p>
<p>Fix Resolution (org.springframework:spring-core): 5.2.18.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.4.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
jar dependency hierarchy spring boot starter validation release jar root library spring boot starter release jar x snakeyaml jar vulnerable library found in head commit a href found in base branch main vulnerability details using snakeyaml to parse untrusted yaml files may be vulnerable to denial of service attacks dos if the parser is running on user supplied input an attacker may supply content that causes the parser to crash by stackoverflow publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org yaml snakeyaml direct dependency fix resolution org springframework boot spring boot starter validation rescue worker helmet automatic remediation is available for this issue cve vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar dependency hierarchy spring boot starter validation release jar root library spring boot starter release jar x snakeyaml jar vulnerable library found in head commit a href found in base branch main vulnerability details using snakeyaml to parse untrusted yaml files may be vulnerable to denial of service attacks dos if the parser is running on user supplied input an attacker may supply content that causes the parser to crash by stackoverflow publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on 
scores click a href suggested fix type upgrade version origin a href release date fix resolution org yaml snakeyaml direct dependency fix resolution org springframework boot spring boot starter validation rescue worker helmet automatic remediation is available for this issue cve vulnerable library hibernate validator final jar hibernate s bean validation jsr reference implementation library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org hibernate validator hibernate validator final hibernate validator final jar dependency hierarchy spring boot starter validation release jar root library x hibernate validator final jar vulnerable library found in head commit a href found in base branch main vulnerability details a flaw was found in hibernate validator version final a bug in the message interpolation processor enables invalid el expressions to be evaluated as if they were valid this flaw allows attackers to bypass input sanitation escaping stripping controls that developers may have put in place when handling user controlled data in error messages publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org hibernate validator hibernate validator final direct dependency fix resolution org springframework boot spring boot starter validation release rescue worker helmet automatic remediation is available for this issue cve vulnerable library spring core release jar spring core library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org springframework spring core release spring core release jar dependency hierarchy 
spring boot starter validation release jar root library spring boot starter release jar x spring core release jar vulnerable library found in head commit a href found in base branch main vulnerability details in spring framework versions prior to and old unsupported versions applications that handle file uploads are vulnerable to dos attack if they rely on data binding to set a multipartfile or javax servlet part to a field in a model object publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring core release direct dependency fix resolution org springframework boot spring boot starter validation rescue worker helmet automatic remediation is available for this issue cve vulnerable library spring core release jar spring core library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org springframework spring core release spring core release jar dependency hierarchy spring boot starter validation release jar root library spring boot starter release jar x spring core release jar vulnerable library found in head commit a href found in base branch main vulnerability details in spring framework versions and older unsupported versions it is possible for a user to provide malicious input to cause the insertion of additional log entries publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href 
suggested fix type upgrade version origin a href release date fix resolution org springframework spring core release direct dependency fix resolution org springframework boot spring boot starter validation rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue | 0 |
278,124 | 24,125,255,376 | IssuesEvent | 2022-09-20 23:16:20 | jdi-testing/jdi-light | https://api.github.com/repos/jdi-testing/jdi-light | opened | Update test-site: skeleton loader | TestSite Vuetify | Add these features to test site:
- [ ] theme - dark
- [ ] animation - on
- [ ] loading cursor - on
- [ ] min/max/fixed width/height
- [ ] elevation
- [ ] tile - on | 1.0 | Update test-site: skeleton loader - Add these features to test site:
- [ ] theme - dark
- [ ] animation - on
- [ ] loading cursor - on
- [ ] min/max/fixed width/height
- [ ] elevation
- [ ] tile - on | test | update test site skeleton loader add these features to test site theme dark animation on loading curson on min max fixed width height elevation tile on | 1 |
94,476 | 11,874,503,015 | IssuesEvent | 2020-03-26 19:07:10 | OpenLiberty/open-liberty | https://api.github.com/repos/OpenLiberty/open-liberty | closed | Support OpenShift service account credentials for authentication | Design Approved Epic FAT complete ID Required In Progress ReleaseCheckListAdded beta:20300 focalApproved:accessibility focalApproved:demo focalApproved:fat focalApproved:globalization focalApproved:id focalApproved:performance focalApproved:serviceability focalApproved:ste focalApproved:svt in:Security target:ga team:Security SSO | Open Liberty should be able to accept OpenShift service account credentials for authentication and role based access when configured to do so. This would provide the ability for platform-supporting apps in OpenShift (such as Prometheus for monitoring) to authenticate with monitored Liberty servers with minimal required configuration. When this is enabled, Liberty will map OpenShift project names to group names for the role-based access check.
UFO: https://ibm.box.com/s/wxrkaw020wt6ws9mxmlnv3opxrxikw7y | 1.0 | Support OpenShift service account credentials for authentication - Open Liberty should be able to accept OpenShift service account credentials for authentication and role based access when configured to do so. This would provide the ability for platform-supporting apps in OpenShift (such as Prometheus for monitoring) to authenticate with monitored Liberty servers with minimal required configuration. When this is enabled, Liberty will map OpenShift project names to group names for the role-based access check.
UFO: https://ibm.box.com/s/wxrkaw020wt6ws9mxmlnv3opxrxikw7y | non_test | support openshift service account credentials for authentication open liberty should be able to accept openshift service account credentials for authentication and role based access when configured to do so this would provide the ability for platform supporting apps in openshift such as prometheus for monitoring to authenticate with monitored liberty servers with minimal required configuration when this is enabled liberty will map openshift project names to group names for the role based access check ufo | 0 |
38,567 | 12,558,906,190 | IssuesEvent | 2020-06-07 17:18:40 | MicrosoftDocs/CloudAppSecurityDocs | https://api.github.com/repos/MicrosoftDocs/CloudAppSecurityDocs | closed | Unnecessary hyperlinked text | /prod /tech Pri2 assigned-to-author cloud-app-security/svc doc-bug triaged |
A large swathe of the page starting at [Tag apps as sanctioned or unsanctioned](https://docs.microsoft.com/en-us/cloud-app-security/investigate#tag-apps-as-sanctioned-or-unsanctioned) has unnecessary hyperlinked text.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5bfee6e2-2879-2e43-0a8a-d3167762d6ee
* Version Independent ID: f5f6ae9a-b605-dc4d-7be6-335f47e101d5
* Content: [Investigate cloud app risks & suspicious activity - Cloud App Security](https://docs.microsoft.com/en-us/cloud-app-security/investigate)
* Content Source: [CloudAppSecurityDocs/investigate.md](https://github.com/Microsoft/CloudAppSecurityDocs/blob/master/CloudAppSecurityDocs/investigate.md)
* Service: **cloud-app-security**
* Product: ****
* Technology: ****
* GitHub Login: @shsagir
* Microsoft Alias: **shsagir** | True | Unnecessary hyperlinked text -
A large swathe of the page starting at [Tag apps as sanctioned or unsanctioned](https://docs.microsoft.com/en-us/cloud-app-security/investigate#tag-apps-as-sanctioned-or-unsanctioned) has unnecessary hyperlinked text.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5bfee6e2-2879-2e43-0a8a-d3167762d6ee
* Version Independent ID: f5f6ae9a-b605-dc4d-7be6-335f47e101d5
* Content: [Investigate cloud app risks & suspicious activity - Cloud App Security](https://docs.microsoft.com/en-us/cloud-app-security/investigate)
* Content Source: [CloudAppSecurityDocs/investigate.md](https://github.com/Microsoft/CloudAppSecurityDocs/blob/master/CloudAppSecurityDocs/investigate.md)
* Service: **cloud-app-security**
* Product: ****
* Technology: ****
* GitHub Login: @shsagir
* Microsoft Alias: **shsagir** | non_test | unnecessary hyperlinked text a large swathe of the page starting at has unnecessary hyperlinked text document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cloud app security product technology github login shsagir microsoft alias shsagir | 0 |
126,948 | 10,441,209,232 | IssuesEvent | 2019-09-18 10:18:27 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | teamcity: failed test: TestDockerCLI | C-test-failure O-robot | The following tests appear to have failed on master (acceptance): TestDockerCLI/test_multiple_nodes.tcl, TestDockerCLI
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestDockerCLI).
[#1495152](https://teamcity.cockroachdb.com/viewLog.html?buildId=1495152):
```
TestDockerCLI/test_multiple_nodes.tcl
--- FAIL: acceptance/TestDockerCLI: TestDockerCLI/test_multiple_nodes.tcl (38.020s)
TestDockerCLI
--- FAIL: acceptance/TestDockerCLI (354.660s)
------- Stdout: -------
test_log_scope.go:77: test logs captured to: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/acceptance/logTestDockerCLI925366613
test_log_scope.go:58: use -show-logs to present logs inline
```
Please assign, take a look and update the issue accordingly.
| 1.0 | teamcity: failed test: TestDockerCLI - The following tests appear to have failed on master (acceptance): TestDockerCLI/test_multiple_nodes.tcl, TestDockerCLI
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestDockerCLI).
[#1495152](https://teamcity.cockroachdb.com/viewLog.html?buildId=1495152):
```
TestDockerCLI/test_multiple_nodes.tcl
--- FAIL: acceptance/TestDockerCLI: TestDockerCLI/test_multiple_nodes.tcl (38.020s)
TestDockerCLI
--- FAIL: acceptance/TestDockerCLI (354.660s)
------- Stdout: -------
test_log_scope.go:77: test logs captured to: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/acceptance/logTestDockerCLI925366613
test_log_scope.go:58: use -show-logs to present logs inline
```
Please assign, take a look and update the issue accordingly.
| test | teamcity failed test testdockercli the following tests appear to have failed on master acceptance testdockercli test multiple nodes tcl testdockercli you may want to check testdockercli test multiple nodes tcl fail acceptance testdockercli testdockercli test multiple nodes tcl testdockercli fail acceptance testdockercli stdout test log scope go test logs captured to home agent work go src github com cockroachdb cockroach artifacts acceptance test log scope go use show logs to present logs inline please assign take a look and update the issue accordingly | 1 |
299,680 | 25,917,606,179 | IssuesEvent | 2022-12-15 18:45:08 | IQSS/dataverse | https://api.github.com/repos/IQSS/dataverse | closed | Expand the suite of automated tests of the Harvesting functionality | Feature: Harvesting NIH OTA: 1.4.1 Testing: API Size: 80 | A followup issue for #8372 (PR #8753).
The existing restassured tests are fairly primitive. We need tests covering all the OAI functionality we provide. | 1.0 | Expand the suite of automated tests of the Harvesting functionality - A followup issue for #8372 (PR #8753).
The existing restassured tests are fairly primitive. We need tests covering all the OAI functionality we provide. | test | expand the suite of automated tests of the harvesting functionality a followup issue for pr the existing restassured tests are fairly primitive we need tests covering all the oai functionality we provide | 1 |
617,094 | 19,342,566,048 | IssuesEvent | 2021-12-15 07:11:56 | ballerina-platform/ballerina-dev-website | https://api.github.com/repos/ballerina-platform/ballerina-dev-website | closed | `Try Ballerina` needs a list of samples. | Priority/Low Area/Docs Type/Improvement | `Try Ballerina` needs a list of samples. It's unlikely people will be able to type whole programs without any help.
A drop-down list of pre-written samples would be good. | 1.0 | `Try Ballerina` needs a list of samples. - `Try Ballerina` needs a list of samples. It's unlikely people will be able to type whole programs without any help.
A drop-down list of pre-written samples would be good. | non_test | try ballerina needs a list of samples try ballerina needs a list of samples it s unlikely people will be able to type hole programs without any help a drop down list of per written samples would be good | 0
60,145 | 6,672,605,006 | IssuesEvent | 2017-10-04 12:22:24 | bokeh/bokeh | https://api.github.com/repos/bokeh/bokeh | opened | Reduce size of Travis CI logs | tag: component: tests type: task | They are enormous in size due to crap like this:
```
remote: Compressing objects: 0% (1/2287) [K
```
or
```
numpy-1.13.1-p 27% || Time: 0:00:00 48.01 MB/s
```
which makes working with our test automation unbearable. E.g. the examples job generates a 1.3 MB log file. | 1.0 | Reduce size of Travis CI logs - They are enormous in size due to crap like this:
```
remote: Compressing objects: 0% (1/2287) [K
```
or
```
numpy-1.13.1-p 27% || Time: 0:00:00 48.01 MB/s
```
which makes working with our test automation unbearable. E.g. the examples job generates a 1.3 MB log file. | test | reduce size of travis ci logs they are enormous in size due to crap like this remote compressing objects k or numpy p time mb s which makes work with out test automation unbearable e g examples job generates mb log file | 1
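The bloat comes from progress bars that redraw their line with carriage returns, so the raw log keeps every intermediate update like the two snippets quoted above. Quieter tool flags (e.g. `pip install -q`, `git clone --quiet`) avoid the churn at the source; alternatively, as an illustrative sketch (the regex below is an assumption tailored to the two quoted lines, not Bokeh's actual fix), a post-hoc filter could drop such lines from a captured log:

```python
import re
import sys

# Progress-bar updates like "27% || Time: 0:00:00" or
# "remote: Compressing objects: 0% (1/2287)" repeat hundreds of
# times per download; dropping them shrinks the log dramatically.
PROGRESS = re.compile(r"\d{1,3}% \|\||Compressing objects:\s+\d{1,3}%")

def filter_log(lines):
    """Yield log lines with progress-bar spam removed.

    A carriage-return animated "line" may hold many updates
    separated by \r; only the final state is considered.
    """
    for raw in lines:
        line = raw.rstrip("\n").split("\r")[-1]
        if not PROGRESS.search(line):
            yield line

if __name__ == "__main__":
    sys.stdout.writelines(kept + "\n" for kept in filter_log(sys.stdin))
```

Run over a saved log (e.g. `python filter_log.py < raw.log > trimmed.log`) this keeps ordinary output and discards only lines matching the progress patterns.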
140,984 | 21,330,959,571 | IssuesEvent | 2022-04-18 08:17:26 | DeliTrbat/IPProjectB2 | https://api.github.com/repos/DeliTrbat/IPProjectB2 | closed | Settings page | UI Design | - resize (dimension) for font
- change username
- (radio) button for dark/light theme
- clear history (transfer log)
- button for disconnect device | 1.0 | Settings page - - resize (dimension) for font
- change username
- (radio) button for dark/light theme
- clear history (transfer log)
- button for disconnect device | non_test | settings page resize dimension for font change username radio button for dark light theme clear history transfer log button for disconnect device | 0 |
275,522 | 23,920,022,321 | IssuesEvent | 2022-09-09 15:56:24 | WordPress/gutenberg | https://api.github.com/repos/WordPress/gutenberg | closed | Wrong Viewport Width with Gutenberg plugin in the block pattern list | Needs Testing [Status] Stale [Feature] Patterns | ### Description
Hey,
I have noticed the viewport width is showing incorrectly if I am using the Gutenberg plugin.
Correct view with latest WordPress 5.8.2 without installed Gutenberg plugin:
<img width="326" alt="Bildschirmfoto 2021-11-22 um 13 02 55" src="https://user-images.githubusercontent.com/31563167/142858821-20c6da39-91e6-46de-8c56-4d55f73bdffb.png">
Wrong view with latest WordPress 5.8.2 and with installed Gutenberg plugin 11.9.1:
<img width="329" alt="Bildschirmfoto 2021-11-22 um 13 06 08" src="https://user-images.githubusercontent.com/31563167/142858993-a03c25a6-d1b3-40a3-a921-aa28f7dce2ca.png">
In my case "FSE" Full Site Editing is disabled.
My viewport width is 1300px.
```
register_block_pattern(
'example-name',
array(
'title' => __( 'example title', 'wphave'),
'categories' => array( 'example-cat' ),
'viewportWidth' => 1300,
...
```
How can I solve this, or is this an existing bug? If so, I think this should be solved before WordPress 5.9 is released. Thank you.
### Step-by-step reproduction instructions
1. Click on the block inserter
2. Click in Pattern
3. You will see the issue
### Screenshots, screen recording, code snippet
Correct view with latest WordPress 5.8.2 without installed Gutenberg plugin:
<img width="326" alt="Bildschirmfoto 2021-11-22 um 13 02 55" src="https://user-images.githubusercontent.com/31563167/142858821-20c6da39-91e6-46de-8c56-4d55f73bdffb.png">
Wrong view with latest WordPress 5.8.2 and with installed Gutenberg plugin 11.9.1:
<img width="329" alt="Bildschirmfoto 2021-11-22 um 13 06 08" src="https://user-images.githubusercontent.com/31563167/142858993-a03c25a6-d1b3-40a3-a921-aa28f7dce2ca.png">
### Environment info
WordPress 5.8.2
Gutenberg plugin 11.9.1
NO FSE Full Site Editing activated
### Please confirm that you have searched existing issues in the repo.
Yes
### Please confirm that you have tested with all plugins deactivated except Gutenberg.
Yes | 1.0 | Wrong Viewport Width with Gutenberg plugin in the block pattern list - ### Description
Hey,
I have noticed the viewport width is showing incorrectly if I am using the Gutenberg plugin.
Correct view with latest WordPress 5.8.2 without installed Gutenberg plugin:
<img width="326" alt="Bildschirmfoto 2021-11-22 um 13 02 55" src="https://user-images.githubusercontent.com/31563167/142858821-20c6da39-91e6-46de-8c56-4d55f73bdffb.png">
Wrong view with latest WordPress 5.8.2 and with installed Gutenberg plugin 11.9.1:
<img width="329" alt="Bildschirmfoto 2021-11-22 um 13 06 08" src="https://user-images.githubusercontent.com/31563167/142858993-a03c25a6-d1b3-40a3-a921-aa28f7dce2ca.png">
In my case "FSE" Full Site Editing is disabled.
My viewport width is 1300px.
```
register_block_pattern(
'example-name',
array(
'title' => __( 'example title', 'wphave'),
'categories' => array( 'example-cat' ),
'viewportWidth' => 1300,
...
```
How can I solve this, or is this an existing bug? If so, I think this should be solved before WordPress 5.9 is released. Thank you.
### Step-by-step reproduction instructions
1. Click on the block inserter
2. Click in Pattern
3. You will see the issue
### Screenshots, screen recording, code snippet
Correct view with latest WordPress 5.8.2 without installed Gutenberg plugin:
<img width="326" alt="Bildschirmfoto 2021-11-22 um 13 02 55" src="https://user-images.githubusercontent.com/31563167/142858821-20c6da39-91e6-46de-8c56-4d55f73bdffb.png">
Wrong view with latest WordPress 5.8.2 and with installed Gutenberg plugin 11.9.1:
<img width="329" alt="Bildschirmfoto 2021-11-22 um 13 06 08" src="https://user-images.githubusercontent.com/31563167/142858993-a03c25a6-d1b3-40a3-a921-aa28f7dce2ca.png">
### Environment info
WordPress 5.8.2
Gutenberg plugin 11.9.1
NO FSE Full Site Editing activated
### Please confirm that you have searched existing issues in the repo.
Yes
### Please confirm that you have tested with all plugins deactivated except Gutenberg.
Yes | test | wrong viewport width with gutenberg plugin in the block pattern list description hey i have noticed the viewport width is showing incorrect if i using the gutenberg plugin correct view with latest wordpress without installed gutenberg plugin img width alt bildschirmfoto um src wrong view with latest wordpress and with installed gutenberg plugin img width alt bildschirmfoto um src in my case fse full site editing is disabled my viewport width is register block pattern example name array title example title wphave categories array example cat viewportwidth how i can solve this or is this an existing bug if so i think this should be solved before wordpres will released thank you step by step reproduction instructions click on the block inserter click in pattern you will see the issue screenshots screen recording code snippet correct view with latest wordpress without installed gutenberg plugin img width alt bildschirmfoto um src wrong view with latest wordpress and with installed gutenberg plugin img width alt bildschirmfoto um src environment info wordpress gutenberg plugin no fse full site editing activated please confirm that you have searched existing issues in the repo yes please confirm that you have tested with all plugins deactivated except gutenberg yes | 1 |
198,973 | 15,017,420,026 | IssuesEvent | 2021-02-01 10:49:39 | fabric8-ui/fabric8-planner | https://api.github.com/repos/fabric8-ui/fabric8-planner | closed | Improve test coverage of AlmSearchHighlight pipe | enhancement test | New unit tests for AlmSearchHighlight should be created:
- should work when first parameter is an empty string
- should work when second parameter is an empty string
- should work when highlighting special characters such as !@#$%^&*()_+{}[]"|,./<>?`~
I'm not sure what should happen when the search pattern contains only white spaces. Currently it does the following, which doesn't feel right:
```
transform('Some string', ' ') => 'Some<b> </b>string'
```
| 1.0 | Improve test coverage of AlmSearchHighlight pipe - New unit tests for AlmSearchHighlight should be created:
- should work when first parameter is an empty string
- should work when second parameter is an empty string
- should work when highlighting special characters such as !@#$%^&*()_+{}[]"|,./<>?`~
I'm not sure what should happen when the search pattern contains only white spaces. Currently it does the following, which doesn't feel right:
```
transform('Some string', ' ') => 'Some<b> </b>string'
```
| test | improve test coverage of almsearchhighlight pipe new unit tests for almsearchhighlight should be created should work when first parameter is an empty string should work when second parameter is an empty string should work when highlighting special characters such as i m not sure what should happen when search pattern contains only white spaces now it does following which doesn t feel right transform some string some string | 1 |
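To make the behavior those tests should lock in concrete, here is a minimal language-agnostic sketch of the highlighting logic in Python (the actual pipe is Angular/TypeScript, so the function name and `<b>` tags are assumptions; it also reproduces the current whitespace behavior the report questions rather than deciding it):

```python
import re

def highlight(text, pattern):
    """Wrap case-insensitive matches of `pattern` in <b> tags.

    Empty text or an empty pattern passes through unchanged, and
    re.escape keeps special characters such as !@#$%^&*()_+{}[]"|,./<>?`~
    from being interpreted as regex syntax.
    """
    if not text or not pattern:
        return text
    return re.sub(re.escape(pattern),
                  lambda m: "<b>" + m.group(0) + "</b>",
                  text, flags=re.IGNORECASE)

highlight("Some string", " ")   # 'Some<b> </b>string' -- the current behavior
highlight("a+b=c", "+")         # 'a<b>+</b>b=c'
highlight("Some string", "")    # 'Some string' (unchanged)
```

Each proposed test case maps onto one call above; whether the whitespace result should stay as-is is exactly what the report leaves open.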
89,143 | 25,593,422,780 | IssuesEvent | 2022-12-01 14:35:20 | googlefonts/gftools | https://api.github.com/repos/googlefonts/gftools | closed | Builder: Pinning dependencies for reproducible builds | builder | At the moment, we do not pin the dependencies which are used for building fonts, e.g. fontmake. By not pinning, we cannot guarantee that font builds are reproducible.
Before the builder arrived, we used to build fonts by writing a shell script which glued all the tools together. The font repo included a requirements.txt file which contained all the dependencies pinned to a specific version. This was generated by using the command `pip freeze > requirements.txt`. Basically, each font repo had to pin its own dependencies.
If we decide to pin font build dependencies in gftools, we cannot simply pin the dependencies in the setup.py to a specific version because these dependencies may have child dependencies which are unpinned or use the “>=“ syntax. You must literally pin the whole dependency tree.
Another downside if we do pin is that we may not be able to use the libs in this repo in other projects. Imagine if a developer requires a specific version of glyphsLib, they may not be able to use `gftools.fix` as a dependency because gftools is pinned to an older version glyphsLib. However, Afaik, it looks like we only use gftools in font repos and not in other libraries so this may not be an issue.
I just chatted to Cosimo and he has the following idea:
The builder config.yml file must specify which version of gftools has been used. If the user’s version of gftools is incorrect, they will be asked to upgrade/downgrade it.
All the dependencies in gftools setup.py and requirements.txt which are used to build the fonts are pinned to a specific version. If we want to upgrade any of these dependencies, we must bump the dependency and also cut a new release of gftools.
Cosimo, feel free to point out any inaccuracies
@anthrotype @simoncozens @RosaWagner @vv-monsalve @alerque | 1.0 | Builder: Pinning dependencies for reproducible builds - At the moment, we do not pin the dependencies which are used for building fonts, e.g. fontmake. By not pinning, we cannot guarantee that font builds are reproducible.
Before the builder arrived, we used to build fonts by writing a shell script which glued all the tools together. The font repo included a requirements.txt file which contained all the dependencies pinned to a specific version. This was generated by using the command `pip freeze > requirements.txt`. Basically, each font repo had to pin its own dependencies.
If we decide to pin font build dependencies in gftools, we cannot simply pin the dependencies in the setup.py to a specific version because these dependencies may have child dependencies which are unpinned or use the “>=“ syntax. You must literally pin the whole dependency tree.
Another downside if we do pin is that we may not be able to use the libs in this repo in other projects. Imagine if a developer requires a specific version of glyphsLib, they may not be able to use `gftools.fix` as a dependency because gftools is pinned to an older version glyphsLib. However, Afaik, it looks like we only use gftools in font repos and not in other libraries so this may not be an issue.
I just chatted to Cosimo and he has the following idea:
The builder config.yml file must specify which version of gftools has been used. If the user’s version of gftools is incorrect, they will be asked to upgrade/downgrade it.
All the dependencies in gftools setup.py and requirements.txt which are used to build the fonts are pinned to a specific version. If we want to upgrade any of these dependencies, we must bump the dependency and also cut a new release of gftools.
Cosimo, feel free to point out any inaccuracies
@anthrotype @simoncozens @RosaWagner @vv-monsalve @alerque | non_test | builder pinning dependencies for reproducible builds at the moment we do not pin the dependencies which are used for building fonts e g fontmake by not pinning we cannot guarantee that font builds are reproducible before the builder arrived we used to build fonts by writing a shell script which glued all the tools together the font repo included a requirements txt file which contained all the dependencies pinned to a specific version this was generated by using the command pip freeze requirements txt basically each font repo had to pin its own dependencies if we decide to pin font build dependencies in gftools we cannot simply pin the dependencies in the setup py to a specific version because these dependencies may have child dependencies which are unpinned or use the “ “ syntax you must literally pin the whole dependency tree another downside if we do pin is that we may not be able to use the libs in this repo in other projects imagine if a developer requires a specific version of glyphslib they may not be able to use gftools fix as a dependency because gftools is pinned to an older version glyphslib however afaik it looks like we only use gftools in font repos and not in other libraries so this may not be an issue i just chatted to cosimo and he has the following idea the builder config yml file must specify which version of gftools has been used if the user’s version of gftools is incorrect they will be asked to upgrade downgrade it all the dependencies in gftools setup py and requirements txt which are used to build the fonts are pinned to a specific version if we want to upgrade any of these dependencies we must bump the dependency and also cut a new release of gftools cosimo feel free to point out any inaccuracies anthrotype simoncozens rosawagner vv monsalve alerque | 0 |
336,302 | 30,186,600,943 | IssuesEvent | 2023-07-04 12:33:12 | TencentBlueKing/bk-repo | https://api.github.com/repos/TencentBlueKing/bk-repo | closed | generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动 | bug frontend for test tested | <!-- Please use this template while reporting a bug and provide as much info as possible. Not doing so may result in your bug not being addressed in a timely manner. Thanks!-->
**What happened**:
generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动
**What you expected to happen**:
页面显示正常且目录树可以正常拖动改变宽度;
面包屑内容超过两个后中间文件夹名称省略,且每个文件夹名称最长限制为200px,超过显示省略号

**How to reproduce it (as minimally and precisely as possible)**:
多个文件夹名称超长的嵌套或者一个特别长的文件夹名称
**Anything else we need to know?**:

**Environment**:
- bk-repo version (use `cat VERSION` in installed dir):
- Cloud provider or hardware configuration:
- OS (e.g: `cat /etc/os-release`):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
| 2.0 | generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动 - <!-- Please use this template while reporting a bug and provide as much info as possible. Not doing so may result in your bug not being addressed in a timely manner. Thanks!-->
**What happened**:
generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动
**What you expected to happen**:
页面显示正常且目录树可以正常拖动改变宽度;
面包屑内容超过两个后中间文件夹名称省略,且每个文件夹名称最长限制为200px,超过显示省略号

**How to reproduce it (as minimally and precisely as possible)**:
多个文件夹名称超长的嵌套或者一个特别长的文件夹名称
**Anything else we need to know?**:

**Environment**:
- bk-repo version (use `cat VERSION` in installed dir):
- Cloud provider or hardware configuration:
- OS (e.g: `cat /etc/os-release`):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
| test | generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动 what happened generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动 what you expected to happen 页面显示正常且目录树可以正常拖动改变宽度; 面包屑内容超过两个后中间文件夹名称省略, ,超过显示省略号 how to reproduce it as minimally and precisely as possible 多个文件夹名称超长的嵌套或者一个特别长的文件夹名称 anything else we need to know environment bk repo version use cat version in installed dir cloud provider or hardware configuration os e g cat etc os release kernel e g uname a install tools others | 1 |
79,500 | 7,718,155,615 | IssuesEvent | 2018-05-23 15:29:42 | KhronosGroup/MoltenVK | https://api.github.com/repos/KhronosGroup/MoltenVK | closed | vkGetPhysicalDeviceImageFormatProperties returns "Success" for ETC2 formats when they are not supported | bug fixed - please test & close | To decide between BC3 and ETC2 textures for one of my projects, I first check if ETC2 is supported. I do this by calling vkGetPhysicalDeviceImageFormatProperties. On MoltenVK, this returns "Success", and fills in the structure with regular info (e.g. maxExtent=16384x16384x1). Notably, the VkPhysicalDeviceFeatures for this physical device has "false" for "textureCompressionETC2" -- so on some level it's known to be unsupported.
For completeness, I'm passing this to vkGetPhysicalDeviceImageFormatProperties:
format: VK_FORMAT_ETC2_R8G8B8_UNORM_BLOCK
type: VK_IMAGE_TYPE_2D
tiling: VK_IMAGE_TILING_OPTIMAL
usage: VK_IMAGE_USAGE_TRANSFER_SRC_BIT | VK_IMAGE_USAGE_TRANSFER_DST_BIT | VK_IMAGE_USAGE_TRANSFER_SAMPLED_BIT
flags: 0
It looks like [the implementation](https://github.com/KhronosGroup/MoltenVK/blob/d5272484acc4c5c5a70d71ce45768bc29d704167/MoltenVK/MoltenVK/GPUObjects/MVKDevice.mm#L100) isn't doing any support checks, and will always return VK_SUCCESS. | 1.0 | vkGetPhysicalDeviceImageFormatProperties returns "Success" for ETC2 formats when they are not supported - To decide between BC3 and ETC2 textures for one of my projects, I first check if ETC2 is supported. I do this by calling vkGetPhysicalDeviceImageFormatProperties. On MoltenVK, this returns "Success", and fills in the structure with regular info (e.g. maxExtent=16384x16384x1). Notably, the VkPhysicalDeviceFeatures for this physical device has "false" for "textureCompressionETC2" -- so on some level it's known to be unsupported.
For completeness, I'm passing this to vkGetPhysicalDeviceImageFormatProperties:
format: VK_FORMAT_ETC2_R8G8B8_UNORM_BLOCK
type: VK_IMAGE_TYPE_2D
tiling: VK_IMAGE_TILING_OPTIMAL
usage: VK_IMAGE_USAGE_TRANSFER_SRC_BIT | VK_IMAGE_USAGE_TRANSFER_DST_BIT | VK_IMAGE_USAGE_TRANSFER_SAMPLED_BIT
flags: 0
It looks like [the implementation](https://github.com/KhronosGroup/MoltenVK/blob/d5272484acc4c5c5a70d71ce45768bc29d704167/MoltenVK/MoltenVK/GPUObjects/MVKDevice.mm#L100) isn't doing any support checks, and will always return VK_SUCCESS. | test | vkgetphysicaldeviceimageformatproperties returns success for formats when they are not supported to decide between and textures for one of my projects i first check if is supported i do this by calling vkgetphysicaldeviceimageformatproperties on moltenvk this returns success and fills in the structure with regular info e g maxextent notably the vkphysicaldevicefeatures for this physical device has false for so on some level it s known to be unsupported for completeness i m passing this to vkgetphysicaldeviceimageformatproperties format vk format unorm block type vk image type tiling vk image tiling optimal usage vk image usage transfer src bit vk image usage transfer dst bit vk image usage transfer sampled bit flags it looks like isn t doing any support checks and will always return vk success | 1 |
291,812 | 25,177,778,484 | IssuesEvent | 2022-11-11 10:49:36 | wazuh/wazuh-qa | https://api.github.com/repos/wazuh/wazuh-qa | closed | Windows agent MSI installer disables Restart Manager and closes GUI | team/qa target/4.4.0 type/dev-testing subteam/qa-main | | Target version | Related issue | Related PR |
|--------------------|--------------------|-----------------|
| 4.4 | https://github.com/wazuh/wazuh-packages/issues/1485| https://github.com/wazuh/wazuh/pull/15207 |
<!-- Important: No section may be left blank. If not, delete it directly (in principle only Steps to reproduce could be left blank in case of not proceeding, although there are always exceptions). -->
## Description
<!-- Description that puts into context and shows the QA tester the changes that have been made by the developer and need to be tested. -->
The Wazuh agent Windows MSI installer creator now disables the Restart Manager and manually closes the GUI, as it gave some problems to upgrade when it was open.
## Proposed checks
<!-- Indicate through a list of checkboxes the suggested checks to be carried out by the QA tester -->
For every Windows supported (xp, 2012R2, 2016, 2019, 10, 2022, 11):
- [x] Agent installation
- [x] Agent upgrade from version 4.2.x and 4.3.x with open GUI
- [x] Agent upgrade from version 4.2.x and 4.3.x without an open GUI
## Expected results
<!-- Indicate expected results such as behaviors, logs... -->
The upgrade must finish without errors and the installation can't force a reboot no matter what options are chosen in the FilesInUse Dialog
| 1.0 | Windows agent MSI installer disables Restart Manager and closes GUI - | Target version | Related issue | Related PR |
|--------------------|--------------------|-----------------|
| 4.4 | https://github.com/wazuh/wazuh-packages/issues/1485| https://github.com/wazuh/wazuh/pull/15207 |
<!-- Important: No section may be left blank. If not, delete it directly (in principle only Steps to reproduce could be left blank in case of not proceeding, although there are always exceptions). -->
## Description
<!-- Description that puts into context and shows the QA tester the changes that have been made by the developer and need to be tested. -->
The Wazuh agent Windows MSI installer creator now disables the Restart Manager and manually closes the GUI, as it gave some problems to upgrade when it was open.
## Proposed checks
<!-- Indicate through a list of checkboxes the suggested checks to be carried out by the QA tester -->
For every Windows supported (xp, 2012R2, 2016, 2019, 10, 2022, 11):
- [x] Agent installation
- [x] Agent upgrade from version 4.2.x and 4.3.x with open GUI
- [x] Agent upgrade from version 4.2.x and 4.3.x without an open GUI
## Expected results
<!-- Indicate expected results such as behaviors, logs... -->
The upgrade must finish without errors and the installation can't force a reboot no matter what options are chosen in the FilesInUse Dialog
| test | windows agent msi installer disables restart manager and closes gui target version related issue related pr description the wazuh agent windows msi installer creator now disables the restart manager and manually closes the gui as it gave some problems to upgrade when it was open proposed checks for every windows supported xp agent installation agent upgrade from version x and x with open gui agent upgrade from version x and x without an open gui expected results the upgrade must finish without errors and the installation can t force a reboot no matter what options are chosen in the filesinuse dialog | 1 |
30,033 | 8,468,137,254 | IssuesEvent | 2018-10-23 18:52:36 | angular/angular-cli | https://api.github.com/repos/angular/angular-cli | closed | tslib helpers replacement causes build failure "Module parse failed: 'import' and 'export' may only appear at the top level" | comp: devkit/build-optimizer effort1: easy (hours) freq1: low severity3: broken | <!--
IF YOU DON'T FILL OUT THE FOLLOWING INFORMATION YOUR ISSUE MIGHT BE CLOSED WITHOUT INVESTIGATING
-->
### Bug Report or Feature Request (mark with an `x`)
```
- [x] bug report -> please search issues before submitting
- [ ] feature request
```
### Command (mark with an `x`)
```
- [ ] new
- [x] build
- [ ] serve
- [ ] test
- [ ] e2e
- [ ] generate
- [ ] add
- [ ] update
- [ ] lint
- [ ] xi18n
- [ ] run
- [ ] config
- [ ] help
- [ ] version
- [ ] doc
```
### Versions
<!--
Output from: `node --version`, `npm --version` and `ng --version`.
Windows (7/8/10). Linux (incl. distribution). macOS (El Capitan? Sierra? High Sierra?)
-->
npm 6.4.1
Node: 10.12.0
OS: win32 x64
Angular: 6.1.10
... animations, common, compiler, compiler-cli, core, forms
... http, language-service, platform-browser
... platform-browser-dynamic, router
Package Version
-----------------------------------------------------------
@angular-devkit/architect 0.8.5
@angular-devkit/build-angular 0.8.5
@angular-devkit/build-optimizer 0.8.5
@angular-devkit/build-webpack 0.8.5
@angular-devkit/core 0.8.4
@angular-devkit/schematics 0.8.4
@angular/cli 6.2.4
@ngtools/json-schema 1.1.0
@ngtools/webpack 6.2.5
@schematics/angular 0.8.4
@schematics/update 0.8.4
ng-packagr 4.3.0
rxjs 6.3.3
typescript 2.9.2
webpack 4.20.2
### Repro steps
<!--
Simple steps to reproduce this bug.
Please include: commands run (incl args), packages added, related code changes.
A link to a sample repo would help too.
-->
using a library with the following code
```
/*! Built with http://stenciljs.com */
export default function appGlobal(namespace, Context, window, document, resourcesUrl, hydratedCssClass) {(function(resourcesUrl){
var __extends = (this && this.__extends) || (function () {
var extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
return function (d, b) {
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
```
optimizer replaces __extends in the wrong place, the result is
```
Module parse failed: 'import' and 'export' may only appear at the top level (3:8)
You may need an appropriate loader to handle this file type.
| export default function appGlobal(namespace, Context, window, document, resourcesUrl, hydratedCssClass) {
| (function (resourcesUrl) {
> import { __extends } from "tslib";
| /*!
```
### Desired functionality
<!--
What would like to see implemented?
What is the usecase?
-->
Can you put the imports uniquely at the top of the file?
| 1.0 | tslib helpers replacement causes build failure "Module parse failed: 'import' and 'export' may only appear at the top level" - <!--
IF YOU DON'T FILL OUT THE FOLLOWING INFORMATION YOUR ISSUE MIGHT BE CLOSED WITHOUT INVESTIGATING
-->
### Bug Report or Feature Request (mark with an `x`)
```
- [x] bug report -> please search issues before submitting
- [ ] feature request
```
### Command (mark with an `x`)
```
- [ ] new
- [x] build
- [ ] serve
- [ ] test
- [ ] e2e
- [ ] generate
- [ ] add
- [ ] update
- [ ] lint
- [ ] xi18n
- [ ] run
- [ ] config
- [ ] help
- [ ] version
- [ ] doc
```
### Versions
<!--
Output from: `node --version`, `npm --version` and `ng --version`.
Windows (7/8/10). Linux (incl. distribution). macOS (El Capitan? Sierra? High Sierra?)
-->
npm 6.4.1
Node: 10.12.0
OS: win32 x64
Angular: 6.1.10
... animations, common, compiler, compiler-cli, core, forms
... http, language-service, platform-browser
... platform-browser-dynamic, router
Package Version
-----------------------------------------------------------
@angular-devkit/architect 0.8.5
@angular-devkit/build-angular 0.8.5
@angular-devkit/build-optimizer 0.8.5
@angular-devkit/build-webpack 0.8.5
@angular-devkit/core 0.8.4
@angular-devkit/schematics 0.8.4
@angular/cli 6.2.4
@ngtools/json-schema 1.1.0
@ngtools/webpack 6.2.5
@schematics/angular 0.8.4
@schematics/update 0.8.4
ng-packagr 4.3.0
rxjs 6.3.3
typescript 2.9.2
webpack 4.20.2
### Repro steps
<!--
Simple steps to reproduce this bug.
Please include: commands run (incl args), packages added, related code changes.
A link to a sample repo would help too.
-->
using a library with the following code
```
/*! Built with http://stenciljs.com */
export default function appGlobal(namespace, Context, window, document, resourcesUrl, hydratedCssClass) {(function(resourcesUrl){
var __extends = (this && this.__extends) || (function () {
var extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
return function (d, b) {
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
```
optimizer replaces __extends in the wrong place, the result is
```
Module parse failed: 'import' and 'export' may only appear at the top level (3:8)
You may need an appropriate loader to handle this file type.
| export default function appGlobal(namespace, Context, window, document, resourcesUrl, hydratedCssClass) {
| (function (resourcesUrl) {
> import { __extends } from "tslib";
| /*!
```
### Desired functionality
<!--
What would like to see implemented?
What is the usecase?
-->
Can you put the imports uniquely at the top of the file?
| non_test | tslib helpers replacement causes build failure module parse failed import and export may only appear at the top level if you don t fill out the following information your issue might be closed without investigating bug report or feature request mark with an x bug report please search issues before submitting feature request command mark with an x new build serve test generate add update lint run config help version doc versions output from node version npm version and ng version windows linux incl distribution macos el capitan sierra high sierra npm node os angular animations common compiler compiler cli core forms http language service platform browser platform browser dynamic router package version angular devkit architect angular devkit build angular angular devkit build optimizer angular devkit build webpack angular devkit core angular devkit schematics angular cli ngtools json schema ngtools webpack schematics angular schematics update ng packagr rxjs typescript webpack repro steps simple steps to reproduce this bug please include commands run incl args packages added related code changes a link to a sample repo would help too using a library with the following code built with export default function appglobal namespace context window document resourcesurl hydratedcssclass function resourcesurl var extends this this extends function var extendstatics object setprototypeof proto instanceof array function d b d proto b function d b for var p in b if b hasownproperty p d b return function d b extendstatics d b function this constructor d d prototype b null object create b prototype b prototype new optimizer replaces extends in the wrong place the result is module parse failed import and export may only appear at the top level you may need an appropriate loader to handle this file type export default function appglobal namespace context window document resourcesurl hydratedcssclass function resourcesurl import extends from tslib desired functionality 
what would like to see implemented what is the usecase can you put the imports uniquely at the top of the file | 0 |
545,671 | 15,954,907,520 | IssuesEvent | 2021-04-15 14:04:17 | OpenNebula/one | https://api.github.com/repos/OpenNebula/one | closed | Service template registration time is not added when cloning an existing one | Category: Orchestration - Flow Priority: Normal Sponsored Status: Accepted Type: Bug | **Description**
When cloning an existing service template that have no registration time the time will not add any registration timestamp to the new template.
**To Reproduce**
1. Clone a service template from a template without registration time.
2. Verify JSON or Sunstone.
**Expected behavior**
Registration should be added to the new service template for new created or cloned service templates
**Details**
- Affected Component: OneFlow server
- Version: 6.0.0
<!--////////////////////////////////////////////-->
<!-- THIS SECTION IS FOR THE DEVELOPMENT TEAM -->
<!-- BOTH FOR BUGS AND ENHANCEMENT REQUESTS -->
<!-- PROGRESS WILL BE REFLECTED HERE -->
<!--////////////////////////////////////////////-->
## Progress Status
- [ ] Branch created
- [ ] Code committed to development branch
- [ ] Testing - QA
- [ ] Documentation
- [ ] Release notes - resolved issues, compatibility, known issues
- [ ] Code committed to upstream release/hotfix branches
- [ ] Documentation committed to upstream release/hotfix branches
| 1.0 | Service template registration time is not added when cloning an existing one - **Description**
When cloning an existing service template that have no registration time the time will not add any registration timestamp to the new template.
**To Reproduce**
1. Clone a service template from a template without registration time.
2. Verify JSON or Sunstone.
**Expected behavior**
Registration should be added to the new service template for new created or cloned service templates
**Details**
- Affected Component: OneFlow server
- Version: 6.0.0
<!--////////////////////////////////////////////-->
<!-- THIS SECTION IS FOR THE DEVELOPMENT TEAM -->
<!-- BOTH FOR BUGS AND ENHANCEMENT REQUESTS -->
<!-- PROGRESS WILL BE REFLECTED HERE -->
<!--////////////////////////////////////////////-->
## Progress Status
- [ ] Branch created
- [ ] Code committed to development branch
- [ ] Testing - QA
- [ ] Documentation
- [ ] Release notes - resolved issues, compatibility, known issues
- [ ] Code committed to upstream release/hotfix branches
- [ ] Documentation committed to upstream release/hotfix branches
| non_test | service template registration time is not added when cloning an existing one description when cloning an existing service template that have no registration time the time will not add any registration timestamp to the new template to reproduce clone a service template from a template without registration time verify json or sunstone expected behavior registration should be added to the new service template for new created or cloned service templates details affected component oneflow server version progress status branch created code committed to development branch testing qa documentation release notes resolved issues compatibility known issues code committed to upstream release hotfix branches documentation committed to upstream release hotfix branches | 0 |
14,047 | 3,219,725,598 | IssuesEvent | 2015-10-08 11:23:38 | google/WebFundamentals | https://api.github.com/repos/google/WebFundamentals | closed | Material design: scrolling page, header bar jumps instead of transitions | P2 type-Design | Scrolling the page, the header quickly jumps to its smaller fixed position. It doesn't feel right and isn't the MD spec. Instead, the bar should ease/transition to it's final state.
Something like:
https://elements.polymer-project.org/elements/paper-scroll-header-panel?view=demo:demo/index.html | 1.0 | Material design: scrolling page, header bar jumps instead of transitions - Scrolling the page, the header quickly jumps to its smaller fixed position. It doesn't feel right and isn't the MD spec. Instead, the bar should ease/transition to it's final state.
Something like:
https://elements.polymer-project.org/elements/paper-scroll-header-panel?view=demo:demo/index.html | non_test | material design scrolling page header bar jumps instead of transitions scrolling the page the header quickly jumps to its smaller fixed position it doesn t feel right and isn t the md spec instead the bar should ease transition to it s final state something like | 0 |
38,686 | 5,195,884,875 | IssuesEvent | 2017-01-23 10:52:52 | DynamoRIO/dynamorio | https://api.github.com/repos/DynamoRIO/dynamorio | closed | unit_tests failing: kernel32_file.c:2378 test_file_times | Bug-Assert Component-Tests GoodContrib GoodFirstBug OpSys-Windows | http://dynamorio.org/CDash/testDetails.php?test=178469&build=16800
http://dynamorio.org/CDash/testDetails.php?test=178476&build=16801
EXPECT failed at D:\derek\dr\nightly\src\core\win32\drwinapi\kernel32_file.c:2378 in test test_file_times: memcmp(&st_ours, &st_native, sizeof(st_ours))
There were no commits recently and the failure was on Dec 31. Perhaps there's some end-of-year issue: maybe a bug in redirect_FileTimeToSystemTime() in computing the year. | 1.0 | unit_tests failing: kernel32_file.c:2378 test_file_times - http://dynamorio.org/CDash/testDetails.php?test=178469&build=16800
http://dynamorio.org/CDash/testDetails.php?test=178476&build=16801
EXPECT failed at D:\derek\dr\nightly\src\core\win32\drwinapi\kernel32_file.c:2378 in test test_file_times: memcmp(&st_ours, &st_native, sizeof(st_ours))
There were no commits recently and the failure was on Dec 31. Perhaps there's some end-of-year issue: maybe a bug in redirect_FileTimeToSystemTime() in computing the year. | test | unit tests failing file c test file times expect failed at d derek dr nightly src core drwinapi file c in test test file times memcmp st ours st native sizeof st ours there were no commits recently and the failure was on dec perhaps there s some end of year issue maybe a bug in redirect filetimetosystemtime in computing the year | 1 |
11,504 | 13,494,804,454 | IssuesEvent | 2020-09-11 22:15:05 | KazDragon/munin | https://api.github.com/repos/KazDragon/munin | closed | Migrate from INSTANTIATE_TEST_CASE_P | Compatibility | Google test has deprecated this in favour of INSTANTIATE_TEST_SUITE_P. | True | Migrate from INSTANTIATE_TEST_CASE_P - Google test has deprecated this in favour of INSTANTIATE_TEST_SUITE_P. | non_test | migrate from instantiate test case p google test has deprecated this in favour of instantiate test suite p | 0 |
466,175 | 13,398,146,994 | IssuesEvent | 2020-09-03 12:44:50 | googleapis/google-cloud-php | https://api.github.com/repos/googleapis/google-cloud-php | closed | Synthesis failed for oslogin | api: oslogin autosynth failure priority: p1 type: bug | Hello! Autosynth couldn't regenerate oslogin. :broken_heart:
Here's the output from running `synth.py`:
```
c_generator_python/requirements.txt (line 4))
Using cached https://files.pythonhosted.org/packages/30/9e/f663a2aa66a09d838042ae1a2c5659828bb9b41ea3a6efa20a20fd92b121/Jinja2-2.11.2-py2.py3-none-any.whl
Saved ./Jinja2-2.11.2-py2.py3-none-any.whl
Collecting MarkupSafe==1.1.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 5))
Using cached https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Saved ./MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting protobuf==3.13.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 6))
Using cached https://files.pythonhosted.org/packages/30/79/510974552cebff2ba04038544799450defe75e96ea5f1675dbf72cc8744f/protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Saved ./protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pypandoc==1.5 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 7))
Using cached https://files.pythonhosted.org/packages/d6/b7/5050dc1769c8a93d3ec7c4bd55be161991c94b8b235f88bf7c764449e708/pypandoc-1.5.tar.gz
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmpfs/tmp/tmpg8yutz55/setuptools-tmp/setuptools/__init__.py", line 6, in <module>
import distutils.core
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/_distutils_hack/__init__.py", line 83, in create_module
return importlib.import_module('setuptools._distutils')
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
ModuleNotFoundError: No module named 'setuptools._distutils'
----------------------------------------
(Command "python setup.py egg_info" failed with error code 1 in /tmpfs/tmp/pip-build-izvkik37/pypandoc/
)
ERROR: no such package '@gapic_generator_python_pip_deps//': pip_import failed: Collecting click==7.1.2 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/d2/3d/fa76db83bf75c4f8d338c2fd15c8d33fdd7ad23a9b5e57eb6c5de26b430e/click-7.1.2-py2.py3-none-any.whl
Saved ./click-7.1.2-py2.py3-none-any.whl
Collecting google-api-core==1.22.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 2))
Using cached https://files.pythonhosted.org/packages/e0/2d/7c6c75013105e1d2b6eaa1bf18a56995be1dbc673c38885aea31136e9918/google_api_core-1.22.1-py2.py3-none-any.whl
Saved ./google_api_core-1.22.1-py2.py3-none-any.whl
Collecting googleapis-common-protos==1.52.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 3))
Using cached https://files.pythonhosted.org/packages/03/74/3956721ea1eb4bcf7502a311fdaa60b85bd751de4e57d1943afe9b334141/googleapis_common_protos-1.52.0-py2.py3-none-any.whl
Saved ./googleapis_common_protos-1.52.0-py2.py3-none-any.whl
Collecting jinja2==2.11.2 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 4))
Using cached https://files.pythonhosted.org/packages/30/9e/f663a2aa66a09d838042ae1a2c5659828bb9b41ea3a6efa20a20fd92b121/Jinja2-2.11.2-py2.py3-none-any.whl
Saved ./Jinja2-2.11.2-py2.py3-none-any.whl
Collecting MarkupSafe==1.1.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 5))
Using cached https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Saved ./MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting protobuf==3.13.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 6))
Using cached https://files.pythonhosted.org/packages/30/79/510974552cebff2ba04038544799450defe75e96ea5f1675dbf72cc8744f/protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Saved ./protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pypandoc==1.5 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 7))
Using cached https://files.pythonhosted.org/packages/d6/b7/5050dc1769c8a93d3ec7c4bd55be161991c94b8b235f88bf7c764449e708/pypandoc-1.5.tar.gz
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmpfs/tmp/tmpg8yutz55/setuptools-tmp/setuptools/__init__.py", line 6, in <module>
import distutils.core
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/_distutils_hack/__init__.py", line 83, in create_module
return importlib.import_module('setuptools._distutils')
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
ModuleNotFoundError: No module named 'setuptools._distutils'
----------------------------------------
(Command "python setup.py egg_info" failed with error code 1 in /tmpfs/tmp/pip-build-izvkik37/pypandoc/
)
INFO: Elapsed time: 2.362s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
FAILED: Build did NOT complete successfully (0 packages loaded)
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/google-cloud-php/OsLogin/synth.py", line 29, in <module>
bazel_target='//google/cloud/oslogin/v1beta:google-cloud-oslogin-v1beta-php'
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 58, in php_library
return self._generate_code(service, version, "php", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 183, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/oslogin/v1beta:google-cloud-oslogin-v1beta-php']' returned non-zero exit status 1.
2020-09-02 03:03:25,205 autosynth [ERROR] > Synthesis failed
2020-09-02 03:03:25,206 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 8038912e chore(deps): update dependency google/cloud-tools to ^0.11.0 (#3363)
2020-09-02 03:03:25,239 autosynth [DEBUG] > Running: git checkout autosynth-oslogin
Switched to branch 'autosynth-oslogin'
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 690, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 539, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 670, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 375, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 273, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
```
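The root cause in the log above is the `ModuleNotFoundError: No module named 'setuptools._distutils'` raised while `_distutils_hack` redirects `import distutils` through `importlib.import_module`. A minimal sketch of that probe-for-a-module pattern (the module names below are illustrative; whether `setuptools._distutils` imports depends on the installed setuptools version):

```python
import importlib


def can_import(name):
    """Return True if `name` is importable, mirroring the failing
    importlib.import_module call in the traceback above."""
    try:
        importlib.import_module(name)
        return True
    except ModuleNotFoundError:
        return False


print(can_import("math"))                  # -> True (stdlib is always present)
print(can_import("no_such_module_xyz"))    # -> False
# In the broken pip sandbox above, this probe was effectively False:
print(can_import("setuptools._distutils"))  # depends on the installed setuptools
```

Pinning a setuptools release that ships `_distutils` (or setting the documented `SETUPTOOLS_USE_DISTUTILS=stdlib` escape hatch on Pythons that still bundle distutils) is the usual way out of this failure mode.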
Google internal developers can see the full log [here](http://sponge2/results/invocations/c2163f87-9245-498a-8edb-c7613ed58dfe/targets/github%2Fsynthtool;config=default/tests;query=google-cloud-php;failed=false).
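Both tracebacks terminate in `subprocess.CalledProcessError`: synthtool runs the child process with `subprocess.run(..., encoding="utf-8")` and then calls `check_returncode()`. A minimal sketch of that pattern (the child command here is a stand-in for the real bazel/synthtool invocation):

```python
import subprocess
import sys

# Run a child process that exits non-zero, standing in for the failing
# `bazel build` call from the log above.
proc = subprocess.run(
    [sys.executable, "-c", "import sys; sys.exit(1)"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    encoding="utf-8",
)

try:
    proc.check_returncode()  # Raises CalledProcessError on non-zero exit.
except subprocess.CalledProcessError as exc:
    # autosynth logs this and re-raises, which is the traceback shown above.
    print(f"command failed with exit status {exc.returncode}")  # -> exit status 1
```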
| 1.0 | Synthesis failed for oslogin - Hello! Autosynth couldn't regenerate oslogin. :broken_heart:
| non_test | 0 |
279,530 | 30,706,474,413 | IssuesEvent | 2023-07-27 06:38:43 | amaybaum-dev/vprofile-project2 | https://api.github.com/repos/amaybaum-dev/vprofile-project2 | opened | CVE-2015-4473 (Critical) detected in jackson-databind-2.9.10.4.jar | Mend: dependency security vulnerability | ## CVE-2015-4473 - Critical Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.10.4.jar</b></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.10.4/jackson-databind-2.9.10.4.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.10.4.jar** (Vulnerable Library)
<p>Found in base branch: <b>vp-rem</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Multiple unspecified vulnerabilities in the browser engine in Mozilla Firefox before 40.0 and Firefox ESR 38.x before 38.2 allow remote attackers to cause a denial of service (memory corruption and application crash) or possibly execute arbitrary code via unknown vectors.
<p>Publish Date: 2015-08-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-4473>CVE-2015-4473</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
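The metric values listed above determine the numeric base score through the CVSS v3.1 equations. A rough sketch, using the published metric weights and a simplified round-up (the spec's `Roundup` adds a floating-point guard, which agrees for these inputs); it reproduces both the 9.8 above and the 7.5 reported for the is-svg advisory further down:

```python
import math

# CVSS v3.1 metric weights (only the values needed for these two vectors).
AV = {"N": 0.85}              # Attack Vector: Network
AC = {"L": 0.77}              # Attack Complexity: Low
PR = {"N": 0.85}              # Privileges Required: None (scope unchanged)
UI = {"N": 0.85}              # User Interaction: None
CIA = {"H": 0.56, "N": 0.0}   # Confidentiality / Integrity / Availability


def base_score(av, ac, pr, ui, c, i, a):
    """Base score for scope-unchanged vectors, per the CVSS v3.1 equations."""
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    return math.ceil(min(impact + exploitability, 10) * 10) / 10


# This advisory: AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # -> 9.8
# The is-svg advisory below: same exploitability, only Availability impacted
print(base_score("N", "L", "N", "N", "N", "N", "H"))  # -> 7.5
```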
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="test">test</a></p>
<p>Release Date: 2015-08-16</p>
<p>Fix Resolution: test</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue | True | CVE-2015-4473 (Critical) detected in jackson-databind-2.9.10.4.jar - ## CVE-2015-4473 - Critical Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.10.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.10.4/jackson-databind-2.9.10.4.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.10.4.jar** (Vulnerable Library)
<p>Found in base branch: <b>vp-rem</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Multiple unspecified vulnerabilities in the browser engine in Mozilla Firefox before 40.0 and Firefox ESR 38.x before 38.2 allow remote attackers to cause a denial of service (memory corruption and application crash) or possibly execute arbitrary code via unknown vectors.
<p>Publish Date: 2015-08-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-4473>CVE-2015-4473</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="test">test</a></p>
<p>Release Date: 2015-08-16</p>
<p>Fix Resolution: test</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue | non_test | cve critical detected in jackson databind jar cve critical severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in base branch vp rem vulnerability details multiple unspecified vulnerabilities in the browser engine in mozilla firefox before and firefox esr x before allow remote attackers to cause a denial of service memory corruption and application crash or possibly execute arbitrary code via unknown vectors publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin test release date fix resolution test rescue worker helmet automatic remediation is available for this issue | 0 |
101,916 | 16,534,798,275 | IssuesEvent | 2021-05-27 10:30:59 | CatalystOne/ngx-jira-issue-collector | https://api.github.com/repos/CatalystOne/ngx-jira-issue-collector | opened | CVE-2021-28092 (High) detected in is-svg-3.0.0.tgz | security vulnerability | ## CVE-2021-28092 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>is-svg-3.0.0.tgz</b></summary>
<p>Check if a string or buffer is SVG</p>
<p>Library home page: <a href="https://registry.npmjs.org/is-svg/-/is-svg-3.0.0.tgz">https://registry.npmjs.org/is-svg/-/is-svg-3.0.0.tgz</a></p>
<p>Path to dependency file: ngx-jira-issue-collector/package.json</p>
<p>Path to vulnerable library: ngx-jira-issue-collector/node_modules/is-svg/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.901.7.tgz (Root Library)
- cssnano-4.1.10.tgz
- cssnano-preset-default-4.0.7.tgz
- postcss-svgo-4.0.2.tgz
- :x: **is-svg-3.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/CatalystOne/ngx-jira-issue-collector/commit/fd73bf999713801f06b8d55964e8e03ae1759dc2">fd73bf999713801f06b8d55964e8e03ae1759dc2</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The is-svg package 2.1.0 through 4.2.1 for Node.js uses a regular expression that is vulnerable to Regular Expression Denial of Service (ReDoS). If an attacker provides a malicious string, is-svg will get stuck processing the input for a very long time.
<p>Publish Date: 2021-03-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28092>CVE-2021-28092</a></p>
</p>
</details>
<p></p>
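The ReDoS described above comes from a regular expression whose nested quantifiers backtrack exponentially on almost-matching input. A toy illustration of that failure mode (this is not the actual is-svg pattern, just the same pathology in miniature):

```python
import re
import time

# Nested quantifier: (a+)+ can split a run of "a"s in exponentially many
# ways, and every split is retried once the required trailing "b" fails.
evil = re.compile(r"^(a+)+b$")

payload = "a" * 20 + "!"   # almost matches, but the final "b" never appears
start = time.perf_counter()
match = evil.match(payload)
elapsed = time.perf_counter() - start

print(match)  # -> None, reached only after ~2**19 backtracking attempts
print(f"rejecting {len(payload)} characters took {elapsed:.3f}s")
```

Lengthening the run of `a`s roughly doubles the work per character, which is why an attacker-supplied string can stall the process "for a very long time" as the advisory puts it; the fix in is-svg 4.2.2 replaces the vulnerable expression.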
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28092">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28092</a></p>
<p>Release Date: 2021-03-12</p>
<p>Fix Resolution: v4.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | non_test | 0 |
152,984 | 12,132,223,860 | IssuesEvent | 2020-04-23 06:49:46 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | roachtest: acceptance/gossip/restart-node-one failed | C-test-failure O-roachtest O-robot branch-release-19.2 release-blocker | [(roachtest).acceptance/gossip/restart-node-one failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1894808&tab=buildLog) on [release-19.2@4d7486c6f678f6c34c33a150d69cbf6355cb2107](https://github.com/cockroachdb/cockroach/commits/4d7486c6f678f6c34c33a150d69cbf6355cb2107):
```
The test failed on branch=release-19.2, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/acceptance/gossip/restart-node-one/run_1
gossip.go:230,gossip.go:439,acceptance.go:91,test_runner.go:753: dial tcp 127.0.0.1:5432: connect: connection refused
cluster.go:1481,context.go:135,cluster.go:1470,test_runner.go:825: dead node detection: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod monitor teamcity-1894808-1587624410-07-n4cpu4 --oneshot --ignore-empty-nodes: exit status 1 4: 4013
1: dead
3: 4080
2: 3946
Error: UNCLASSIFIED_PROBLEM: 1: dead
(1) UNCLASSIFIED_PROBLEM
Wraps: (2) 1: dead
| main.glob..func13
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1129
| main.wrap.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:272
| github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra.(*Command).execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:766
| github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra.(*Command).ExecuteC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:852
| github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra.(*Command).Execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:800
| main.main
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1793
| runtime.main
| /usr/local/go/src/runtime/proc.go:203
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1357
Error types: (1) errors.Unclassified (2) *errors.fundamental
```
<details><summary>More</summary><p>
Artifacts: [/acceptance/gossip/restart-node-one](https://teamcity.cockroachdb.com/viewLog.html?buildId=1894808&tab=artifacts#/acceptance/gossip/restart-node-one)
Related:
- #47808 roachtest: acceptance/gossip/restart-node-one failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Aacceptance%2Fgossip%2Frestart-node-one.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| 2.0 | roachtest: acceptance/gossip/restart-node-one failed - [(roachtest).acceptance/gossip/restart-node-one failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1894808&tab=buildLog) on [release-19.2@4d7486c6f678f6c34c33a150d69cbf6355cb2107](https://github.com/cockroachdb/cockroach/commits/4d7486c6f678f6c34c33a150d69cbf6355cb2107):
```
The test failed on branch=release-19.2, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/acceptance/gossip/restart-node-one/run_1
gossip.go:230,gossip.go:439,acceptance.go:91,test_runner.go:753: dial tcp 127.0.0.1:5432: connect: connection refused
cluster.go:1481,context.go:135,cluster.go:1470,test_runner.go:825: dead node detection: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod monitor teamcity-1894808-1587624410-07-n4cpu4 --oneshot --ignore-empty-nodes: exit status 1 4: 4013
1: dead
3: 4080
2: 3946
Error: UNCLASSIFIED_PROBLEM: 1: dead
(1) UNCLASSIFIED_PROBLEM
Wraps: (2) 1: dead
| main.glob..func13
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1129
| main.wrap.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:272
| github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra.(*Command).execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:766
| github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra.(*Command).ExecuteC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:852
| github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra.(*Command).Execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:800
| main.main
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1793
| runtime.main
| /usr/local/go/src/runtime/proc.go:203
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1357
Error types: (1) errors.Unclassified (2) *errors.fundamental
```
<details><summary>More</summary><p>
Artifacts: [/acceptance/gossip/restart-node-one](https://teamcity.cockroachdb.com/viewLog.html?buildId=1894808&tab=artifacts#/acceptance/gossip/restart-node-one)
Related:
- #47808 roachtest: acceptance/gossip/restart-node-one failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Aacceptance%2Fgossip%2Frestart-node-one.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| test | roachtest acceptance gossip restart node one failed on the test failed on branch release cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts acceptance gossip restart node one run gossip go gossip go acceptance go test runner go dial tcp connect connection refused cluster go context go cluster go test runner go dead node detection home agent work go src github com cockroachdb cockroach bin roachprod monitor teamcity oneshot ignore empty nodes exit status dead error unclassified problem dead unclassified problem wraps dead main glob home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go main wrap home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go github com cockroachdb cockroach vendor github com cobra command execute home agent work go src github com cockroachdb cockroach vendor github com cobra command go github com cockroachdb cockroach vendor github com cobra command executec home agent work go src github com cockroachdb cockroach vendor github com cobra command go github com cockroachdb cockroach vendor github com cobra command execute home agent work go src github com cockroachdb cockroach vendor github com cobra command go main main home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go runtime main usr local go src runtime proc go runtime goexit usr local go src runtime asm s error types errors unclassified errors fundamental more artifacts related roachtest acceptance gossip restart node one failed powered by | 1 |
267,158 | 23,285,769,625 | IssuesEvent | 2022-08-05 16:19:16 | pytorch/pytorch | https://api.github.com/repos/pytorch/pytorch | closed | TestNNAPI fails when not build with QNNPACK | module: nn module: tests triaged | ### 🐛 Describe the bug
Since https://github.com/pytorch/pytorch/commit/3802edd9ab7823b2400908b6dc364c7257f7e9ff there is a TestNNAPI test case which in its setUp unconditionally loads QNNPACK: https://github.com/pytorch/pytorch/blob/3802edd9ab7823b2400908b6dc364c7257f7e9ff/test/test_nnapi.py#L25
So when PyTorch is built without it this will hard error:
```
======================================================================
ERROR: test_unsqueeze (jit.test_backend_nnapi.TestNnapiBackend)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/dev/shm/s3248973-EasyBuild/PyTorch/1.10.0/foss-2021a/pytorch/test/jit/test_backend_nnapi.py", line 37, in setUp
super().setUp()
File "/dev/shm/s3248973-EasyBuild/PyTorch/1.10.0/foss-2021a/pytorch/test/test_nnapi.py", line 25, in setUp
torch.backends.quantized.engine = 'qnnpack'
File "/tmp/easybuild-tmp/eb-5FO9wX/tmpIMOoeD/lib/python3.9/site-packages/torch/backends/quantized/__init__.py", line 29, in __set__
torch._C._set_qengine(_get_qengine_id(val))
RuntimeError: quantized engine QNNPACK is not supported
```
I'm not sure what the correct way to solve this here is:
- Skip the test if no QNNPACK is there
- Don't set it there if it isn't available
### Versions
Affected is 1.9.0 and higher.
cc @albanD @mruberry @jbschlosser @walterddr @kshitij12345 @saketh-are | 1.0 | TestNNAPI fails when not build with QNNPACK - ### 🐛 Describe the bug
Since https://github.com/pytorch/pytorch/commit/3802edd9ab7823b2400908b6dc364c7257f7e9ff there is a TestNNAPI test case which in its setUp unconditionally loads QNNPACK: https://github.com/pytorch/pytorch/blob/3802edd9ab7823b2400908b6dc364c7257f7e9ff/test/test_nnapi.py#L25
So when PyTorch is built without it this will hard error:
```
======================================================================
ERROR: test_unsqueeze (jit.test_backend_nnapi.TestNnapiBackend)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/dev/shm/s3248973-EasyBuild/PyTorch/1.10.0/foss-2021a/pytorch/test/jit/test_backend_nnapi.py", line 37, in setUp
super().setUp()
File "/dev/shm/s3248973-EasyBuild/PyTorch/1.10.0/foss-2021a/pytorch/test/test_nnapi.py", line 25, in setUp
torch.backends.quantized.engine = 'qnnpack'
File "/tmp/easybuild-tmp/eb-5FO9wX/tmpIMOoeD/lib/python3.9/site-packages/torch/backends/quantized/__init__.py", line 29, in __set__
torch._C._set_qengine(_get_qengine_id(val))
RuntimeError: quantized engine QNNPACK is not supported
```
I'm not sure what the correct way to solve this here is:
- Skip the test if no QNNPACK is there
- Don't set it there if it isn't available
### Versions
Affected is 1.9.0 and higher.
cc @albanD @mruberry @jbschlosser @walterddr @kshitij12345 @saketh-are | test | testnnapi fails when not build with qnnpack 🐛 describe the bug since there is a testnnapi test case which in its setup unconditionally loads qnnpack so when pytorch is built without it this will hard error error test unsqueeze jit test backend nnapi testnnapibackend traceback most recent call last file dev shm easybuild pytorch foss pytorch test jit test backend nnapi py line in setup super setup file dev shm easybuild pytorch foss pytorch test test nnapi py line in setup torch backends quantized engine qnnpack file tmp easybuild tmp eb tmpimooed lib site packages torch backends quantized init py line in set torch c set qengine get qengine id val runtimeerror quantized engine qnnpack is not supported i m not sure what the correct way to solve this here is skip the test if no qnnpack is there don t set it there if it isn t available versions affected is and higher cc alband mruberry jbschlosser walterddr saketh are | 1 |
306,563 | 23,164,755,378 | IssuesEvent | 2022-07-29 22:36:49 | bucanero/apollo-vita | https://api.github.com/repos/bucanero/apollo-vita | opened | PSP save game keys | documentation | ## PSP game keys
Save game keys to decrypt PSP save-games.
> Users are welcome to submit dumped keys, either binary files or fingerprint results from Apollo.
```
NPJH70001=60cf88e45f1a49f1f958d9b82064d283
NPJH70002=60cf88e45f1a49f1f958d9b82064d283
ULUS10104=47474a47414d4549444143534e554d00
ULUS10244=d88ea9782d50e4659885a2642eeee9cc
ULUS10455=307830312c307865662c307836662c30
ULUS10495=01af6f00020070d52e2412c7e1ff83ba
ULUS10519=01fa6f00020007d5e22412c71eff83ab
ULUS10567=7d7a0339d82e05e56e5cdcc5e1cbdc6d
```
| 1.0 | PSP save game keys - ## PSP game keys
Save game keys to decrypt PSP save-games.
> Users are welcome to submit dumped keys, either binary files or fingerprint results from Apollo.
```
NPJH70001=60cf88e45f1a49f1f958d9b82064d283
NPJH70002=60cf88e45f1a49f1f958d9b82064d283
ULUS10104=47474a47414d4549444143534e554d00
ULUS10244=d88ea9782d50e4659885a2642eeee9cc
ULUS10455=307830312c307865662c307836662c30
ULUS10495=01af6f00020070d52e2412c7e1ff83ba
ULUS10519=01fa6f00020007d5e22412c71eff83ab
ULUS10567=7d7a0339d82e05e56e5cdcc5e1cbdc6d
```
| non_test | psp save game keys psp game keys save game keys to decrypt psp save games users are welcome to submit dumped keys either binary files or fingerprint results from apollo | 0 |
215,323 | 16,664,571,615 | IssuesEvent | 2021-06-06 23:30:59 | backend-br/vagas | https://api.github.com/repos/backend-br/vagas | closed | [São Paulo] [Remoto] Back-end Developer @Copastur | .NET ASP Alocado Angular CLT MVC NodeJS PJ Pleno Remoto SQL Testes Unitários | ## Nossa empresa
Trabalho em uma empresa de recrutamento humanizado que se chama HunterCo, possuímos várias vagas e consultores que acompanham os candidatos no recrutamento, auxiliando-o e dando feedback's.
Sobre a empresa contratante:
Nos diferenciamos ao cuidar das pessoas e de suas experiências, sejam clientes, fornecedores, colaboradores, parceiros ou sociedade.
Somos reconhecidos como um ótimo lugar para se trabalhar com a premiação GPTW (Great Place to Work), certificados pela ISO 9001 da qualidade e a única empresa do segmento que conta com o certificado ISO 14001 de sustentabilidade.
## Descrição da vaga
Realizar levantamento de requisitos com o cliente final (interno ou externo);
Desenvolver o código-fonte necessário para que os requisitos sejam atendidos na sua totalidade.
Criar testes unitários para garantir a integridade do código-fonte e facilitar a refatoração.
Atuar junto a equipe de TI para retirar os impedimentos.
Elaborar arquitetura dos sistemas.
Auxiliar os demais Analistas de Desenvolvimento.
Visão analítica das demandas
## Local
São Paulo ou remoto
## Requisitos
**Obrigatórios:**
Experiência em desenvolvimento de softwares.
Ensino superior
Boa comunicação
Experiência em desenvolvimento nas plataformas Microsoft: C#, Dot.net, ASP. net MVC, SQL server, angularJS, Web API, WCF.
**Desejáveis:**
Experiência com metodologias ágeis, metodologias MVP, angular 4.0, API Restfull, Ionic framework, NodeJS, React
Conhecer o mercado de turismo
Inglês
**Diferenciais:**
Possibilidade de Home Office
## Benefícios
VR - para todo mundo se alimentar bem!
VT - o valor necessário para a sua vinda ao trabalho;
Plano de saúde - Bradesco;,Plano odontológico - Bradesco;
Seguro de Vida - Zurich;
Pacote de Seguros com vantagens especiais; automóvel, residencial, pet, equipamentos portáteis;
GymPass - movimente-se!
Allya - descontos diversos para o seu salário render mais,
Apoio Pass - suportes diversos ao colaborador;
Mindfulness - começando o dia bem com meditação;
Frutas - todos os dias no escritório;
Ginástica laboral - nos alongamos com música e diversão;
Incentivo a vir de bike - bicicletário e chuveiros;
Licença Paternidade - para aquele que se torna papai, 30 dias em casa com seu(sua) filho(a).
## Contratação
combinar
## Como se candidatar
Cadastre-se no link abaixo:
https://vagas.hunterco.com.br/job/609c1980933a3f001d1f29bc?c=4019
## Tempo médio de feedbacks
Costumamos enviar feedbacks em até 05 dias após cada processo.
E-mail para contato em caso de não haver resposta: janaina.nessi@parceirohunterco.com.br
## Labels
#### Alocação
- Alocado
- Remoto
#### Regime
- CLT
- PJ
#### Nível
- Pleno
| 1.0 | [São Paulo] [Remoto] Back-end Developer @Copastur - ## Nossa empresa
Trabalho em uma empresa de recrutamento humanizado que se chama HunterCo, possuímos várias vagas e consultores que acompanham os candidatos no recrutamento, auxiliando-o e dando feedback's.
Sobre a empresa contratante:
Nos diferenciamos ao cuidar das pessoas e de suas experiências, sejam clientes, fornecedores, colaboradores, parceiros ou sociedade.
Somos reconhecidos como um ótimo lugar para se trabalhar com a premiação GPTW (Great Place to Work), certificados pela ISO 9001 da qualidade e a única empresa do segmento que conta com o certificado ISO 14001 de sustentabilidade.
## Descrição da vaga
Realizar levantamento de requisitos com o cliente final (interno ou externo);
Desenvolver o código-fonte necessário para que os requisitos sejam atendidos na sua totalidade.
Criar testes unitários para garantir a integridade do código-fonte e facilitar a refatoração.
Atuar junto a equipe de TI para retirar os impedimentos.
Elaborar arquitetura dos sistemas.
Auxiliar os demais Analistas de Desenvolvimento.
Visão analítica das demandas
## Local
São Paulo ou remoto
## Requisitos
**Obrigatórios:**
Experiência em desenvolvimento de softwares.
Ensino superior
Boa comunicação
Experiência em desenvolvimento nas plataformas Microsoft: C#, Dot.net, ASP. net MVC, SQL server, angularJS, Web API, WCF.
**Desejáveis:**
Experiência com metodologias ágeis, metodologias MVP, angular 4.0, API Restfull, Ionic framework, NodeJS, React
Conhecer o mercado de turismo
Inglês
**Diferenciais:**
Possibilidade de Home Office
## Benefícios
VR - para todo mundo se alimentar bem!
VT - o valor necessário para a sua vinda ao trabalho;
Plano de saúde - Bradesco;,Plano odontológico - Bradesco;
Seguro de Vida - Zurich;
Pacote de Seguros com vantagens especiais; automóvel, residencial, pet, equipamentos portáteis;
GymPass - movimente-se!
Allya - descontos diversos para o seu salário render mais,
Apoio Pass - suportes diversos ao colaborador;
Mindfulness - começando o dia bem com meditação;
Frutas - todos os dias no escritório;
Ginástica laboral - nos alongamos com música e diversão;
Incentivo a vir de bike - bicicletário e chuveiros;
Licença Paternidade - para aquele que se torna papai, 30 dias em casa com seu(sua) filho(a).
## Contratação
combinar
## Como se candidatar
Cadastre-se no link abaixo:
https://vagas.hunterco.com.br/job/609c1980933a3f001d1f29bc?c=4019
## Tempo médio de feedbacks
Costumamos enviar feedbacks em até 05 dias após cada processo.
E-mail para contato em caso de não haver resposta: janaina.nessi@parceirohunterco.com.br
## Labels
#### Alocação
- Alocado
- Remoto
#### Regime
- CLT
- PJ
#### Nível
- Pleno
| test | back end developer copastur nossa empresa trabalho em uma empresa de recrutamento humanizado que se chama hunterco possuímos várias vagas e consultores que acompanham os candidatos no recrutamento auxiliando o e dando feedback s sobre a empresa contratante nos diferenciamos ao cuidar das pessoas e de suas experiências sejam clientes fornecedores colaboradores parceiros ou sociedade somos reconhecidos como um ótimo lugar para se trabalhar com a premiação gptw great place to work certificados pela iso da qualidade e a única empresa do segmento que conta com o certificado iso de sustentabilidade descrição da vaga realizar levantamento de requisitos com o cliente final interno ou externo desenvolver o código fonte necessário para que os requisitos sejam atendidos na sua totalidade criar testes unitários para garantir a integridade do código fonte e facilitar a refatoração atuar junto a equipe de ti para retirar os impedimentos elaborar arquitetura dos sistemas auxiliar os demais analistas de desenvolvimento visão analítica das demandas local são paulo ou remoto requisitos obrigatórios experiência em desenvolvimento de softwares ensino superior boa comunicação experiência em desenvolvimento nas plataformas microsoft c dot net asp net mvc sql server angularjs web api wcf desejáveis experiência com metodologias ágeis metodologias mvp angular api restfull ionic framework nodejs react conhecer o mercado de turismo inglês diferenciais possibilidade de home office benefícios vr para todo mundo se alimentar bem vt o valor necessário para a sua vinda ao trabalho plano de saúde bradesco plano odontológico bradesco seguro de vida zurich pacote de seguros com vantagens especiais automóvel residencial pet equipamentos portáteis gympass movimente se allya descontos diversos para o seu salário render mais apoio pass suportes diversos ao colaborador mindfulness começando o dia bem com meditação frutas todos os dias no escritório ginástica laboral nos alongamos com música e diversão incentivo a vir de bike bicicletário e chuveiros licença paternidade para aquele que se torna papai dias em casa com seu sua filho a contratação combinar como se candidatar cadastre se no link abaixo tempo médio de feedbacks costumamos enviar feedbacks em até dias após cada processo e mail para contato em caso de não haver resposta janaina nessi parceirohunterco com br labels alocação alocado remoto regime clt pj nível pleno | 1 |
60,585 | 14,562,630,489 | IssuesEvent | 2020-12-17 00:31:59 | Dima2021/lodash | https://api.github.com/repos/Dima2021/lodash | opened | WS-2019-0332 (Medium) detected in handlebars-4.0.11.tgz | security vulnerability | ## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.11.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz</a></p>
<p>Path to dependency file: lodash/package.json</p>
<p>Path to vulnerable library: lodash/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- istanbul-0.4.5.tgz (Root Library)
- :x: **handlebars-4.0.11.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Dima2021/lodash/commits/50af7e310d557bae73ac2badb6ac6ecda0cebcb0">50af7e310d557bae73ac2badb6ac6ecda0cebcb0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"4.0.11","isTransitiveDependency":true,"dependencyTree":"istanbul:0.4.5;handlebars:4.0.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 4.5.3"}],"vulnerabilityIdentifier":"WS-2019-0332","vulnerabilityDetails":"Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.","vulnerabilityUrl":"https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7","cvss2Severity":"medium","cvss2Score":"5.0","extraData":{}}</REMEDIATE> --> | True | WS-2019-0332 (Medium) detected in handlebars-4.0.11.tgz - ## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.11.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz</a></p>
<p>Path to dependency file: lodash/package.json</p>
<p>Path to vulnerable library: lodash/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- istanbul-0.4.5.tgz (Root Library)
- :x: **handlebars-4.0.11.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Dima2021/lodash/commits/50af7e310d557bae73ac2badb6ac6ecda0cebcb0">50af7e310d557bae73ac2badb6ac6ecda0cebcb0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"4.0.11","isTransitiveDependency":true,"dependencyTree":"istanbul:0.4.5;handlebars:4.0.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 4.5.3"}],"vulnerabilityIdentifier":"WS-2019-0332","vulnerabilityDetails":"Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.","vulnerabilityUrl":"https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7","cvss2Severity":"medium","cvss2Score":"5.0","extraData":{}}</REMEDIATE> --> | non_test | ws medium detected in handlebars tgz ws medium severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file lodash package json path to vulnerable library lodash node modules handlebars package json dependency hierarchy istanbul tgz root library x handlebars tgz vulnerable library found in head commit a href found in base branch master vulnerability details arbitrary code execution vulnerability found in handlebars before lookup helper fails to validate templates attack may submit templates that execute arbitrary javascript in the system it is due to an incomplete fix for a ws vulnerabilityurl | 0 |
241,945 | 20,174,964,709 | IssuesEvent | 2022-02-10 13:47:29 | ONSdigital/design-system | https://api.github.com/repos/ONSdigital/design-system | closed | Percy baseline not up to date | Testing | Percy visual regression tests are still flagging up old changes because the baseline is missing many updated screenshots.
Also, too many screenshots are being flagged up as 'changed' due to minor typography rendering differences. | 1.0 | Percy baseline not up to date - Percy visual regression tests are still flagging up old changes because the baseline is missing many updated screenshots.
Also, too many screenshots are being flagged up as 'changed' due to minor typography rendering differences. | test | percy baseline not up to date percy visual regression tests are still flagging up old changes because the baseline is missing many updated screenshots also too many screenshots are being flagged up as changed due to minor typography rendering differences | 1 |
332,121 | 29,185,746,359 | IssuesEvent | 2023-05-19 15:17:09 | unifyai/ivy | https://api.github.com/repos/unifyai/ivy | reopened | Fix gradients.test_adam_update | Sub Task Failing Test | | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="null" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_functional/test_core/test_gradients.py::test_adam_update[cpu-ivy.functional.backends.numpy-False-False]</summary>
2023-05-19T12:09:21.3128235Z E KeyError: 'cpu'2023-05-19T12:09:21.3128475Z E 2023-05-19T12:09:21.3129034Z E You can reproduce this example by temporarily adding @reproduce_failure('6.75.3', b'AA==') as a decorator on your test case
</details>
| 1.0 | Fix gradients.test_adam_update - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5023808200/jobs/9008821374" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="null" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_functional/test_core/test_gradients.py::test_adam_update[cpu-ivy.functional.backends.numpy-False-False]</summary>
2023-05-19T12:09:21.3128235Z E KeyError: 'cpu'2023-05-19T12:09:21.3128475Z E 2023-05-19T12:09:21.3129034Z E You can reproduce this example by temporarily adding @reproduce_failure('6.75.3', b'AA==') as a decorator on your test case
</details>
| test | fix gradients test adam update tensorflow img src torch img src numpy img src jax img src paddle img src failed ivy tests test ivy test functional test core test gradients py test adam update e keyerror cpu e e you can reproduce this example by temporarily adding reproduce failure b aa as a decorator on your test case | 1 |
289,852 | 25,017,961,831 | IssuesEvent | 2022-11-03 20:40:40 | sonic-net/sonic-mgmt | https://api.github.com/repos/sonic-net/sonic-mgmt | opened | Validate that device ignores PFC frame received with no bit set in the class enable vector | Test gap | <!--
If you are reporting a new issue, make sure that we do not have any duplicates
already open. You can ensure this by searching the issue list for this
repository. If there is a duplicate, please close your issue and add a comment
to the existing issue instead.
If you suspect your issue is a bug, please edit your issue description to
include the BUG REPORT INFORMATION shown below. If you fail to provide this
information within 7 days, we cannot debug your issue and will close it. We
will, however, reopen it if you later provide the information.
For more information about reporting issues, see
https://github.com/sonic-net/SONiC/wiki#report-issues
---------------------------------------------------
GENERAL SUPPORT INFORMATION
---------------------------------------------------
The GitHub issue tracker is for bug reports and feature requests.
General support can be found at the following locations:
- SONiC Support Forums - https://groups.google.com/forum/#!forum/sonicproject
---------------------------------------------------
BUG REPORT INFORMATION
---------------------------------------------------
Use the commands below to provide key information from your environment:
You do NOT have to include this information if this is a FEATURE REQUEST
-->
**Description**
We need to cover 2 cases here
1. pfc frame accounting on the DUT
2. DUT reaction to the pfc frames
Device should ignore PFC frame received with no bit set in the class enable vector field. It should not account for those and not react to pfc frames
With the following topology,
ixia port 1 --> dut --> ixia port2
1. Send lossless and lossy traffic from ixia port 1 to ixia port 2
3. Send pfc frames with class enable vector field set to 0 and max pause quanta specified in all time class fields from ixia port2 at a duration needed to fully block the egress queue
**Describe the results you expected:**
1. "show pfc counters" on the dut should not account those packets in RX pfc
2. With traffic running and pause frames being sent, "show queue counters" should show a continuous increase in traffic
| 1.0 | Validate that device ignores PFC frame received with no bit set in the class enable vector - <!--
If you are reporting a new issue, make sure that we do not have any duplicates
already open. You can ensure this by searching the issue list for this
repository. If there is a duplicate, please close your issue and add a comment
to the existing issue instead.
If you suspect your issue is a bug, please edit your issue description to
include the BUG REPORT INFORMATION shown below. If you fail to provide this
information within 7 days, we cannot debug your issue and will close it. We
will, however, reopen it if you later provide the information.
For more information about reporting issues, see
https://github.com/sonic-net/SONiC/wiki#report-issues
---------------------------------------------------
GENERAL SUPPORT INFORMATION
---------------------------------------------------
The GitHub issue tracker is for bug reports and feature requests.
General support can be found at the following locations:
- SONiC Support Forums - https://groups.google.com/forum/#!forum/sonicproject
---------------------------------------------------
BUG REPORT INFORMATION
---------------------------------------------------
Use the commands below to provide key information from your environment:
You do NOT have to include this information if this is a FEATURE REQUEST
-->
**Description**
We need to cover 2 cases here
1. pfc frame accounting on the DUT
2. DUT reaction to the pfc frames
Device should ignore PFC frame received with no bit set in the class enable vector field. It should not account for those and not react to pfc frames
With the following topology,
ixia port 1 --> dut --> ixia port2
1. Send lossless and lossy traffic from ixia port 1 to ixia port 2
3. Send pfc frames with class enable vector field set to 0 and max pause quanta specified in all time class fields from ixia port2 at a duration needed to fully block the egress queue
**Describe the results you expected:**
1. "show pfc counters" on the dut should not account those packets in RX pfc
2. With traffic running and pause frames being sent, "show queue counters" should show a continuous increase in traffic
| test | validate that device ignores pfc frame received with no bit set in the class enable vector if you are reporting a new issue make sure that we do not have any duplicates already open you can ensure this by searching the issue list for this repository if there is a duplicate please close your issue and add a comment to the existing issue instead if you suspect your issue is a bug please edit your issue description to include the bug report information shown below if you fail to provide this information within days we cannot debug your issue and will close it we will however reopen it if you later provide the information for more information about reporting issues see general support information the github issue tracker is for bug reports and feature requests general support can be found at the following locations sonic support forums bug report information use the commands below to provide key information from your environment you do not have to include this information if this is a feature request description we need to cover cases here pfc frame accounting on the dut dut reaction to the pfc frames device should ignore pfc frame received with no bit set in the class enable vector field it should not account for those and not react to pfc frames with the following topology ixia port dut ixia send lossless and lossy traffic from ixia port to ixia port send pfc frames with class enable vector field set to and max pause quanta specified in all time class fields from ixia at a duration needed to fully block the egress queue describe the results you expected show pfc counters on the dut should not account those packets in rx pfc with traffic running and pause frames being sent show queue counters should show a continuous increase in traffic | 1 |
13,391 | 8,894,936,989 | IssuesEvent | 2019-01-16 06:46:27 | istio/istio | https://api.github.com/repos/istio/istio | closed | Support custom attributes in Istio Authorization Policy | area/security/aaa | Working on a doc to summarize our discussions, will update once it's ready.
/cc @rshriram @liminw | True | Support custom attributes in Istio Authorization Policy - Working on a doc to summarize our discussions, will update once it's ready.
/cc @rshriram @liminw | non_test | support custom attributes in istio authorization policy working on a doc to summarize our discussions will update once it s ready cc rshriram liminw | 0 |
662,701 | 22,150,064,223 | IssuesEvent | 2022-06-03 15:52:59 | Cotalker/documentation | https://api.github.com/repos/Cotalker/documentation | closed | Bug report: When a new position with roles is created, the roles are not associated in the end | Bug report Bug high priority | ### Affected system
Cotalker Web Application
### Affected system (other)
_No response_
### Affected environment
Production
### Affected environment (other)
_No response_
### App version
Version 17.10.11 (252)
### Details
When a new position ("cargo") is created from the "cargos" collection with access roles, the roles are not retrieved when the position is assigned to a user; the field stays blank. Also, at the moment of saving the position, the position itself is created, but the access roles and the property are not added
![Uploading image.png…]()
### Steps to reproduce
1.- go to the "cargos" collection
2.- create a position with accessroles and attributes
3.- save
4.- check
### Expected result
it should be possible to have those roles assigned
### Additional data
_No response_ | 1.0 | Bug report: When a new position with roles is created, the roles are not associated in the end - ### Affected system
Cotalker Web Application
### Affected system (other)
_No response_
### Affected environment
Production
### Affected environment (other)
_No response_
### App version
Version 17.10.11 (252)
### Details
When a new position ("cargo") is created from the "cargos" collection with access roles, the roles are not retrieved when the position is assigned to a user; the field stays blank. Also, at the moment of saving the position, the position itself is created, but the access roles and the property are not added
![Uploading image.png…]()
### Steps to reproduce
1.- go to the "cargos" collection
2.- create a position with accessroles and attributes
3.- save
4.- check
### Expected result
it should be possible to have those roles assigned
### Additional data
_No response_ | non_test | bug report when a new position with roles is created the roles are not associated in the end affected system cotalker web application affected system other no response affected environment production affected environment other no response app version version details when a new position is created from the cargos collection with access roles the roles are not retrieved when the position is assigned to a user the field stays blank also at the moment of saving the position the position itself is created but the access roles and the property are not added steps to reproduce go to the cargos collection create a position with accessroles and attributes save check expected result it should be possible to have those roles assigned additional data no response | 0
112,118 | 9,554,204,652 | IssuesEvent | 2019-05-02 21:21:08 | saltstack/salt | https://api.github.com/repos/saltstack/salt | opened | unit.renderers.test_gpg.GPGTestCase.test_render_with_translate_newlines_should_translate_newlines test failure | Test Failure | 2019.2.1 failed [salt-windows-2019-py2](https://jenkinsci.saltstack.com/job/2019.2.1/job/salt-windows-2019-py2/14/testReport/junit/unit.renderers.test_gpg/GPGTestCase/test_render_with_translate_newlines_should_translate_newlines)
---
u'Use\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.' != 'Use\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.'
```
Traceback (most recent call last):
File "c:\users\admini~1\appdata\local\temp\kitchen\testing\tests\unit\renderers\test_gpg.py", line 169, in test_render_with_translate_newlines_should_translate_newlines
expected,
AssertionError: u'Use\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.' != 'Use\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.'
``` | 1.0 | unit.renderers.test_gpg.GPGTestCase.test_render_with_translate_newlines_should_translate_newlines test failure - 2019.2.1 failed [salt-windows-2019-py2](https://jenkinsci.saltstack.com/job/2019.2.1/job/salt-windows-2019-py2/14/testReport/junit/unit.renderers.test_gpg/GPGTestCase/test_render_with_translate_newlines_should_translate_newlines)
---
u'Use\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.' != 'Use\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.'
```
Traceback (most recent call last):
File "c:\users\admini~1\appdata\local\temp\kitchen\testing\tests\unit\renderers\test_gpg.py", line 169, in test_render_with_translate_newlines_should_translate_newlines
expected,
AssertionError: u'Use\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.\n\nUse\u2039 more\u2039 salt.' != 'Use\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.\n\nUse\x8b more\x8b salt.'
``` | test | unit renderers test gpg gpgtestcase test render with translate newlines should translate newlines test failure failed u use more salt n nuse more salt n nuse more salt use more salt n nuse more salt n nuse more salt traceback most recent call last file c users admini appdata local temp kitchen testing tests unit renderers test gpg py line in test render with translate newlines should translate newlines expected assertionerror u use more salt n nuse more salt n nuse more salt use more salt n nuse more salt n nuse more salt | 1 |
20,452 | 6,041,080,163 | IssuesEvent | 2017-06-10 20:32:30 | HopefulLlama/UnitTestSCAD | https://api.github.com/repos/HopefulLlama/UnitTestSCAD | closed | Fix "no-return-assign" issue in src/tester/Assertions.js | codeclimate technical | Return statement should not contain assignment.
https://codeclimate.com/github/HopefulLlama/UnitTestSCAD/src/tester/Assertions.js#issue_593b221e9711610001000034 | 1.0 | Fix "no-return-assign" issue in src/tester/Assertions.js - Return statement should not contain assignment.
https://codeclimate.com/github/HopefulLlama/UnitTestSCAD/src/tester/Assertions.js#issue_593b221e9711610001000034 | non_test | fix no return assign issue in src tester assertions js return statement should not contain assignment | 0 |
4,677 | 2,741,772,470 | IssuesEvent | 2015-04-21 13:22:03 | mysociety/theyworkforyou | https://api.github.com/repos/mysociety/theyworkforyou | opened | New homepage - handle no calendar data | Design | If we don't have any upcoming calendar data we need to have something to put in this box.
For Scotland/NI where there is no upcoming at the moment I've just pinched the existing 'What is the Scottish Parliament/NI Assembly?" text. For Westminster we probably need something else, possibly just better words that the current rather glib holding text. | 1.0 | New homepage - handle no calendar data - If we don't have any upcoming calendar data we need to have something to put in this box.
For Scotland/NI where there is no upcoming at the moment I've just pinched the existing 'What is the Scottish Parliament/NI Assembly?" text. For Westminster we probably need something else, possibly just better words that the current rather glib holding text. | non_test | new homepage handle no calendar data if we don t have any upcoming calendar data we need to have something to put in this box for scotland ni where there is no upcoming at the moment i ve just pinched the existing what is the scottish parliament ni assembly text for westminster we probably need something else possibly just better words that the current rather glib holding text | 0 |
141,122 | 11,394,888,707 | IssuesEvent | 2020-01-30 10:16:20 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | closed | [ci] SmokeTestWatcherTestSuiteIT.testMonitorClusterHealth | :Core/Features/Watcher >test-failure | Doesn't reproduce, looks like the test cluster had a problem but there's nothing too interesting in the logs. This test has failed a handful of times in the last three months but for different resaons. Not sure there's anything to do here
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+6.x+multijob-unix-compatibility/os=centos/1199/console
CI logs
[build-1199-watcher.txt](https://github.com/elastic/elasticsearch/files/2220970/build-1199-watcher.txt)
Integ cluster logs
[run.log](https://github.com/elastic/elasticsearch/files/2220987/run.log)
```
REPRODUCE WITH: ./gradlew :x-pack:qa:smoke-test-watcher:integTestRunner \
-Dtests.seed=D4D1EB8646558B01 \
-Dtests.class=org.elasticsearch.smoketest.SmokeTestWatcherTestSuiteIT \
-Dtests.method="testMonitorClusterHealth" \
-Dtests.security.manager=true \
-Dtests.locale=ar-LB \
-Dtests.timezone=America/Cuiaba
``` | 1.0 | [ci] SmokeTestWatcherTestSuiteIT.testMonitorClusterHealth - Doesn't reproduce, looks like the test cluster had a problem but there's nothing too interesting in the logs. This test has failed a handful of times in the last three months but for different resaons. Not sure there's anything to do here
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+6.x+multijob-unix-compatibility/os=centos/1199/console
CI logs
[build-1199-watcher.txt](https://github.com/elastic/elasticsearch/files/2220970/build-1199-watcher.txt)
Integ cluster logs
[run.log](https://github.com/elastic/elasticsearch/files/2220987/run.log)
```
REPRODUCE WITH: ./gradlew :x-pack:qa:smoke-test-watcher:integTestRunner \
-Dtests.seed=D4D1EB8646558B01 \
-Dtests.class=org.elasticsearch.smoketest.SmokeTestWatcherTestSuiteIT \
-Dtests.method="testMonitorClusterHealth" \
-Dtests.security.manager=true \
-Dtests.locale=ar-LB \
-Dtests.timezone=America/Cuiaba
``` | test | smoketestwatchertestsuiteit testmonitorclusterhealth doesn t reproduce looks like the test cluster had a problem but there s nothing too interesting in the logs this test has failed a handful of times in the last three months but for different resaons not sure there s anything to do here ci logs integ cluster logs reproduce with gradlew x pack qa smoke test watcher integtestrunner dtests seed dtests class org elasticsearch smoketest smoketestwatchertestsuiteit dtests method testmonitorclusterhealth dtests security manager true dtests locale ar lb dtests timezone america cuiaba | 1 |
555 | 2,502,400,169 | IssuesEvent | 2015-01-09 08:29:53 | fossology/fossology | https://api.github.com/repos/fossology/fossology | opened | Support license List download | Component: Rank Component: Tester Priority: Normal Status: New Tracker: Feature | ---
Author Name: **Johannes Najjar**
Original Redmine Issue: 7485, http://www.fossology.org/issues/7485
Original Date: 2014/08/21
---
This feature has been suggested by Larry in #7467
license download for monk
on ui, there are 2 menues, | License List | License List Download |, right now, just support for nomos.
| 1.0 | Support license List download - ---
Author Name: **Johannes Najjar**
Original Redmine Issue: 7485, http://www.fossology.org/issues/7485
Original Date: 2014/08/21
---
This feature has been suggested by Larry in #7467
license download for monk
on ui, there are 2 menues, | License List | License List Download |, right now, just support for nomos.
| test | support license list download author name johannes najjar original redmine issue original date this feature has been suggested by larry in license download for monk on ui there are menues license list license list download right now just support for nomos | 1 |
350,507 | 31,897,646,280 | IssuesEvent | 2023-09-18 04:24:49 | oduck-team/oduck-server | https://api.github.com/repos/oduck-team/oduck-server | closed | 로컬 회원가입 및 관리자 로그인 | docs feature refactor test | ## 📝 개요
애니메이션 등록을 포함한 관리자 기능을 위한 로컬 회원가입 및 관리자 로그인 기능 추가
## ✅ TODO
- [x] 로컬 회원가입
- [x] 관리자 로그인
- [x] 암호 해시화(bcrypt)
- [x] 관리자 로그인
## 💬 기타
| 1.0 | 로컬 회원가입 및 관리자 로그인 - ## 📝 개요
애니메이션 등록을 포함한 관리자 기능을 위한 로컬 회원가입 및 관리자 로그인 기능 추가
## ✅ TODO
- [x] 로컬 회원가입
- [x] 관리자 로그인
- [x] 암호 해시화(bcrypt)
- [x] 관리자 로그인
## 💬 기타
| test | 로컬 회원가입 및 관리자 로그인 📝 개요 애니메이션 등록을 포함한 관리자 기능을 위한 로컬 회원가입 및 관리자 로그인 기능 추가 ✅ todo 로컬 회원가입 관리자 로그인 암호 해시화 bcrypt 관리자 로그인 💬 기타 | 1 |
330,738 | 28,485,143,008 | IssuesEvent | 2023-04-18 07:19:19 | pymc-devs/pytensor | https://api.github.com/repos/pymc-devs/pytensor | closed | Run mypy job with Python 3.11 once it is supported | maintenance tests | ### Description
We should revert c3bfe6665e9d2420dfb2256c51e64ddc300f960a once mypy is sorted out (and if it's not mypy fault, then whatever we are doing wrong).
Maybe related to https://github.com/python/mypy/issues/12840, maybe we need to open a new issue over there? | 1.0 | Run mypy job with Python 3.11 once it is supported - ### Description
We should revert c3bfe6665e9d2420dfb2256c51e64ddc300f960a once mypy is sorted out (and if it's not mypy fault, then whatever we are doing wrong).
Maybe related to https://github.com/python/mypy/issues/12840, maybe we need to open a new issue over there? | test | run mypy job with python once it is supported description we should revert once mypy is sorted out and if it s not mypy fault then whatever we are doing wrong maybe related to maybe we need to open a new issue over there | 1 |
61,410 | 6,735,437,391 | IssuesEvent | 2017-10-18 21:47:40 | rook/rook | https://api.github.com/repos/rook/rook | closed | FileSystem and objectStore are being created successfully only on first cluster in a multiple rook cluster environment | test | When deploying multiple rook cluster (different cluster name and namespace), FileSystem and objectStore are being successfully created in the first cluster that was deployed, and failing in other clusters.
**Repro Steps**
1. Install Rook Operator
2. Install 2 Rook clusters - with different cluster names and in different name spaces.
3. Create FileSystem and ObjectStore for cluster 1
4. Create FileSystem and ObjectStore for cluster 2
**Expected Result**
FileSystem and object Store is created successfully in both clusters.
**Actual Result**
FileSystem and ObjectStore not created for cluster 2, errors in rook-api service.
_Additional Data_
Snippit from Api service pod logs from failed cluster
```
exec: Running command: ceph osd pool application enable test-fs-2-metadata cephfs --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/883951700
2017-09-07 21:12:27.739241 I | exec: enabled application 'cephfs' on pool 'test-fs-2-metadata'
2017-09-07 21:12:27.739412 I | cephclient: creating pool test-fs-2-metadata succeeded, buf:
2017-09-07 21:12:27.739562 I | exec: Running command: ceph fs new test-fs-2 test-fs-2-metadata test-fs-2-data --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/080018851
2017-09-07 21:12:28.335082 I | exec: new fs with metadata pool 3 and data pool 2
2017-09-07 21:12:28.335403 I | cephmds: created file system test-fs-2 on data pool test-fs-2-data and metadata pool test-fs-2-metadata
2017-09-07 21:12:28.335436 I | api-k8s: Starting the MDS
2017-09-07 21:12:28.335442 I | op-mds: start running mds
2017-09-07 21:12:28.348522 E | api: failed to start mds: failed to create mds keyring. failed to get mds secrets. User "system:serviceaccount:rook-cluster-2:rook-api" cannot get secrets in the namespace "rook-cluster-2". (get secrets rook-ceph-mds)
2017-09-07 21:12:28.348570 I | api: 69bd4d66-8a03-4660-a342-5302af11d9bb POST /filesystem CreateFileSystem 500 Internal Server Error 5.066915744s
.
.
2017-09-07 21:13:10.567447 I | exec: Running command: ceph osd pool set default-c2.rgw.buckets.data size 1 --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/822476748
2017-09-07 21:13:11.602351 I | exec: set pool 9 size to 1
2017-09-07 21:13:11.602503 I | exec: Running command: ceph osd pool application enable default-c2.rgw.buckets.data default-c2.rgw.buckets.data --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/246867643
2017-09-07 21:13:12.614549 I | exec: enabled application 'default-c2.rgw.buckets.data' on pool 'default-c2.rgw.buckets.data'
2017-09-07 21:13:12.614659 I | cephclient: creating pool default-c2.rgw.buckets.data succeeded, buf:
2017-09-07 21:13:12.616939 E | api: failed to create object store: failed to start rgw. failed to create rgw keyring. failed to get rgw secrets. User "system:serviceaccount:rook-cluster-2:rook-api" cannot get secrets in the namespace "rook-cluster-2". (get secrets rook-ceph-rgw-default-c2)
2017-09-07 21:13:12.616964 I | api: bf634e12-6eea-4876-beb0-be67eff511fc POST /objectstore CreateObjectStore 500 Internal Server Error 18.740481423s
```
please look at the attached file for complete logs
[rook-api.txt](https://github.com/rook/rook/files/1286167/rook-api.txt)
[All logs.zip](https://github.com/rook/rook/files/1286170/All.logs.zip)
| 1.0 | FileSystem and objectStore are being created successfully only on first cluster in a multiple rook cluster environment - When deploying multiple rook cluster (different cluster name and namespace), FileSystem and objectStore are being successfully created in the first cluster that was deployed, and failing in other clusters.
**Repro Steps**
1. Install Rook Operator
2. Install 2 Rook clusters - with different cluster names and in different name spaces.
3. Create FileSystem and ObjectStore for cluster 1
4. Create FileSystem and ObjectStore for cluster 2
**Expected Result**
FileSystem and object Store is created successfully in both clusters.
**Actual Result**
FileSystem and ObjectStore not created for cluster 2, errors in rook-api service.
_Additional Data_
Snippit from Api service pod logs from failed cluster
```
exec: Running command: ceph osd pool application enable test-fs-2-metadata cephfs --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/883951700
2017-09-07 21:12:27.739241 I | exec: enabled application 'cephfs' on pool 'test-fs-2-metadata'
2017-09-07 21:12:27.739412 I | cephclient: creating pool test-fs-2-metadata succeeded, buf:
2017-09-07 21:12:27.739562 I | exec: Running command: ceph fs new test-fs-2 test-fs-2-metadata test-fs-2-data --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/080018851
2017-09-07 21:12:28.335082 I | exec: new fs with metadata pool 3 and data pool 2
2017-09-07 21:12:28.335403 I | cephmds: created file system test-fs-2 on data pool test-fs-2-data and metadata pool test-fs-2-metadata
2017-09-07 21:12:28.335436 I | api-k8s: Starting the MDS
2017-09-07 21:12:28.335442 I | op-mds: start running mds
2017-09-07 21:12:28.348522 E | api: failed to start mds: failed to create mds keyring. failed to get mds secrets. User "system:serviceaccount:rook-cluster-2:rook-api" cannot get secrets in the namespace "rook-cluster-2". (get secrets rook-ceph-mds)
2017-09-07 21:12:28.348570 I | api: 69bd4d66-8a03-4660-a342-5302af11d9bb POST /filesystem CreateFileSystem 500 Internal Server Error 5.066915744s
.
.
2017-09-07 21:13:10.567447 I | exec: Running command: ceph osd pool set default-c2.rgw.buckets.data size 1 --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/822476748
2017-09-07 21:13:11.602351 I | exec: set pool 9 size to 1
2017-09-07 21:13:11.602503 I | exec: Running command: ceph osd pool application enable default-c2.rgw.buckets.data default-c2.rgw.buckets.data --cluster=rook-cluster-2 --conf=/var/lib/rook/rook-cluster-2/rook-cluster-2.config --keyring=/var/lib/rook/rook-cluster-2/client.admin.keyring --format json --out-file /tmp/246867643
2017-09-07 21:13:12.614549 I | exec: enabled application 'default-c2.rgw.buckets.data' on pool 'default-c2.rgw.buckets.data'
2017-09-07 21:13:12.614659 I | cephclient: creating pool default-c2.rgw.buckets.data succeeded, buf:
2017-09-07 21:13:12.616939 E | api: failed to create object store: failed to start rgw. failed to create rgw keyring. failed to get rgw secrets. User "system:serviceaccount:rook-cluster-2:rook-api" cannot get secrets in the namespace "rook-cluster-2". (get secrets rook-ceph-rgw-default-c2)
2017-09-07 21:13:12.616964 I | api: bf634e12-6eea-4876-beb0-be67eff511fc POST /objectstore CreateObjectStore 500 Internal Server Error 18.740481423s
```
please look at the attached file for complete logs
[rook-api.txt](https://github.com/rook/rook/files/1286167/rook-api.txt)
[All logs.zip](https://github.com/rook/rook/files/1286170/All.logs.zip)
| test | filesystem and objectstore are being created successfully only on first cluster in a multiple rook cluster environment when deploying multiple rook cluster different cluster name and namespace filesystem and objectstore are being successfully created in the first cluster that was deployed and failing in other clusters repro steps install rook operator install rook clusters with different cluster names and in different name spaces create filesystem and objectstore for cluster create filesystem and objectstore for cluster expected result filesystem and object store is created successfully in both clusters actual result filesystem and objectstore not created for cluster errors in rook api service additional data snippit from api service pod logs from failed cluster exec running command ceph osd pool application enable test fs metadata cephfs cluster rook cluster conf var lib rook rook cluster rook cluster config keyring var lib rook rook cluster client admin keyring format json out file tmp i exec enabled application cephfs on pool test fs metadata i cephclient creating pool test fs metadata succeeded buf i exec running command ceph fs new test fs test fs metadata test fs data cluster rook cluster conf var lib rook rook cluster rook cluster config keyring var lib rook rook cluster client admin keyring format json out file tmp i exec new fs with metadata pool and data pool i cephmds created file system test fs on data pool test fs data and metadata pool test fs metadata i api starting the mds i op mds start running mds e api failed to start mds failed to create mds keyring failed to get mds secrets user system serviceaccount rook cluster rook api cannot get secrets in the namespace rook cluster get secrets rook ceph mds i api post filesystem createfilesystem internal server error i exec running command ceph osd pool set default rgw buckets data size cluster rook cluster conf var lib rook rook cluster rook cluster config keyring var lib rook rook cluster client admin keyring format json out file tmp i exec set pool size to i exec running command ceph osd pool application enable default rgw buckets data default rgw buckets data cluster rook cluster conf var lib rook rook cluster rook cluster config keyring var lib rook rook cluster client admin keyring format json out file tmp i exec enabled application default rgw buckets data on pool default rgw buckets data i cephclient creating pool default rgw buckets data succeeded buf e api failed to create object store failed to start rgw failed to create rgw keyring failed to get rgw secrets user system serviceaccount rook cluster rook api cannot get secrets in the namespace rook cluster get secrets rook ceph rgw default i api post objectstore createobjectstore internal server error please look at the attached file for complete logs | 1
219,415 | 16,829,751,445 | IssuesEvent | 2021-06-18 01:35:07 | Quansight/qhub | https://api.github.com/repos/Quansight/qhub | closed | Adding documentation on premptible and gpu node groups on GCP | Stale documentation | Recently added support for accelerators(gpus) and premptible instances needs documentation | 1.0 | Adding documentation on premptible and gpu node groups on GCP - Recently added support for accelerators(gpus) and premptible instances needs documentation | non_test | adding documentation on premptible and gpu node groups on gcp recently added support for accelerators gpus and premptible instances needs documentation | 0 |
66,073 | 6,988,130,522 | IssuesEvent | 2017-12-14 11:44:06 | GTNewHorizons/NewHorizons | https://api.github.com/repos/GTNewHorizons/NewHorizons | closed | Dark Steel Tools and Weapons quest is bugged | duplicate FixedInDev need to be tested | #### Which modpack version are you using?
2.0.1.3
#
#### If in multiplayer; On which server does this happen?
Epsilon
#
#### What did you try to do, and what did you expect to happen?
I crafted a fresh set of dark steel tools for the quest, and 4 vibrant crystals
The first 2 quest stages completed successfully.
Then I empowered the tools and charged them, but the quest does not accept them for the final stage.
| 1.0 | Dark Steel Tools and Weapons quest is bugged - #### Which modpack version are you using?
2.0.1.3
#
#### If in multiplayer; On which server does this happen?
Epsilon
#
#### What did you try to do, and what did you expect to happen?
I crafted a fresh set of dark steel tools for the quest, and 4 vibrant crystals
The first 2 quest stages completed successfully.
Then I empowered the tools and charged them, but the quest does not accept them for the final stage.
| test | dark steel tools and weapons quest is bugged which modpack version are you using if in multiplayer on which server does this happen epsilon what did you try to do and what did you expect to happen i crafted a fresh set of dark steel tools for the quest and vibrant crystals the first quest stages completed successfully then i empowered the tools and charged them but the quest does not accept them for the final stage | 1 |
232,948 | 17,836,144,133 | IssuesEvent | 2021-09-03 01:32:50 | Skylar-Tech/node-red-contrib-matrix-chat | https://api.github.com/repos/Skylar-Tech/node-red-contrib-matrix-chat | closed | Double check all node docs to make sure they are correct | documentation | I did my best on the documentation for nodes but I have a feeling there are some mistakes or missing information. This should be verified.
Check off each one as it is verified:
- [x] matrix-server-config
- [x] matrix-receive
- [x] matrix-send-message
- [x] matrix-send-file
- [x] matrix-send-image
- [x] matrix-react
- [x] matrix-create-room
- [x] matrix-invite-room
- [x] matrix-join-room
- [x] matrix-decrypt-file
- [x] matrix-room-kick
- [x] matrix-room-ban
- [x] matrix-synapse-users
- [x] matrix-synapse-register
- [x] matrix-synapse-create-edit-user
- [x] matrix-synapse-deactivate-user
- [x] matrix-synapse-join-room
- [x] matrix-whois-user
- [x] matrix-room-users | 1.0 | Double check all node docs to make sure they are correct - I did my best on the documentation for nodes but I have a feeling there are some mistakes or missing information. This should be verified.
Check off each one as it is verified:
- [x] matrix-server-config
- [x] matrix-receive
- [x] matrix-send-message
- [x] matrix-send-file
- [x] matrix-send-image
- [x] matrix-react
- [x] matrix-create-room
- [x] matrix-invite-room
- [x] matrix-join-room
- [x] matrix-decrypt-file
- [x] matrix-room-kick
- [x] matrix-room-ban
- [x] matrix-synapse-users
- [x] matrix-synapse-register
- [x] matrix-synapse-create-edit-user
- [x] matrix-synapse-deactivate-user
- [x] matrix-synapse-join-room
- [x] matrix-whois-user
- [x] matrix-room-users | non_test | double check all node docs to make sure they are correct i did my best on the documentation for nodes but i have a feeling there are some mistakes or missing information this should be verified check off each one as it is verified matrix server config matrix receive matrix send message matrix send file matrix send image matrix react matrix create room matrix invite room matrix join room matrix decrypt file matrix room kick matrix room ban matrix synapse users matrix synapse register matrix synapse create edit user matrix synapse deactivate user matrix synapse join room matrix whois user matrix room users | 0 |
300,993 | 26,008,932,941 | IssuesEvent | 2022-12-20 22:32:55 | LIBCAS/DL4DH-Kramerius-plus | https://api.github.com/repos/LIBCAS/DL4DH-Kramerius-plus | closed | The modsMetadata property contains no values | Kramerius+ JSON TEST T::ToTests | ## Procedure
Export of the publication with `uuid:0c94cf70-188a-11e4-8f64-005056827e52` in JSON format without any parameters set.
See `api/export/download/47db86e6-75b2-4164-a0df-48dfbcc447ff`.
## Result
The `modsMetadata` property is equal to `null`.
## Expected behavior
The `modsMetadata` property should contain metadata about the document taken from a _MODS_-type document, see the [sample document](https://drive.google.com/file/d/13VxvSuC4su_yzjzmqAT44H-Ka4whfSi7/view?pli=1). | 2.0 | The modsMetadata property contains no values - ## Procedure
Export of the publication with `uuid:0c94cf70-188a-11e4-8f64-005056827e52` in JSON format without any parameters set.
See `api/export/download/47db86e6-75b2-4164-a0df-48dfbcc447ff`.
## Result
The `modsMetadata` property is equal to `null`.
## Expected behavior
The `modsMetadata` property should contain metadata about the document taken from a _MODS_-type document, see the [sample document](https://drive.google.com/file/d/13VxvSuC4su_yzjzmqAT44H-Ka4whfSi7/view?pli=1). | test | the modsmetadata property contains no values procedure export of the publication with uuid in json format without any parameters set see api export download result the modsmetadata property is equal to null expected behavior the modsmetadata property should contain metadata about the document taken from a mods type document see | 1
216,150 | 16,745,358,503 | IssuesEvent | 2021-06-11 14:52:59 | microsoft/vscode | https://api.github.com/repos/microsoft/vscode | closed | NotebookKernel: set outputs on cancel | *duplicate notebook unit-test-failure | ```
1) NotebookKernel
set outputs on cancel:
TypeError: call.getFileName is not a function
at getErrMessage (assert.js:279:25)
at innerOk (assert.js:370:17)
at Function.ok (assert.js:390:3)
at Context.<anonymous> (file:///Users/runner/work/vscode/vscode/out/vs/workbench/test/browser/api/extHostNotebookKernel.test.js:202:20)
```
https://github.com/microsoft/vscode/runs/2801405433?check_suite_focus=true#step:10:8470 | 1.0 | NotebookKernel: set outputs on cancel - ```
1) NotebookKernel
set outputs on cancel:
TypeError: call.getFileName is not a function
at getErrMessage (assert.js:279:25)
at innerOk (assert.js:370:17)
at Function.ok (assert.js:390:3)
at Context.<anonymous> (file:///Users/runner/work/vscode/vscode/out/vs/workbench/test/browser/api/extHostNotebookKernel.test.js:202:20)
```
https://github.com/microsoft/vscode/runs/2801405433?check_suite_focus=true#step:10:8470 | test | notebookkernel set outputs on cancel notebookkernel set outputs on cancel typeerror call getfilename is not a function at geterrmessage assert js at innerok assert js at function ok assert js at context file users runner work vscode vscode out vs workbench test browser api exthostnotebookkernel test js | 1 |
237,394 | 19,621,536,359 | IssuesEvent | 2022-01-07 07:27:32 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | roachtest: sqlsmith/setup=empty/setting=no-ddl failed | C-test-failure O-robot O-roachtest branch-master release-blocker | roachtest.sqlsmith/setup=empty/setting=no-ddl [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4061806&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4061806&tab=artifacts#/sqlsmith/setup=empty/setting=no-ddl) on master @ [d191f64e5600b773043da18d28e288bf6ae86b72](https://github.com/cockroachdb/cockroach/commits/d191f64e5600b773043da18d28e288bf6ae86b72):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/sqlsmith/setup=empty/setting=no-ddl/run_1
sqlsmith.go:278,sqlsmith.go:298,test_runner.go:780: ping node 1: dial tcp 34.138.211.10:26257: connect: connection refused
previous sql:
SELECT
(-1.3166383504867554):::FLOAT8 AS col_198719
FROM
(
VALUES
(15875:::INT8, '-Inf':::FLOAT8, 0.992249921181157:::FLOAT8, e'\x0f\\':::STRING, 4059338996:::OID),
(
(-1):::INT8,
(-2.0671825408935547):::FLOAT8,
1.401298464324817e-45:::FLOAT8,
e'y\x16\x03}a\x07w':::STRING,
333157043:::OID
),
(
(-21289):::INT8,
(-1.5068405866622925):::FLOAT8,
NULL,
sha384('\x2f04a5':::BYTES::BYTES)::STRING,
3693203294:::OID
),
(
7335:::INT8,
NULL,
(0.42906490993539437:::FLOAT8::FLOAT8 + (-1.891056623397997):::FLOAT8::FLOAT8)::FLOAT8,
e'0\x05Z M\x051ku':::STRING,
595378716:::OID
),
(NULL, 0.0:::FLOAT8, (-1.0838586669877346):::FLOAT8, e'\x03``td>V':::STRING, 4252665745:::OID)
)
AS tab_115001 (col_198712, col_198713, col_198714, col_198715, col_198716),
(
VALUES
(757214978.7415306577:::DECIMAL, 5e-324:::FLOAT8),
(615535.5586741538899:::DECIMAL, (-0.32377636591513137):::FLOAT8),
(
(-96224327760.3915802):::DECIMAL,
st_frechetdistance('0103000080010000000C000000CE9E76EF92F1EFC1401F707BAD8CA3C1F3A8A8A808D8F6C1205B307A130CFBC12D23CFC3CB4FF2C1E2A867F6DD37E6C140E31059B8C4BFC150C320B25706C4C1C24AE2734514F541AEA2B9D05B9EF041ACB2B5EB74F8EAC16514560D7D1900C2A03A0937568CF74174F6D85EFAB4ECC108D4C4FE2D8DD0C15EDF25C0A5F0F74174F3D4FA322DEAC1348CAEAB433AE5419805BDBA4BE8DA4168D06BFA93E2EE41669A6FD69A63E4C12C4E7C5C8B0EE14128A44A7AACBDF841CE0319CA98B9F141A001BFB53776C54122EC0A62E971FA41133E9E63C12CF2C16F474D0C7F16F4C1DCD65455505F0042C0D1CE7E8720A7413C121359BECBFDC1E00529E5B0BED341C0A5E2D7039CAD41CE9E76EF92F1EFC1401F707BAD8CA3C1F3A8A8A808D8F6C1':::GEOMETRY::GEOMETRY, '01040000C00300000001010000C0D7D4F76CC962FFC19429E93B6EEFFA41FC9ED8437F21E141C05A3D4ADB17D3C101010000C0C05A3EFE3F09BBC1A4D5EA605777F1C1908A4DB8D222EBC16072AE0F17DCDDC101010000C0CA7B3DE00376F641C7A7834A6760F3C1F00E7ED767FAEB412D053FEBE438FCC1':::GEOMETRY::GEOMETRY, 0.00012022606943190572:::FLOAT8::FLOAT8)::FLOAT8
),
((-178.8466842875636670):::DECIMAL, 0.44453952838657035:::FLOAT8)
)
AS tab_115002 (col_198717, col_198718)
LIMIT
51:::INT8;
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=empty/setting=no-ddl.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 2.0 | roachtest: sqlsmith/setup=empty/setting=no-ddl failed - roachtest.sqlsmith/setup=empty/setting=no-ddl [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4061806&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4061806&tab=artifacts#/sqlsmith/setup=empty/setting=no-ddl) on master @ [d191f64e5600b773043da18d28e288bf6ae86b72](https://github.com/cockroachdb/cockroach/commits/d191f64e5600b773043da18d28e288bf6ae86b72):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/sqlsmith/setup=empty/setting=no-ddl/run_1
sqlsmith.go:278,sqlsmith.go:298,test_runner.go:780: ping node 1: dial tcp 34.138.211.10:26257: connect: connection refused
previous sql:
SELECT
(-1.3166383504867554):::FLOAT8 AS col_198719
FROM
(
VALUES
(15875:::INT8, '-Inf':::FLOAT8, 0.992249921181157:::FLOAT8, e'\x0f\\':::STRING, 4059338996:::OID),
(
(-1):::INT8,
(-2.0671825408935547):::FLOAT8,
1.401298464324817e-45:::FLOAT8,
e'y\x16\x03}a\x07w':::STRING,
333157043:::OID
),
(
(-21289):::INT8,
(-1.5068405866622925):::FLOAT8,
NULL,
sha384('\x2f04a5':::BYTES::BYTES)::STRING,
3693203294:::OID
),
(
7335:::INT8,
NULL,
(0.42906490993539437:::FLOAT8::FLOAT8 + (-1.891056623397997):::FLOAT8::FLOAT8)::FLOAT8,
e'0\x05Z M\x051ku':::STRING,
595378716:::OID
),
(NULL, 0.0:::FLOAT8, (-1.0838586669877346):::FLOAT8, e'\x03``td>V':::STRING, 4252665745:::OID)
)
AS tab_115001 (col_198712, col_198713, col_198714, col_198715, col_198716),
(
VALUES
(757214978.7415306577:::DECIMAL, 5e-324:::FLOAT8),
(615535.5586741538899:::DECIMAL, (-0.32377636591513137):::FLOAT8),
(
(-96224327760.3915802):::DECIMAL,
st_frechetdistance('0103000080010000000C000000CE9E76EF92F1EFC1401F707BAD8CA3C1F3A8A8A808D8F6C1205B307A130CFBC12D23CFC3CB4FF2C1E2A867F6DD37E6C140E31059B8C4BFC150C320B25706C4C1C24AE2734514F541AEA2B9D05B9EF041ACB2B5EB74F8EAC16514560D7D1900C2A03A0937568CF74174F6D85EFAB4ECC108D4C4FE2D8DD0C15EDF25C0A5F0F74174F3D4FA322DEAC1348CAEAB433AE5419805BDBA4BE8DA4168D06BFA93E2EE41669A6FD69A63E4C12C4E7C5C8B0EE14128A44A7AACBDF841CE0319CA98B9F141A001BFB53776C54122EC0A62E971FA41133E9E63C12CF2C16F474D0C7F16F4C1DCD65455505F0042C0D1CE7E8720A7413C121359BECBFDC1E00529E5B0BED341C0A5E2D7039CAD41CE9E76EF92F1EFC1401F707BAD8CA3C1F3A8A8A808D8F6C1':::GEOMETRY::GEOMETRY, '01040000C00300000001010000C0D7D4F76CC962FFC19429E93B6EEFFA41FC9ED8437F21E141C05A3D4ADB17D3C101010000C0C05A3EFE3F09BBC1A4D5EA605777F1C1908A4DB8D222EBC16072AE0F17DCDDC101010000C0CA7B3DE00376F641C7A7834A6760F3C1F00E7ED767FAEB412D053FEBE438FCC1':::GEOMETRY::GEOMETRY, 0.00012022606943190572:::FLOAT8::FLOAT8)::FLOAT8
),
((-178.8466842875636670):::DECIMAL, 0.44453952838657035:::FLOAT8)
)
AS tab_115002 (col_198717, col_198718)
LIMIT
51:::INT8;
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=empty/setting=no-ddl.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | roachtest sqlsmith setup empty setting no ddl failed roachtest sqlsmith setup empty setting no ddl with on master the test failed on branch master cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts sqlsmith setup empty setting no ddl run sqlsmith go sqlsmith go test runner go ping node dial tcp connect connection refused previous sql select as col from values inf e string oid e y a string oid null bytes bytes string oid null e m string oid null e td v string oid as tab col col col col col values decimal decimal decimal st frechetdistance geometry geometry geometry geometry decimal as tab col col limit help see see cc cockroachdb sql queries | 1 |
130,851 | 5,134,401,035 | IssuesEvent | 2017-01-11 08:52:36 | TwidereProject/Twidere-Android | https://api.github.com/repos/TwidereProject/Twidere-Android | closed | Open external links within the app | category:functionality priority:medium status:done type:enhancement | Add please ability to open external links within the app.
It's a killer feature. :)
| 1.0 | Open external links within the app - Add please ability to open external links within the app.
It's a killer feature. :)
| non_test | open external links within the app add please ability to open external links within the app it s a killer feature | 0 |
36,545 | 7,978,960,487 | IssuesEvent | 2018-07-17 20:04:36 | aleofreddi/svgpan | https://api.github.com/repos/aleofreddi/svgpan | closed | Antegrate with Angular | Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1. <span ng-include="'svg/tiger.svg'"></span>
2. insert id <svg id="svgCanvas"
3. Change var root = document.getElementById('svgCanvas');
4. Link script at end of SVG </svg><script xlink:href="svgpan.js" />
What is the expected output? What do you see instead?
Uncaught ReferenceError: handleMouseMove is not defined
Would like the script to be bound to the svg root and keep working
independently from Angular if possible. Otherwise, what is the best strtegy to
port it to Angular.
What version of the product are you using? On what operating system?
AngularJS v1.2.9
```
Original issue reported on code.google.com by `claude.b...@gmail.com` on 29 Jan 2014 at 4:54
| 1.0 | Antegrate with Angular - ```
What steps will reproduce the problem?
1. <span ng-include="'svg/tiger.svg'"></span>
2. insert id <svg id="svgCanvas"
3. Change var root = document.getElementById('svgCanvas');
4. Link script at end of SVG </svg><script xlink:href="svgpan.js" />
What is the expected output? What do you see instead?
Uncaught ReferenceError: handleMouseMove is not defined
Would like the script to be bound to the svg root and keep working
independently from Angular if possible. Otherwise, what is the best strtegy to
port it to Angular.
What version of the product are you using? On what operating system?
AngularJS v1.2.9
```
Original issue reported on code.google.com by `claude.b...@gmail.com` on 29 Jan 2014 at 4:54
| non_test | antegrate with angular what steps will reproduce the problem insert id svg id svgcanvas change var root document getelementbyid svgcanvas link script at end of svg what is the expected output what do you see instead uncaught referenceerror handlemousemove is not defined would like the script to be bound to the svg root and keep working independently from angular if possible otherwise what is the best strtegy to port it to angular what version of the product are you using on what operating system angularjs original issue reported on code google com by claude b gmail com on jan at | 0 |
311,376 | 26,786,365,824 | IssuesEvent | 2023-02-01 03:18:30 | aodn/nrmn-application | https://api.github.com/repos/aodn/nrmn-application | closed | Ingested Quadrat data not complying with quadrat summing to 50 rule | ready to test systest | From https://github.com/aodn/nrmn-application/issues/1085#issuecomment-1302768037
For an undetermined period, quadrat data that do not comply with the "quadrat do not sum to 50" rule have been successfully ingested in the production database. The validation rule is correctly applied in the current version of the software.
- [x] do and inventory of impacted survey
- [ ] discuss with the facility best way to correct impacted data | 2.0 | Ingested Quadrat data not complying with quadrat summing to 50 rule - From https://github.com/aodn/nrmn-application/issues/1085#issuecomment-1302768037
For an undetermined period, quadrat data that do not comply with the "quadrat do not sum to 50" rule have been successfully ingested in the production database. The validation rule is correctly applied in the current version of the software.
- [x] do and inventory of impacted survey
- [ ] discuss with the facility best way to correct impacted data | test | ingested quadrat data not complying with quadrat summing to rule from for an undetermined period quadrat data that do not comply with the quadrat do not sum to rule have been successfully ingested in the production database the validation rule is correctly applied in the current version of the software do and inventory of impacted survey discuss with the facility best way to correct impacted data | 1 |
186,016 | 6,732,760,419 | IssuesEvent | 2017-10-18 12:45:37 | salesagility/SuiteCRM | https://api.github.com/repos/salesagility/SuiteCRM | closed | SuiteP quickcreate forms not visible in mobile | bug Fix Proposed High Priority Resolved: Next Release | Quickcreate form is not visible in SuiteP theme and mobile browser.
#### Issue
Mobile browsers do not display the quickcreate form when button "Create" pressed in subpanel.
#### Expected Behavior
Quickcreate form should appear
#### Actual Behavior
Nothing happens when button "create" pressed
#### Possible Fix
Probably javascript issue.
#### Steps to Reproduce
1. With mobile browser (chrome on Android) go to Accounts and detail view
2. Scroll down to Opportunities subpanel
3. Press CREATE -> no quickcreate form is displayed
#### Context
#### Your Environment
* SuiteCRM Version used: 7.9.3
* Browser name and version : Chrome on mobile or developer extension on desktop chrome (on iPad browser it works ok.)
| 1.0 | SuiteP quickcreate forms not visible in mobile - Quickcreate form is not visible in SuiteP theme and mobile browser.
#### Issue
Mobile browsers do not display the quickcreate form when button "Create" pressed in subpanel.
#### Expected Behavior
Quickcreate form should appear
#### Actual Behavior
Nothing happens when button "create" pressed
#### Possible Fix
Probably javascript issue.
#### Steps to Reproduce
1. With mobile browser (chrome on Android) go to Accounts and detail view
2. Scroll down to Opportunities subpanel
3. Press CREATE -> no quickcreate form is displayed
#### Context
#### Your Environment
* SuiteCRM Version used: 7.9.3
* Browser name and version : Chrome on mobile or developer extension on desktop chrome (on iPad browser it works ok.)
| non_test | suitep quickcreate forms not visible in mobile quickcreate form is not visible in suitep theme and mobile browser issue mobile browsers do not display the quickcreate form when button create pressed in subpanel expected behavior quickcreate form should appear actual behavior nothing happens when button create pressed possible fix probably javascript issue steps to reproduce with mobile browser chrome on android go to accounts and detail view scroll down to opportunities subpanel press create no quickcreate form is displayed context your environment suitecrm version used browser name and version chrome on mobile or developer extension on desktop chrome on ipad browser it works ok | 0 |