| Unnamed: 0 (int64) | id (float64) | type (string, 1 class) | created_at (string, len 19) | repo (string) | repo_url (string) | action (string, 3 classes) | title (string) | labels (string) | body (string) | index (string, 13 classes) | text_combine (string) | label (string, 2 classes) | text (string) | binary_label (int64, 0/1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
6,105 | 3,749,836,257 | IssuesEvent | 2016-03-11 02:10:07 | pydata/pandas | https://api.github.com/repos/pydata/pandas | closed | CI/BLD: update to conda-build >1.19.0 [win only] | Build CI Compat | waiting on breakage in: https://github.com/conda/conda-build/issues/820
to fix the conda-recipe
can just use ``conda-build=1.19.0`` ATM.
| 1.0 | CI/BLD: update to conda-build >1.19.0 [win only] - waiting on breakage in: https://github.com/conda/conda-build/issues/820
to fix the conda-recipe
can just use ``conda-build=1.19.0`` ATM.
| build | ci bld update to conda build waiting on breakage in to fix the conda recipe can just use conda build atm | 1 |
98,530 | 29,954,657,436 | IssuesEvent | 2023-06-23 06:22:11 | microsoft/appcenter | https://api.github.com/repos/microsoft/appcenter | closed | [Feature] Request React Native New Architecture | feature request build | **Describe the solution you'd like**
In the current AppCenter service, it is impossible to build React Native apps with the New Architecture enabled.
On Android, if I enable the new architecture, the build takes more than 60 minutes and is automatically stopped, because 60 minutes is the maximum time in the paid tier (30 minutes in the free tier).
On iOS, it is impossible to run the `RCT_NEW_ARCH_ENABLED=1 bundle exec pod install` command because the `pod-install` step is called automatically by the AppCenter pipeline and cannot be modified.
**Describe alternatives you've considered**
- Increase the maximum build time.
- Allow to modify the `pod-install` step. | 1.0 | [Feature] Request React Native New Architecture - **Describe the solution you'd like**
In the current AppCenter service, it is impossible to build React Native apps with the New Architecture enabled.
On Android, if I enable the new architecture, the build takes more than 60 minutes and is automatically stopped, because 60 minutes is the maximum time in the paid tier (30 minutes in the free tier).
On iOS, it is impossible to run the `RCT_NEW_ARCH_ENABLED=1 bundle exec pod install` command because the `pod-install` step is called automatically by the AppCenter pipeline and cannot be modified.
**Describe alternatives you've considered**
- Increase the maximum build time.
- Allow to modify the `pod-install` step. | build | request react native new architecture describe the solution you d like in the current appcenter service it s impossible to build react native apps with the new architecture enabled in terms of android if i enable the new arch it takes more than minutes to make the build and automatically it s stopped because min is the maximum time in the paid tier mins in the case of the free tier in ios is impossible to run the rct new arch enabled bundle exec pod install command because the pod install step is called automatically by the appcenter pipeline and it s not possible to modify it describe alternatives you ve considered increase the maximum build time allow to modify the pod install step | 1 |
54,961 | 13,491,853,810 | IssuesEvent | 2020-09-11 17:08:15 | googleapis/python-game-servers | https://api.github.com/repos/googleapis/python-game-servers | closed | samples.snippets.realm_and_cluster_test: test_get_cluster failed | :rotating_light: api: gameservices buildcop: flaky buildcop: issue priority: p1 samples type: bug | This test failed!
To configure my behavior, see [the Build Cop Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/buildcop).
If I'm commenting on this issue too often, add the `buildcop: quiet` label and
I will stop commenting.
---
commit: 893dc5a1aa1bfeb3c26baa45c80a9c9fa0102a6c
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/0f62ccdf-ce74-48f0-8063-9b41db384774), [Sponge](http://sponge2/0f62ccdf-ce74-48f0-8063-9b41db384774)
status: failed
<details><summary>Test output</summary><br><pre>args = (parent: "projects/python-docs-samples-tests/locations/us-central1"
realm_id: "realm-5d04520a-2b83-48ba-ab2d-8e5c1bc6f6a6"
realm {
time_zone: "US/Pacific"
description: "My Realm"
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.31.0 gax/1.22.1 gapic/0.3.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fb75a186b90>
request = parent: "projects/python-docs-samples-tests/locations/us-central1"
realm_id: "realm-5d04520a-2b83-48ba-ab2d-8e5c1bc6f6a6"
realm {
time_zone: "US/Pacific"
description: "My Realm"
}
timeout = None
metadata = [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.31.0 gax/1.22.1 gapic/0.3.0')]
credentials = None, wait_for_ready = None, compression = None
def __call__(self,
request,
timeout=None,
metadata=None,
credentials=None,
wait_for_ready=None,
compression=None):
state, call, = self._blocking(request, timeout, metadata, credentials,
wait_for_ready, compression)
> return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
state = <grpc._channel._RPCState object at 0x7fb75a112910>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fb75a11a190>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAVAILABLE
E details = "502:Bad Gateway"
E debug_error_string = "{"created":"@1598523115.617003515","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"502:Bad Gateway","grpc_status":14}"
E >
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:729: _InactiveRpcError
The above exception was the direct cause of the following exception:
@pytest.fixture(scope="function")
def test_realm_with_cluster():
realm_id = "realm-{}".format(uuid.uuid4())
print(f"Creating realm {realm_id} in location {REALM_LOCATION} in project {PROJECT_ID}")
> create_realm.create_realm(PROJECT_ID, REALM_LOCATION, realm_id)
realm_and_cluster_test.py:64:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
create_realm.py:44: in create_realm
operation = client.create_realm(request)
../../google/cloud/gaming_v1/services/realms_service/client.py:485: in create_realm
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "502:Bad Gateway"
debug_e...95.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"502:Bad Gateway","grpc_status":14}"
>
> ???
E google.api_core.exceptions.ServiceUnavailable: 503 502:Bad Gateway
<string>:3: ServiceUnavailable</pre></details> | 2.0 | samples.snippets.realm_and_cluster_test: test_get_cluster failed - This test failed!
To configure my behavior, see [the Build Cop Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/buildcop).
If I'm commenting on this issue too often, add the `buildcop: quiet` label and
I will stop commenting.
---
commit: 893dc5a1aa1bfeb3c26baa45c80a9c9fa0102a6c
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/0f62ccdf-ce74-48f0-8063-9b41db384774), [Sponge](http://sponge2/0f62ccdf-ce74-48f0-8063-9b41db384774)
status: failed
<details><summary>Test output</summary><br><pre>args = (parent: "projects/python-docs-samples-tests/locations/us-central1"
realm_id: "realm-5d04520a-2b83-48ba-ab2d-8e5c1bc6f6a6"
realm {
time_zone: "US/Pacific"
description: "My Realm"
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.31.0 gax/1.22.1 gapic/0.3.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fb75a186b90>
request = parent: "projects/python-docs-samples-tests/locations/us-central1"
realm_id: "realm-5d04520a-2b83-48ba-ab2d-8e5c1bc6f6a6"
realm {
time_zone: "US/Pacific"
description: "My Realm"
}
timeout = None
metadata = [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.31.0 gax/1.22.1 gapic/0.3.0')]
credentials = None, wait_for_ready = None, compression = None
def __call__(self,
request,
timeout=None,
metadata=None,
credentials=None,
wait_for_ready=None,
compression=None):
state, call, = self._blocking(request, timeout, metadata, credentials,
wait_for_ready, compression)
> return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
state = <grpc._channel._RPCState object at 0x7fb75a112910>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fb75a11a190>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAVAILABLE
E details = "502:Bad Gateway"
E debug_error_string = "{"created":"@1598523115.617003515","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"502:Bad Gateway","grpc_status":14}"
E >
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:729: _InactiveRpcError
The above exception was the direct cause of the following exception:
@pytest.fixture(scope="function")
def test_realm_with_cluster():
realm_id = "realm-{}".format(uuid.uuid4())
print(f"Creating realm {realm_id} in location {REALM_LOCATION} in project {PROJECT_ID}")
> create_realm.create_realm(PROJECT_ID, REALM_LOCATION, realm_id)
realm_and_cluster_test.py:64:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
create_realm.py:44: in create_realm
operation = client.create_realm(request)
../../google/cloud/gaming_v1/services/realms_service/client.py:485: in create_realm
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "502:Bad Gateway"
debug_e...95.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"502:Bad Gateway","grpc_status":14}"
>
> ???
E google.api_core.exceptions.ServiceUnavailable: 503 502:Bad Gateway
<string>:3: ServiceUnavailable</pre></details> | build | samples snippets realm and cluster test test get cluster failed this test failed to configure my behavior see if i m commenting on this issue too often add the buildcop quiet label and i will stop commenting commit buildurl status failed test output args parent projects python docs samples tests locations us realm id realm realm time zone us pacific description my realm kwargs metadata six wraps callable def error remapped callable args kwargs try return callable args kwargs nox py lib site packages google api core grpc helpers py self request parent projects python docs samples tests locations us realm id realm realm time zone us pacific description my realm timeout none metadata credentials none wait for ready none compression none def call self request timeout none metadata none credentials none wait for ready none compression none state call self blocking request timeout metadata credentials wait for ready compression return end unary response blocking state call false none nox py lib site packages grpc channel py state call with call false deadline none def end unary response blocking state call with call deadline if state code is grpc statuscode ok if with call rendezvous multithreadedrendezvous state call none deadline return state response rendezvous else return state response else raise inactiverpcerror state e grpc channel inactiverpcerror inactiverpcerror of rpc that terminated with e status statuscode unavailable e details bad gateway e debug error string created description error received from peer file src core lib surface call cc file line grpc message bad gateway grpc status e nox py lib site packages grpc channel py inactiverpcerror the above exception was the direct cause of the following exception pytest fixture scope function def test realm with cluster realm id realm format uuid print f creating realm realm id in location realm location in project project id create realm create realm 
project id realm location realm id realm and cluster test py create realm py in create realm operation client create realm request google cloud gaming services realms service client py in create realm response rpc request retry retry timeout timeout metadata metadata nox py lib site packages google api core gapic method py in call return wrapped func args kwargs nox py lib site packages google api core grpc helpers py in error remapped callable six raise from exceptions from grpc error exc exc value none from value inactiverpcerror of rpc that terminated with status statuscode unavailable details bad gateway debug e file src core lib surface call cc file line grpc message bad gateway grpc status e google api core exceptions serviceunavailable bad gateway serviceunavailable | 1 |
796 | 2,545,227,310 | IssuesEvent | 2015-01-29 15:57:18 | slick/slick | https://api.github.com/repos/slick/slick | closed | review / improve code samples on website | improvement topic:documentation | *[Migrated from Assembla ticket [233](https://www.assembla.com/spaces/typesafe-slick/tickets/233) - reported by @cvogt on 2013-04-28 23:51:08]*
maybe the examples on the frontpage of the website could be improved. E.g. the "^ comparing Rep[Int] to Rep[Int]!" and "// ^ comparing Int to Int!" could be more confusing than helpful. Also it would be nice if the examples were compiled somehow to check they are correct. | 1.0 | review / improve code samples on website - *[Migrated from Assembla ticket [233](https://www.assembla.com/spaces/typesafe-slick/tickets/233) - reported by @cvogt on 2013-04-28 23:51:08]*
maybe the examples on the frontpage of the website could be improved. E.g. the "^ comparing Rep[Int] to Rep[Int]!" and "// ^ comparing Int to Int!" could be more confusing than helpful. Also it would be nice if the examples were compiled somehow to check they are correct. | non_build | review improve code samples on website reported by cvogt on maybe the examples on the frontpage of the website could be improved e g the comparing rep to rep and comparing int to int could be more confusing than helpful also it be nice if the examples were compiled somehow to check they are correct | 0 |
460,307 | 13,208,008,142 | IssuesEvent | 2020-08-15 01:50:20 | kubernetes-sigs/kind | https://api.github.com/repos/kubernetes-sigs/kind | closed | Add CLI command to list images installed on Nodes | kind/feature lifecycle/rotten priority/awaiting-more-evidence | <!-- Please only use this template for submitting enhancement requests -->
**What would you like to be added**:
A new CLI subcommand that would list images present in a given kind cluster.
I have no particular opinion on the exact semantics, but a few ideas:
- `kind images`, most like `docker images`
- `kind get images`, fits with the other `kind get` commands
- `kind images list`, possibly combined with moving other image-related commands (such as `kind load`) under a new `images` subcommand.
I'd imagine that this command would accept the `--name` flag to specify which cluster is being referred to, and a `--nodes` flag to list images from (likely with some de-duplication, or column in a tabular result that shows which nodes an image is on).
**Why is this needed**:
- Aid in debugging, understanding why Pods are failing to pull images by allowing a user to inspect the image store in the nodes.
- Verify that a `kind load` command completed successfully. | 1.0 | Add CLI command to list images installed on Nodes - <!-- Please only use this template for submitting enhancement requests -->
**What would you like to be added**:
A new CLI subcommand that would list images present in a given kind cluster.
I have no particular opinion on the exact semantics, but a few ideas:
- `kind images`, most like `docker images`
- `kind get images`, fits with the other `kind get` commands
- `kind images list`, possibly combined with moving other image-related commands (such as `kind load`) under a new `images` subcommand.
I'd imagine that this command would accept the `--name` flag to specify which cluster is being referred to, and a `--nodes` flag to list images from (likely with some de-duplication, or column in a tabular result that shows which nodes an image is on).
**Why is this needed**:
- Aid in debugging, understanding why Pods are failing to pull images by allowing a user to inspect the image store in the nodes.
- Verify that a `kind load` command completed successfully. | non_build | add cli command to list images installed on nodes what would you like to be added a new cli subcommand that would list images present in a given kind cluster i have no particular opinion on the exact semantics but a few ideas kind images most like docker images kind get images fits with the other kind get commands kind images list possibly combined with moving other image related commands such as kind load under a new images subcommand i d imagine that this command would accept the name flag to specify which cluster is being referred to and a nodes flag to list images from likely with some de duplication or column in a tabular result that shows which nodes an image is on why is this needed aid in debugging understanding why pods are failing to pull images by allowing a user to inspect the image store in the nodes verify that a kind load command completed successfully | 0 |
58,424 | 8,263,155,495 | IssuesEvent | 2018-09-14 00:36:41 | fga-eps-mds/2018.2-ComexStat | https://api.github.com/repos/fga-eps-mds/2018.2-ComexStat | closed | Elaborar a EAP do ComexStat | documentation product engineering | ## Descrição
Elaborar a estrutura analítica do projeto de acordo com o escopo definido pelo cliente.
## Tipo
- Documentação
| 1.0 | Elaborar a EAP do ComexStat - ## Descrição
Elaborar a estrutura analítica do projeto de acordo com o escopo definido pelo cliente.
## Tipo
- Documentação
| non_build | elaborar a eap do comexstat descrição elaborar a estrutura analítica do projeto de acordo com o escopo definido pelo cliente tipo documentação | 0 |
628,032 | 19,959,860,148 | IssuesEvent | 2022-01-28 06:50:39 | apcountryman/picolibrary-microchip-megaavr | https://api.github.com/repos/apcountryman/picolibrary-microchip-megaavr | closed | Remove peripheral specific instance types | priority-normal status-awaiting_review type-refactoring | Remove peripheral specific instance types (e.g. `USART_Instance`). | 1.0 | Remove peripheral specific instance types - Remove peripheral specific instance types (e.g. `USART_Instance`). | non_build | remove peripheral specific instance types remove peripheral specific instance types e g usart instance | 0 |
63,059 | 15,419,372,685 | IssuesEvent | 2021-03-05 10:04:14 | kubevirt/kubevirt | https://api.github.com/repos/kubevirt/kubevirt | closed | [flaky ci] [Serial]Infrastructure [rfe_id:4102][crit:medium][vendor:cnv-qe@redhat.com][level:component]certificates [test_id:4100] should be valid during the whole rotation process | kind/bug triage/build-watcher | <!-- This form is for bug reports and feature requests ONLY!
Also make sure that you visit our User Guide at https://kubevirt.io/user-guide/
-->
**Is this a BUG REPORT or FEATURE REQUEST?**:
> Uncomment only one, leave it on its own line:
>
/kind bug
> /kind enhancement
/triage build-watcher
**What happened**:
Above test `[test_id:4100] should be valid during the whole rotation process` was flaky on #4501 before the PR got merged.
Symptom/Test error:
```
[Serial]Infrastructure [rfe_id:4102][crit:medium][vendor:cnv-qe@redhat.com][level:component]
certificates [test_id:4100] should be valid during the whole rotation process
tests/infra_test.go:178
Unexpected error:
<*kubecli.AsyncSubresourceError \| 0xc0042d63c0>: {
err: "Can't connect to websocket (500): websocket: bad handshake\n",
StatusCode: 500,
}
Can't connect to websocket (500): websocket: bad handshake
occurred
tests/infra_test.go:192
```
Flake on k8s-1.17: https://prow.apps.ovirt.org/view/gs/kubevirt-prow/pr-logs/pull/kubevirt_kubevirt/4501/pull-kubevirt-e2e-k8s-1.17/1325742799764590592
Flake on k8s-1.19: https://prow.apps.ovirt.org/view/gs/kubevirt-prow/pr-logs/pull/kubevirt_kubevirt/4501/pull-kubevirt-e2e-k8s-1.19/1325824207493271552
PR history: https://prow.apps.ovirt.org/pr-history/?org=kubevirt&repo=kubevirt&pr=4501
Possibly related to https://github.com/kubevirt/kubevirt/issues/4198
**What you expected to happen**:
**How to reproduce it (as minimally and precisely as possible)**:
**Anything else we need to know?**:
**Environment**:
- KubeVirt version (use `virtctl version`):
- Kubernetes version (use `kubectl version`):
- VM or VMI specifications:
- Cloud provider or hardware configuration:
- OS (e.g. from /etc/os-release):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
| 1.0 | [flaky ci] [Serial]Infrastructure [rfe_id:4102][crit:medium][vendor:cnv-qe@redhat.com][level:component]certificates [test_id:4100] should be valid during the whole rotation process - <!-- This form is for bug reports and feature requests ONLY!
Also make sure that you visit our User Guide at https://kubevirt.io/user-guide/
-->
**Is this a BUG REPORT or FEATURE REQUEST?**:
> Uncomment only one, leave it on its own line:
>
/kind bug
> /kind enhancement
/triage build-watcher
**What happened**:
Above test `[test_id:4100] should be valid during the whole rotation process` was flaky on #4501 before the PR got merged.
Symptom/Test error:
```
[Serial]Infrastructure [rfe_id:4102][crit:medium][vendor:cnv-qe@redhat.com][level:component]
certificates [test_id:4100] should be valid during the whole rotation process
tests/infra_test.go:178
Unexpected error:
<*kubecli.AsyncSubresourceError \| 0xc0042d63c0>: {
err: "Can't connect to websocket (500): websocket: bad handshake\n",
StatusCode: 500,
}
Can't connect to websocket (500): websocket: bad handshake
occurred
tests/infra_test.go:192
```
Flake on k8s-1.17: https://prow.apps.ovirt.org/view/gs/kubevirt-prow/pr-logs/pull/kubevirt_kubevirt/4501/pull-kubevirt-e2e-k8s-1.17/1325742799764590592
Flake on k8s-1.19: https://prow.apps.ovirt.org/view/gs/kubevirt-prow/pr-logs/pull/kubevirt_kubevirt/4501/pull-kubevirt-e2e-k8s-1.19/1325824207493271552
PR history: https://prow.apps.ovirt.org/pr-history/?org=kubevirt&repo=kubevirt&pr=4501
Possibly related to https://github.com/kubevirt/kubevirt/issues/4198
**What you expected to happen**:
**How to reproduce it (as minimally and precisely as possible)**:
**Anything else we need to know?**:
**Environment**:
- KubeVirt version (use `virtctl version`):
- Kubernetes version (use `kubectl version`):
- VM or VMI specifications:
- Cloud provider or hardware configuration:
- OS (e.g. from /etc/os-release):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
| build | infrastructure certificates should be valid during the whole rotation process this form is for bug reports and feature requests only also make sure that you visit our user guide at is this a bug report or feature request uncomment only one leave it on its own line kind bug kind enhancement triage build watcher what happened above test should be valid during the whole rotation process was flaky on before the pr got merged symptom test error infrastructure certificates should be valid during the whole rotation process tests infra test go unexpected error err can t connect to websocket websocket bad handshake n statuscode can t connect to websocket websocket bad handshake occurred tests infra test go flake on flake on pr history possibly related to what you expected to happen how to reproduce it as minimally and precisely as possible anything else we need to know environment kubevirt version use virtctl version kubernetes version use kubectl version vm or vmi specifications cloud provider or hardware configuration os e g from etc os release kernel e g uname a install tools others | 1 |
2,820 | 3,016,311,436 | IssuesEvent | 2015-07-30 01:17:56 | javaslang/javaslang | https://api.github.com/repos/javaslang/javaslang | closed | Make travis-ci build faster | build | With evolving Euler tests, the travis-ci build takes longer and longer...
**Task:** Analyse possibilities to make the build faster. | 1.0 | Make travis-ci build faster - With evolving Euler tests, the travis-ci build takes longer and longer...
**Task:** Analyse possibilities to make the build faster. | build | make travis ci build faster with evolving euler tests the travis ci build takes longer and longer task analyse possibilities to make the build faster | 1 |
528,698 | 15,372,886,371 | IssuesEvent | 2021-03-02 11:51:28 | ooni/probe | https://api.github.com/repos/ooni/probe | closed | Unable to upload result that was not uploaded | bug effort/M ooni/probe-mobile priority/high user feedback | Reported by a user in Twitter:



## Specifications
- Version: 2.7.1
- Platform: Android
| 1.0 | Unable to upload result that was not uploaded - Reported by a user in Twitter:



## Specifications
- Version: 2.7.1
- Platform: Android
| non_build | unable to upload result that was not uploaded reported by a user in twitter specifications version platform android | 0 |
19,773 | 10,520,353,031 | IssuesEvent | 2019-09-30 00:41:20 | phetsims/circuit-construction-kit-common | https://api.github.com/repos/phetsims/circuit-construction-kit-common | closed | Flickering item when rapidly grabbing items from the carousel | type:bug type:performance | @samreid @ariel-phet, on somewhat slower devices (I say somewhat because this is still seen on good iPads, but hard to see on laptops), you can see a flickering effect when rapidly grabbing/releasing an item out of the carousel. I usually see this more when grabbing the last of an item.
This is easy to do with one-ofs, like the dollar bill. While rapidly grabbing then releasing the item, if you look towards the center of the screen, you can see your item flicker:

@samreid I took some frames from that video that maybe reveal something:
Frame 0 (before release)
<img width="580" alt="screen shot 2017-10-24 at 1 22 28 pm" src="https://user-images.githubusercontent.com/5863899/31963728-ef079338-b8be-11e7-843c-996c692a3a9d.png">
Frame 1 (after release). Note how the object was being disposed, and there are floating vertices over the carousel. Does the image belong to those vertices?
<img width="580" alt="screen shot 2017-10-24 at 1 22 42 pm" src="https://user-images.githubusercontent.com/5863899/31963729-ef1c7de8-b8be-11e7-9145-4b655d987172.png">
Frame 2 (after release). Looks like I grabbed another one very quickly. So grab -> release -> grab within three frames?
<img width="579" alt="screen shot 2017-10-24 at 1 22 55 pm" src="https://user-images.githubusercontent.com/5863899/31963730-ef314430-b8be-11e7-841d-d79c1872d57d.png">
Seen on iPad Air 2 iOS 11.0.3 and ***barely*** seen on macOS 10.12.6 Chrome. For phetsims/QA/issues/55.
URL: https://www.colorado.edu/physics/phet/dev/html/circuit-construction-kit-dc/1.0.0-rc.1/circuit-construction-kit-dc_en.html
Version: 1.0.0-rc.1 2017-10-17 18:05:58 UTC
Flags: pixelRatioScaling
User Agent: Mozilla/5.0 (iPad; CPU OS 9_1 like Mac OS X) AppleWebKit/601.1.46 (KHTML, like Gecko) Version/9.0 Mobile/13B143 Safari/601.1
Language: en-US
Window: 1920x1014
Pixel Ratio: 2/1
WebGL: WebGL 1.0 (OpenGL ES 2.0 Chromium)
GLSL: WebGL GLSL ES 1.0 (OpenGL ES GLSL ES 1.0 Chromium)
Vendor: WebKit (WebKit WebGL)
Vertex: attribs: 16 varying: 32 uniform: 1024
Texture: size: 16384 imageUnits: 16 (vertex: 16, combined: 80)
Max viewport: 16384x16384
OES_texture_float: true
Dependencies JSON: {"assert":{"sha":"928741cf","branch":"HEAD"},"axon":{"sha":"8a8ff7d2","branch":"HEAD"},"babel":{"sha":"0b5a4452","branch":"master"},"brand":{"sha":"cfca902d","branch":"HEAD"},"chipper":{"sha":"b8a3c09a","branch":"HEAD"},"circuit-construction-kit-common":{"sha":"11736e99","branch":"HEAD"},"circuit-construction-kit-dc":{"sha":"5cbdc391","branch":"HEAD"},"dot":{"sha":"84bc5146","branch":"HEAD"},"joist":{"sha":"92b186e2","branch":"HEAD"},"kite":{"sha":"067109a8","branch":"HEAD"},"phet-core":{"sha":"054b2d6d","branch":"HEAD"},"phetcommon":{"sha":"b13138d2","branch":"HEAD"},"query-string-machine":{"sha":"c74e454e","branch":"HEAD"},"scenery":{"sha":"c9c3052a","branch":"HEAD"},"scenery-phet":{"sha":"124c1c25","branch":"HEAD"},"sherpa":{"sha":"0d590744","branch":"HEAD"},"sun":{"sha":"082fee37","branch":"HEAD"},"tandem":{"sha":"98e9917c","branch":"HEAD"},"twixt":{"sha":"a2eaa10a","branch":"HEAD"}} | True | Flickering item when rapidly grabbing items from the carousel | non_build | 0 |
11,930 | 18,536,477,865 | IssuesEvent | 2021-10-21 12:05:57 | renovatebot/renovate | https://api.github.com/repos/renovatebot/renovate | opened | Cloning of repositories fails with ___ since 27.0 | type:bug status:requirements priority-5-triage | ### How are you running Renovate?
Self-hosted
### Please select which platform you are using if self-hosting.
Bitbucket Server
### If you're self-hosting Renovate, tell us what version of Renovate you run.
28.6.0, based on renovate/renovate:slim
### Describe the bug
We're using the [`renovate/renovate` Docker image](https://github.com/renovatebot/docker-renovate) and are seeing problems since `27.0.0`. The issue is that the `git clone` for each repository Renovate runs on throws the error included below.
**Tested the following versions**
* `26.21.7` has no problems
* `27.0.0`, `27.1.0` and `28.6.0` throw the error included below for each repository that Renovate runs on
**Things I've tried already**
* The [release notes of `27.0.0`](https://github.com/renovatebot/renovate/releases/tag/27.0.0) mention that at least git `2.33.0` is needed and running `git --version` within the container returns `2.33.0`.
* The password used in the `git clone` command is an alphanumerical string
* Running the command `git clone --filter=blob:none https://<username>:<password>@<repository-url>.git .` from within the container works without issues.
**Other things I noticed**
* What's interesting is that the revision identifier `8eb2636b29df5b60f15872f4124488749873b116` is the same for each project.
### Relevant debug logs
<details><summary>Logs</summary>
```
build 21-Oct-2021 11:49:03 DEBUG: git clone error (repository=ddnei/expert-insight-wrapper)
build 21-Oct-2021 11:49:03 "err": {
build 21-Oct-2021 11:49:03 "task": {
build 21-Oct-2021 11:49:03 "commands": [
build 21-Oct-2021 11:49:03 "clone",
build 21-Oct-2021 11:49:03 "--filter=blob:none",
build 21-Oct-2021 11:49:03 "https://<username>:<password>@<repository-url>.git",
build 21-Oct-2021 11:49:03 "."
build 21-Oct-2021 11:49:03 ],
build 21-Oct-2021 11:49:03 "format": "utf-8"
build 21-Oct-2021 11:49:03 },
build 21-Oct-2021 11:49:03 "message": "Cloning into '.'...\nfatal: bad revision '8eb2636b29df5b60f15872f4124488749873b116'\nerror: <repository-url>.git did not send all necessary objects\n\nfatal: bad revision '8eb2636b29df5b60f15872f4124488749873b116'\nerror: <repository-url>.git did not send all necessary objects\n\nfatal: bad revision '8eb2636b29df5b60f15872f4124488749873b116'\nerror: <repository-url>.git did not send all necessary objects\n\nerror: unable to read sha1 file of <file> (8eb2636b29df5b60f15872f4124488749873b116)\nfatal: bad revision '65015f13a45ac21064c4fd448bf4970bdab691f7'\nerror: <repository-url>.git did not send all necessary objects\n
...
fatal: unable to checkout working tree\nwarning: Clone succeeded, but checkout failed.\nYou can inspect what was checked out with 'git status'\nand retry with 'git restore --source=HEAD :/'\n\n\n at Object.action (/usr/src/app/node_modules/simple-git/src/lib/plugins/error-detection.plugin.ts:38:28)\n at PluginStore.exec (/usr/src/app/node_modules/simple-git/src/lib/plugins/plugin-store.ts:24:29)\n at /usr/src/app/node_modules/simple-git/src/lib/runners/git-executor-chain.ts:114:40\n at new Promise (<anonymous>)\n at GitExecutorChain.handleTaskData (/usr/src/app/node_modules/simple-git/src/lib/runners/git-executor-chain.ts:111:14)\n at GitExecutorChain.<anonymous> (/usr/src/app/node_modules/simple-git/src/lib/runners/git-executor-chain.ts:88:40)\n at Generator.next (<anonymous>)\n at fulfilled (/usr/src/app/node_modules/simple-git/src/lib/runners/git-executor-chain.js:5:58)\n at processTicksAndRejections (internal/process/task_queues.js:95:5)"
build 21-Oct-2021 11:49:03 }
```
</details>
### Have you created a minimal reproduction repository?
No reproduction repository | 1.0 | Cloning of repositories fails with ___ since 27.0 | non_build | 0 |
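The Renovate report above repeats `fatal: bad revision '<sha>'` for several objects and notes that the same revision identifier recurs across projects. As a side illustration (not part of the original report), a log like that one can be scanned for the distinct offending revisions with a short Python sketch — the function name and sample log are assumptions for demonstration:

```python
import re

# Matches the hash inside lines like:
#   fatal: bad revision '8eb2636b29df5b60f15872f4124488749873b116'
BAD_REVISION = re.compile(r"bad revision '([0-9a-f]{40})'")

def bad_revisions(log: str) -> list[str]:
    """Return the distinct revision hashes reported as bad, in first-seen order."""
    seen: list[str] = []
    for sha in BAD_REVISION.findall(log):
        if sha not in seen:
            seen.append(sha)
    return seen

# Sample lines taken from the error message quoted in the issue.
log = (
    "fatal: bad revision '8eb2636b29df5b60f15872f4124488749873b116'\n"
    "error: <repository-url>.git did not send all necessary objects\n"
    "fatal: bad revision '8eb2636b29df5b60f15872f4124488749873b116'\n"
    "fatal: bad revision '65015f13a45ac21064c4fd448bf4970bdab691f7'\n"
)
print(bad_revisions(log))
```

Running this over the quoted log yields only two distinct hashes, consistent with the reporter's observation that one revision repeats across the failures.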
284,810 | 30,913,694,080 | IssuesEvent | 2023-08-05 02:38:16 | Nivaskumark/kernel_v4.19.72_old | https://api.github.com/repos/Nivaskumark/kernel_v4.19.72_old | reopened | CVE-2022-25375 (Medium) detected in linux-yoctov5.4.51 | Mend: dependency security vulnerability | ## CVE-2022-25375 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Nivaskumark/kernel_v4.19.72/commit/ce49083a1c14be2d13cb5e878257d293e6c748bc">ce49083a1c14be2d13cb5e878257d293e6c748bc</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/gadget/function/rndis.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/gadget/function/rndis.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in drivers/usb/gadget/function/rndis.c in the Linux kernel before 5.16.10. The RNDIS USB gadget lacks validation of the size of the RNDIS_MSG_SET command. Attackers can obtain sensitive information from kernel memory.
<p>Publish Date: 2022-02-20
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25375>CVE-2022-25375</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25375">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25375</a></p>
<p>Release Date: 2022-02-20</p>
<p>Fix Resolution: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25375</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-25375 (Medium) detected in linux-yoctov5.4.51 | non_build | 0 |
16,962 | 6,311,400,130 | IssuesEvent | 2017-07-23 19:09:57 | cucumber/cucumber-jvm | https://api.github.com/repos/cucumber/cucumber-jvm | closed | [Build] Trusty with oraclejdk7 is not supported | Build | ## Summary
After forking this repo and enabling my travis repo, the build for java 7 failed as travis does not support it since june 14.
https://github.com/travis-ci/travis-ci/issues/7884#issuecomment-308451879
## Expected Behavior
Without changes, the build at my repository should stop at 'mvn deploy' step, not earlier.
## Current Behavior
https://travis-ci.org/nicosmaris/cucumber-jvm/jobs/256635147
## Possible Solution
We can remove the following line and the next one to drop support for java 7.
https://github.com/nicosmaris/cucumber-jvm/blob/master/.travis.yml#L23
## Steps to Reproduce (for bugs)
Fork and build
## Context & Motivation
Fork and build before doing anything else
## Your Environment
https://travis-ci.org/nicosmaris/cucumber-jvm/jobs/256635147 | 1.0 | [Build] Trusty with oraclejdk7 is not supported | build | 1 |
72,793 | 19,491,647,905 | IssuesEvent | 2021-12-27 07:43:30 | tsunamayo/Starship-EVO | https://api.github.com/repos/tsunamayo/Starship-EVO | opened | [New build - DEFAULT] 21w52a: Hotfixes | Build Release Note | Teleports have been improved with better names and no dummy spawn point if an actual teleport station exists on the entity.
It is now possible to pick a brick on an item stand by looking at the brick itself.
Hotfixes:
#4399 #4397 #4396 Symmetry rotation can break game.
- It is possible to enter a yoke placed on an item stand.
Some NPCs were inactive. | 1.0 | [New build - DEFAULT] 21w52a: Hotfixes | build | 1 |
378,804 | 11,208,920,299 | IssuesEvent | 2020-01-06 09:14:23 | Tournamanager/webApp | https://api.github.com/repos/Tournamanager/webApp | closed | Create match | Low priority enhancement | What has to be done:
Create a component that will be used for creating matches.
Why it has to be done:
So that tournament managers can create matches for their tournaments.
When the task is done:
The front end application can send a request to the backend GraphQL server.
When the backend returns an error an error messages needs to be displayed to the user.
When the backend returns a http status of 200 the front end needs to show the match page. | 1.0 | Create match | non_build | 0 |
13,128 | 5,306,263,191 | IssuesEvent | 2017-02-11 00:10:37 | rust-lang/rust | https://api.github.com/repos/rust-lang/rust | closed | Stop rebuilding crates during testing | A-build I-compiletime | We have unit tests inline with crates, and testing them involves recompiling the crates in test mode. This is bad for development times. We are currently doing this for core, std, _rustc_, rustdoc, and cargo.
My preference is just to always compile the crates with tests and use an external test runner to invoke them. It will add some compile time to non-testing builds, and some size to the binaries so maybe we can have a way to turn them off.
We could also move all tests to their own libraries, but it would preclude testing private functions.
| 1.0 | Stop rebuilding crates during testing | build | 1 |
107,960 | 9,255,415,118 | IssuesEvent | 2019-03-16 09:50:20 | Azure/acs-engine | https://api.github.com/repos/Azure/acs-engine | closed | Kubernetes E2E scenario: multiple clusters in a single VNET | missing test orchestrator/k8s stale | We should ensure test coverage for adding > 1 Kubernetes cluster to a common VNET. | 1.0 | Kubernetes E2E scenario: multiple clusters in a single VNET | non_build | 0 |
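Each record in this dump carries both a string label and a binary label, and the rows shown here pair them consistently: `build` with `1` and `non_build` with `0`. A minimal sketch of that mapping, assuming the two-value scheme visible in these records (the function name is illustrative, not part of the dataset):

```python
def to_binary_label(label: str) -> int:
    """Map the string label used in these rows to its binary form.

    Assumes the two-value scheme seen in the records: build -> 1, non_build -> 0.
    """
    mapping = {"build": 1, "non_build": 0}
    return mapping[label]

# Two titles from the surrounding rows, with their string labels.
rows = [
    ("[Build] Trusty with oraclejdk7 is not supported", "build"),
    ("Kubernetes E2E scenario: multiple clusters in a single VNET", "non_build"),
]
for title, label in rows:
    print(title, "->", to_binary_label(label))
```

Any label outside the two seen values would raise a `KeyError`, which is a reasonable failure mode for a dataset that the schema describes as having exactly two label classes.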
690,777 | 23,671,936,229 | IssuesEvent | 2022-08-27 13:46:54 | xpressengine/xpressengine | https://api.github.com/repos/xpressengine/xpressengine | closed | bootstrap-sass build 오류 | type/bug priority/high | ```
Message:
node_modules/bootstrap-sass/assets/stylesheets/bootstrap/mixins/_nav-divider.scss
Error: Invalid CSS after " margin: (math": expected expression (e.g. 1px, bold), was ".div($line-height-c"
on line 8 of node_modules/bootstrap-sass/assets/stylesheets/bootstrap/mixins/_nav-divider.scss
from line 25 of node_modules/bootstrap-sass/assets/stylesheets/bootstrap/_mixins.scss
from line 3 of core/xe-ui-component/xe-ui-component.scss
>> margin: (math.div($line-height-computed, 2) - 1) 0;
``` | 1.0 | bootstrap-sass build 오류 | non_build | 0 |
99,646 | 30,517,713,349 | IssuesEvent | 2023-07-19 05:22:09 | appsmithorg/appsmith | https://api.github.com/repos/appsmithorg/appsmith | closed | AxiosError: Request failed with status code 404 | UI Builders Pod Conversion Algorithm | Sentry Issue: [APPSMITH-6CZ](https://appsmith.sentry.io/issues/4182077380/?referrer=github_integration)
```
AxiosError: Request failed with status code 404
``` | 1.0 | AxiosError: Request failed with status code 404 | build | 1 |
791,067 | 27,848,737,431 | IssuesEvent | 2023-03-20 17:08:53 | swe574-spring23/SWE574 | https://api.github.com/repos/swe574-spring23/SWE574 | opened | BUG - Sidebar not responsive | Type: Bug Priority: Medium | When the screen is smaller the sidebar is colliding with the main screen items.
 | 1.0 | BUG - Sidebar not responsive - When the screen is smaller the sidebar is colliding with the main screen items.
 | non_build | bug sidebar not responsive when the screen is smaller the sidebar is colliding with the main screen items | 0 |
97,419 | 12,234,175,272 | IssuesEvent | 2020-05-04 13:00:21 | graphql-nexus/nexus | https://api.github.com/repos/graphql-nexus/nexus | opened | crud system over graphql api | needs/discussion type/design | <!-- Instructions -->
<!-- -->
<!-- 1. Remove sections/details you do not complete -->
<!-- 2. Add sections/details useful to you -->
#### Perceived Problem
- crud is complicated
- crud often involves boilerplate
- nexus-prisma crud is tied to prisma
- nexus-prisma crud is tied to the data layer
- nexus-prisma crud is a plugin, blocking other plugins from cleanly contributing
#### Ideas / Proposed Solution(s)
- introduce crud concept in nexus core
- focus on primitives, being pluggable
- maybe bring back open-crud https://www.opencrud.org/
- defining mutation operations should optionally automatically create
- revisiting what is crud at its most basic
- given a graphql object
- a create mutation
- an update mutation (+ maybe upsert)
- a read query
- a delete mutation
- batch variants of the above
- for each mutation, a corresponding subscription operation
- crud at its most basic is a starting placing but crud in practice means domain tailoring
- a domain will not need all crud operations for all graphql objects
- a domain will not call the operations by crud-y terms but domains ones, e.g. instead of `createUser` `signup`
- a domain will have some graphql objects that have no direct crud operations
- questions
- this list is not exhaustive
- when authoring a crud operation how will a graphql object be selected as the target of the crud operation?
- crud could offer certain standard features like filtering, ordering, ...
todo | 1.0 | crud system over graphql api - <!-- Instructions -->
<!-- -->
<!-- 1. Remove sections/details you do not complete -->
<!-- 2. Add sections/details useful to you -->
#### Perceived Problem
- crud is complicated
- crud often involves boilerplate
- nexus-prisma crud is tied to prisma
- nexus-prisma crud is tied to the data layer
- nexus-prisma crud is a plugin, blocking other plugins from cleanly contributing
#### Ideas / Proposed Solution(s)
- introduce crud concept in nexus core
- focus on primitives, being pluggable
- maybe bring back open-crud https://www.opencrud.org/
- defining mutation operations should optionally automatically create
- revisiting what is crud at its most basic
- given a graphql object
- a create mutation
- an update mutation (+ maybe upsert)
- a read query
- a delete mutation
- batch variants of the above
- for each mutation, a corresponding subscription operation
- crud at its most basic is a starting placing but crud in practice means domain tailoring
- a domain will not need all crud operations for all graphql objects
- a domain will not call the operations by crud-y terms but domains ones, e.g. instead of `createUser` `signup`
- a domain will have some graphql objects that have no direct crud operations
- questions
- this list is not exhaustive
- when authoring a crud operation how will a graphql object be selected as the target of the crud operation?
- crud could offer certain standard features like filtering, ordering, ...
todo | non_build | crud system over graphql api perceived problem crud is complicated crud often involves boilerplate nexus prisma crud is tied to prisma nexus prisma crud is tied to the data layer nexus prisma crud is a plugin blocking other plugins from cleanly contributing ideas proposed solution s introduce crud concept in nexus core focus on primitives being pluggable maybe bring back open crud defining mutation operations should optionally automatically create revisiting what is crud at its most basic given a graphql object a create mutation an update mutation maybe upsert a read query a delete mutation batch variants of the above for each mutation a corresponding subscription operation crud at its most basic is a starting placing but crud in practice means domain tailoring a domain will not need all crud operations for all graphql objects a domain will not call the operations by crud y terms but domains ones e g instead of createuser signup a domain will have some graphql objects that have no direct crud operations questions this list is not exhaustive when authoring a crud operation how will a graphql object be selected as the target of the crud operation crud could offer certain standard features like filtering ordering todo | 0 |
31,124 | 8,657,689,074 | IssuesEvent | 2018-11-27 22:04:49 | travis-ci/travis-ci | https://api.github.com/repos/travis-ci/travis-ci | closed | Update gimme version for non-Golang environments | build environment | [Container-based](https://docs.travis-ci.com/user/reference/overview/#container-based) builds with `language: scala` the [travis-ci/gimme](https://github.com/travis-ci/gimme) `gimme version` returns `v1.2.0`, which is quite old and e.g does not support `.x` suffixes (added in [1.5](https://github.com/travis-ci/gimme/blob/master/CHANGELOG.md#150---2018-05-29)).
Builds with `language: go` it has `gimme version` of `v1.5.3`.
Thus builds \w both Scala and Golang need to always `curl` the latest gimme in every job.
It would nice to have gimme version updated, to avoid extra network call to Github. | 1.0 | Update gimme version for non-Golang environments - [Container-based](https://docs.travis-ci.com/user/reference/overview/#container-based) builds with `language: scala` the [travis-ci/gimme](https://github.com/travis-ci/gimme) `gimme version` returns `v1.2.0`, which is quite old and e.g does not support `.x` suffixes (added in [1.5](https://github.com/travis-ci/gimme/blob/master/CHANGELOG.md#150---2018-05-29)).
Builds with `language: go` it has `gimme version` of `v1.5.3`.
Thus builds \w both Scala and Golang need to always `curl` the latest gimme in every job.
It would nice to have gimme version updated, to avoid extra network call to Github. | build | update gimme version for non golang environments builds with language scala the gimme version returns which is quite old and e g does not support x suffixes added in builds with language go it has gimme version of thus builds w both scala and golang need to always curl the latest gimme in every job it would nice to have gimme version updated to avoid extra network call to github | 1 |
93,949 | 27,080,931,467 | IssuesEvent | 2023-02-14 13:59:37 | Challanger524/ChernoOpenGL-CMake | https://api.github.com/repos/Challanger524/ChernoOpenGL-CMake | closed | Add **Linux: clang** configuration preset to `CMakePresets.json` (and maybe g++) | build | And change preset name: `windows-base` -> `Windows MSVC` | 1.0 | Add **Linux: clang** configuration preset to `CMakePresets.json` (and maybe g++) - And change preset name: `windows-base` -> `Windows MSVC` | build | add linux clang configuration preset to cmakepresets json and maybe g and change preset name windows base windows msvc | 1 |
103,617 | 11,361,565,578 | IssuesEvent | 2020-01-26 15:44:25 | OttoDIY/blockly | https://api.github.com/repos/OttoDIY/blockly | closed | Block category icons disappear after switching language from default | documentation | English:

French:

The icons should remain the same. Any idea why this is happening? | 1.0 | Block category icons disappear after switching language from default - English:

French:

The icons should remain the same. Any idea why this is happening? | non_build | block category icons disappear after switching language from default english french the icons should remain the same any idea why this is happening | 0 |
396,992 | 11,716,830,525 | IssuesEvent | 2020-03-09 16:15:00 | tranquilitybase-io/tb-gcp | https://api.github.com/repos/tranquilitybase-io/tb-gcp | opened | Deploy Eagle Console into Kubernetes | eagle_console priority:medium | The Eable Console docker file must be deployed in the Eagle Console Kubernetes cluster | 1.0 | Deploy Eagle Console into Kubernetes - The Eable Console docker file must be deployed in the Eagle Console Kubernetes cluster | non_build | deploy eagle console into kubernetes the eable console docker file must be deployed in the eagle console kubernetes cluster | 0 |
238,296 | 18,237,901,519 | IssuesEvent | 2021-10-01 09:16:14 | free5gc/free5gc | https://api.github.com/repos/free5gc/free5gc | closed | FYI: Sample configuration - Select nearby UPF according to the connected gNodeB | documentation | Hi @free5gc-org @mariakagi @aligungr and everyone
I wrote a very simple configuration that uses free5GC and UERANSIM to select the nearby UPF according to the connected gNodeB.
https://github.com/s5uishida/free5gc_ueransim_nearby_upf_sample_config
This feature was added to free5GC last month, so this time I compared the configurations for Open5GS and free5GC simply.
Best regards,
--Shigeru | 1.0 | FYI: Sample configuration - Select nearby UPF according to the connected gNodeB - Hi @free5gc-org @mariakagi @aligungr and everyone
I wrote a very simple configuration that uses free5GC and UERANSIM to select the nearby UPF according to the connected gNodeB.
https://github.com/s5uishida/free5gc_ueransim_nearby_upf_sample_config
This feature was added to free5GC last month, so this time I compared the configurations for Open5GS and free5GC simply.
Best regards,
--Shigeru | non_build | fyi sample configuration select nearby upf according to the connected gnodeb hi org mariakagi aligungr and everyone i wrote a very simple configuration that uses and ueransim to select the nearby upf according to the connected gnodeb this feature was added to last month so this time i compared the configurations for and simply best regards shigeru | 0 |
52,980 | 13,097,727,107 | IssuesEvent | 2020-08-03 18:01:29 | lanl/LaGriT | https://api.github.com/repos/lanl/LaGriT | closed | make test failed | Test Suite build | Hi
I am trying to install LaGriT for using it with dfnWorks. I followed your instructions with exodus build. Everything is fine until I run the make test.
It gives me: All checks complete, 2 directories failed out of 41
Makefile:198: recipe for target 'test' failed
make: *** [test] Error 1
I have checked in the terminal (not sure if I have a log file for this test) and it appears to me that there are two main issues:
"Lines Essentially the Same: 2 out of 220
Test has 496 diffs at line 237 >>
Test has 484 extra lines in this chunk.
Lines Essentially the Same: 496 out of 496
2637 lines failed."
This is before step "9 Done with Directory ./quality".
The second issue is "22 Check Directory ./cmo_addatt_normals --------------------------
Test has 3 diffs at line 259 >>
Test has 3 extra lines in this chunk.
Lines Essentially the Same: 3 out of 3
Test has 3 diffs at line 330 >>
Test has 3 extra lines in this chunk.
Lines Essentially the Same: 3 out of 3
6 lines failed."
Any idea of what I am doing wrong?
Thank you so much
| 1.0 | make test failed - Hi
I am trying to install LaGriT for using it with dfnWorks. I followed your instructions with exodus build. Everything is fine until I run the make test.
It gives me: All checks complete, 2 directories failed out of 41
Makefile:198: recipe for target 'test' failed
make: *** [test] Error 1
I have checked in the terminal (not sure if I have a log file for this test) and it appears to me that there are two main issues:
"Lines Essentially the Same: 2 out of 220
Test has 496 diffs at line 237 >>
Test has 484 extra lines in this chunk.
Lines Essentially the Same: 496 out of 496
2637 lines failed."
This is before step "9 Done with Directory ./quality".
The second issue is "22 Check Directory ./cmo_addatt_normals --------------------------
Test has 3 diffs at line 259 >>
Test has 3 extra lines in this chunk.
Lines Essentially the Same: 3 out of 3
Test has 3 diffs at line 330 >>
Test has 3 extra lines in this chunk.
Lines Essentially the Same: 3 out of 3
6 lines failed."
Any idea of what I am doing wrong?
Thank you so much
| build | make test failed hi i am trying to install lagrit for using it with dfnworks i followed your instructions with exodus build everything is fine until i run the make test it gives me all checks complete directories failed out of makefile recipe for target test failed make error i have checked in the terminal not sure if i have a log file for this test and it appears to me that there are two main issues lines essentially the same out of test has diffs at line test has extra lines in this chunk lines essentially the same out of lines failed this is before step done with directory quality the second issue is check directory cmo addatt normals test has diffs at line test has extra lines in this chunk lines essentially the same out of test has diffs at line test has extra lines in this chunk lines essentially the same out of lines failed any idea of what i am doing wrong thank you so much | 1 |
13,894 | 5,487,563,858 | IssuesEvent | 2017-03-14 05:19:32 | libbitcoin/libbitcoin-server | https://api.github.com/repos/libbitcoin/libbitcoin-server | closed | [osx] malloc failure during IBD with 8 channels and tx relay | build | Any recommendations for the following:
**% git branch -v** (for libbitcoin-server)
```
* master 1d9edb8 Merge pull request #337 from evoskuil/master
```
**STDOUT**:
```
22:52:42.858416 INFO [blockchain] Block [406296] 1039 txs 2211 ins 0 wms 462 vms 209 vµs 4 rµs 5 cµs 131 pµs 4 aµs 64 sµs 108 dµs 0.000000
22:52:53.636974 INFO [network] Connected outbound channel [180.200.128.58:8333] (8)
22:53:29.140112 INFO [blockchain] Block [406297] 625 txs 1351 ins 0 wms 247 vms 183 vµs 5 rµs 5 cµs 123 pµs 2 aµs 48 sµs 88 dµs 0.000000
22:53:38.717243 INFO [blockchain] Block [406298] 2634 txs 4260 ins 0 wms 1193 vms 280 vµs 5 rµs 6 cµs 170 pµs 2 aµs 97 sµs 94 dµs 0.000000
22:53:42.008083 INFO [blockchain] Block [406299] 2420 txs 4593 ins 0 wms 996 vms 217 vµs 4 rµs 5 cµs 141 pµs 2 aµs 63 sµs 90 dµs 0.000000
22:53:44.186073 INFO [blockchain] Block [406300] 895 txs 4369 ins 0 wms 622 vms 142 vµs 4 rµs 4 cµs 68 pµs 2 aµs 64 sµs 61 dµs 0.000000
22:53:45.173052 INFO [blockchain] Block [406301] 1116 txs 2144 ins 0 wms 369 vms 172 vµs 5 rµs 7 cµs 102 pµs 3 aµs 55 sµs 80 dµs 0.000000
22:53:48.302806 INFO [blockchain] Block [406302] 1392 txs 3588 ins 0 wms 862 vms 240 vµs 4 rµs 5 cµs 117 pµs 3 aµs 112 sµs 77 dµs 0.000000
bs(64808,0x106290000) malloc: *** error for object 0x7fd009e1cd10: incorrect checksum for freed object - object was probably modified after being freed.
*** set a breakpoint in malloc_error_break to debug
```
**% cat error.log**
```
01:15:57.045495 WARNING [server] ================= startup ==================
01:15:57.045817 ERROR [server] ================= startup ==================
01:15:57.045913 FATAL [server] ================= startup ==================
03:41:59.447513 WARNING [network] Invalid version payload from [178.238.224.213:11050] bad data stream
07:32:36.770931 WARNING [network] Invalid version payload from [81.57.108.63:11050] bad data stream
08:03:18.683320 WARNING [network] Invalid version payload from [138.68.14.183:11050] bad data stream
15:54:15.725786 WARNING [network] Invalid version payload from [46.101.248.142:11050] bad data stream
23:19:53.482954 WARNING [network] Invalid version payload from [95.85.53.51:11050] bad data stream
04:51:29.259682 WARNING [network] Invalid version payload from [5.9.48.68:11050] bad data stream
05:56:01.875936 WARNING [network] Invalid version payload from [87.197.151.160:8333] bad data stream
13:32:14.743385 WARNING [network] Invalid version payload from [159.203.31.42:11050] bad data stream
18:11:12.196439 WARNING [network] Invalid version payload from [108.61.10.90:11050] bad data stream
19:51:05.691218 WARNING [network] Invalid version payload from [138.68.143.185:11050] bad data stream
20:02:25.998815 WARNING [network] Invalid version payload from [178.238.224.213:11050] bad data stream
20:54:43.939352 WARNING [network] Invalid version payload from [52.213.83.100:8333] bad data stream
```
**% vi debug.log**
```
2776436 22:53:42.008083 INFO [blockchain] Block [406299] 2420 txs 4593 ins 0 wms 996 vms 217 vµs 4 rµs 5 cµs 141 pµs 2 aµs 63 sµs 90 dµs 0.000000
2776437 22:53:42.034306 DEBUG [network] Fired protocol_block_in timer on [119.106.12.169:8333] success
2776438 22:53:42.038185 DEBUG [node] Ask [119.106.12.169:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776439 22:53:42.038296 DEBUG [network] Fired protocol_block_in timer on [13.55.2.234:8333] success
2776440 22:53:42.038412 DEBUG [node] Ask [13.55.2.234:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776441 22:53:42.038461 DEBUG [network] Fired protocol_block_in timer on [96.255.211.70:8333] success
2776442 22:53:42.038550 DEBUG [node] Ask [96.255.211.70:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776443 22:53:42.038595 DEBUG [network] Fired protocol_block_in timer on [54.202.86.204:8333] success
2776444 22:53:42.038682 DEBUG [node] Ask [54.202.86.204:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776445 22:53:42.038727 DEBUG [network] Fired protocol_block_in timer on [45.32.112.81:8333] success
2776446 22:53:42.038816 DEBUG [node] Ask [45.32.112.81:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776447 22:53:42.038892 DEBUG [network] Fired protocol_block_in timer on [51.15.51.26:8333] success
2776448 22:53:42.038990 DEBUG [node] Ask [51.15.51.26:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776449 22:53:42.039039 DEBUG [network] Fired protocol_block_in timer on [180.200.128.58:8333] success
2776450 22:53:42.039135 DEBUG [node] Ask [180.200.128.58:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776451 22:53:42.039184 DEBUG [network] Fired protocol_block_in timer on [75.177.137.134:8333] success
2776452 22:53:42.039280 DEBUG [node] Ask [75.177.137.134:8333] for headers from [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] through [2000]
2776453 22:53:42.063017 DEBUG [node] Dropped transaction [f557c6a85643ddb3d2108afd5dd9cf848e83eab88e385c467351c12200327348] from [45.32.112.81:8333] previous output not found
2776454 22:53:42.063258 DEBUG [node] Dropped transaction [f07493944c0f21ee94e58bc01d8816dc5f2b69b9362d109c2e814d9c42b4c08a] from [45.32.112.81:8333] previous output not found
2776455 22:53:42.063357 DEBUG [node] Dropped transaction [02c210e561c4b3f033006e2f60bdd0b90fddf8dc5d14b780e262fe5076ee9395] from [51.15.51.26:8333] previous output not found
2776456 22:53:42.063496 DEBUG [node] Dropped transaction [600a652a133adb286c07ef38b6902b9cd32addbdb0d9a9df9fb51852deec45d1] from [45.32.112.81:8333] previous output not found
2776457 22:53:42.063598 DEBUG [node] Dropped transaction [7f7da6912f2da38768f5b6251d5efbdfeebb776de7879faca2330c7f71e46532] from [51.15.51.26:8333] previous output not found
2776458 22:53:42.063732 DEBUG [node] Dropped transaction [35ddd3664b8ba1c220fd3f27e37493ca4c5c66e4016567ba14b3ad1f957bdadc] from [51.15.51.26:8333] previous output not found
2776459 22:53:42.063861 DEBUG [node] Dropped transaction [c7dff1f62a637b5712ec31190a5b36f1bb59827b953df9f35f56b13704334b00] from [51.15.51.26:8333] previous output not found
2776460 22:53:42.064050 DEBUG [node] Dropped transaction [942825f998e9dce4bdfad10f1b12ea95b319f10e458c1b32b874dfd80aff356f] from [51.15.51.26:8333] previous output not found
2776461 22:53:42.064167 DEBUG [node] Dropped transaction [4d85b01841c50ca7d4bcba8cb71973e8dfd96684bb3553d18d98fe635d3b3ce4] from [51.15.51.26:8333] previous output not found
2776462 22:53:42.064282 DEBUG [node] Dropped transaction [ebcfc69af4e0ac45c6333e317e36f6edc024b1add510fb9fb70311c677cd4bf4] from [51.15.51.26:8333] previous output not found
2776463 22:53:42.064405 DEBUG [node] Dropped transaction [6bcc284695addcb2e1ecaafacfa8167aae88c896cdfe022283b6a6e3875721df] from [51.15.51.26:8333] previous output not found
2776464 22:53:42.064529 DEBUG [node] Dropped transaction [aab0ee03c6010bbeaa87425eac90283ffd3cc9b6817c895f0fe4c0220ce4f98b] from [51.15.51.26:8333] previous output not found
2776465 22:53:42.064673 DEBUG [node] Dropped transaction [7ac4678d8c661f0a89839df40186604806cd66e1d777f144d15b08762f73c495] from [51.15.51.26:8333] previous output not found
2776466 22:53:42.065001 DEBUG [node] Dropped transaction [c77b29aac3aeddd34294dea461dbe18b1f3aeb2c2e87e929cdd635cd74b4bf6a] from [51.15.51.26:8333] previous output not found
2776467 22:53:42.065107 DEBUG [node] Dropped transaction [28ccfcf9c31a9e3633369a40dbfa1dfebf0e127429034569b55596f4955920e1] from [51.15.51.26:8333] previous output not found
2776468 22:53:42.065238 DEBUG [node] Dropped transaction [2c3d66a7e42cb0b7d2504a0928535b15b4724c2c2b42bf7d6dfc584a6b8fda09] from [51.15.51.26:8333] previous output not found
2776469 22:53:42.065435 DEBUG [node] Dropped transaction [0a2c75028d67cfa626458c6651520f2efceb8db5f12b80ebe8cf0ccfaf3a6e8c] from [51.15.51.26:8333] previous output not found
2776470 22:53:42.065555 DEBUG [node] Dropped transaction [75188b240ab767829e2a161f1552b903bbe2dcbe3590fd2926a886d83f2dc2e0] from [51.15.51.26:8333] previous output not found
2776471 22:53:42.065693 DEBUG [node] Dropped transaction [1b56ddefd8dbf64d15b42602673e9313e8d04d556f146710b23910508650d41a] from [51.15.51.26:8333] previous output not found
2776472 22:53:42.065801 DEBUG [node] Dropped transaction [62792e8cd3328bb09f32ebcdd3ff623d799dd960e20e8b95e82ae2ca57b2825d] from [51.15.51.26:8333] previous output not found
2776473 22:53:42.065920 DEBUG [node] Dropped transaction [6226cd9da175862fb81a4a2484da8f1cbb34434bb127bb2fca042022a54a9253] from [51.15.51.26:8333] previous output not found
2776474 22:53:42.066019 DEBUG [node] Dropped transaction [d774acb706439958c3e5da7c13dee74ccac813b0314b7d4ea1e458c76db164af] from [51.15.51.26:8333] previous output not found
2776475 22:53:42.066124 DEBUG [node] Dropped transaction [648d8a8f24c217a568fdf10008250d27a44a491c9686a5ed43239b68c723e1e5] from [51.15.51.26:8333] previous output not found
2776476 22:53:42.066240 DEBUG [node] Dropped transaction [e8d16596a6b5f5e661d9d58c5ded0ea2bb1a1fd56961fe75c7eee149b19217cb] from [51.15.51.26:8333] previous output not found
2776477 22:53:42.066344 DEBUG [node] Dropped transaction [059d3ce118f71038606671f787a32a432ded6dbcee10c5fad33de3cae7bc6e60] from [51.15.51.26:8333] previous output not found
2776478 22:53:42.066427 DEBUG [node] Dropped transaction [b39525cf1a053b00a3df220768a29eae9d7b31836873b962fd451ff3eab2009c] from [51.15.51.26:8333] previous output not found
2776479 22:53:42.066557 DEBUG [node] Dropped transaction [3b6ed1eca806a905c64e4aca3d99778a5e83d84f9f070349d54a206d9060d207] from [51.15.51.26:8333] previous output not found
2776480 22:53:42.066700 DEBUG [node] Dropped transaction [bc1d4cbe62da0697ff1352990f90d9fbd3b64680f148af70273b63e1f90c1fe0] from [51.15.51.26:8333] previous output not found
2776481 22:53:42.066800 DEBUG [node] Dropped transaction [8af5e276e2ff8f3aa591dce75db5557f94d28f08a24273e8863cbbd7c700afe2] from [51.15.51.26:8333] previous output not found
2776482 22:53:42.066951 DEBUG [node] Dropped transaction [7ad41cb827e15362947b4a1418b4695da2f9bad35912261e7fcba94ca28ce712] from [51.15.51.26:8333] previous output not found
2776483 22:53:42.067136 DEBUG [node] Dropped transaction [7f6a5a35e2dadbb9ec1e7f55bc74c64b8d51a1ae92ee8bf2ce9da51eaac72e34] from [51.15.51.26:8333] previous output not found
2776484 22:53:42.067279 DEBUG [node] Dropped transaction [7c0634c5308e1bb29c3f2ca21eb79503ae6554b197307d0ccf2b9c5be2de8697] from [51.15.51.26:8333] previous output not found
2776485 22:53:42.067455 DEBUG [node] Dropped transaction [62aca37de35d62f66fb41092103d11f6be95ceb6aa523f7804fd7bef0c1f6a7c] from [51.15.51.26:8333] previous output not found
2776486 22:53:43.038344 DEBUG [network] Fired protocol_block_in timer on [119.106.12.169:8333] success
2776487 22:53:43.038444 DEBUG [network] Fired protocol_block_in timer on [13.55.2.234:8333] success
2776488 22:53:43.038546 DEBUG [network] Fired protocol_block_in timer on [96.255.211.70:8333] success
2776489 22:53:43.038773 DEBUG [network] Fired protocol_block_in timer on [54.202.86.204:8333] success
2776490 22:53:43.038846 DEBUG [network] Fired protocol_block_in timer on [45.32.112.81:8333] success
2776491 22:53:43.039050 DEBUG [network] Fired protocol_block_in timer on [51.15.51.26:8333] success
2776492 22:53:43.039126 DEBUG [network] Fired protocol_block_in timer on [180.200.128.58:8333] success
2776493 22:53:43.039342 DEBUG [network] Fired protocol_block_in timer on [75.177.137.134:8333] success
2776494 22:53:44.186021 DEBUG [node] Connected block [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] at height [406300] from [75.177.137.134:8333] (126, 4).
2776495 22:53:44.186073 INFO [blockchain] Block [406300] 895 txs 4369 ins 0 wms 622 vms 142 vµs 4 rµs 4 cµs 68 pµs 2 aµs 64 sµs 61 dµs 0.000000
2776496 22:53:44.202543 DEBUG [network] Fired protocol_block_in timer on [119.106.12.169:8333] success
2776497 22:53:44.205979 DEBUG [node] Ask [119.106.12.169:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776498 22:53:44.206064 DEBUG [network] Fired protocol_block_in timer on [13.55.2.234:8333] success
2776499 22:53:44.206174 DEBUG [node] Ask [13.55.2.234:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776500 22:53:44.206221 DEBUG [network] Fired protocol_block_in timer on [96.255.211.70:8333] success
2776501 22:53:44.206333 DEBUG [node] Ask [96.255.211.70:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776502 22:53:44.206399 DEBUG [network] Fired protocol_block_in timer on [54.202.86.204:8333] success
2776503 22:53:44.206540 DEBUG [node] Ask [54.202.86.204:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776504 22:53:44.206589 DEBUG [network] Fired protocol_block_in timer on [45.32.112.81:8333] success
2776505 22:53:44.206685 DEBUG [node] Ask [45.32.112.81:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776506 22:53:44.206752 DEBUG [network] Fired protocol_block_in timer on [51.15.51.26:8333] success
2776507 22:53:44.206846 DEBUG [node] Ask [51.15.51.26:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776508 22:53:44.206909 DEBUG [network] Fired protocol_block_in timer on [180.200.128.58:8333] success
2776509 22:53:44.207026 DEBUG [node] Ask [180.200.128.58:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776510 22:53:44.285852 DEBUG [node] Dropped transaction [0d531b2aea81578e9413edbd64c4ecf51195438998799d6df6c543dfd255775c] from [51.15.51.26:8333] previous output not found
2776511 22:53:44.285977 DEBUG [node] Dropped transaction [9956b5cd9645393278a5a52aeee6a128807d37f701a00f390c8560b38f7474eb] from [51.15.51.26:8333] previous output not found
2776512 22:53:44.286087 DEBUG [node] Dropped transaction [438465846cfb12fadc203219d8fa7de6d545e25f3e85aebe64451eccfc52cc64] from [51.15.51.26:8333] previous output not found
2776513 22:53:44.316397 DEBUG [network] Fired protocol_block_in timer on [75.177.137.134:8333] success
2776514 22:53:44.316618 DEBUG [node] Ask [75.177.137.134:8333] for headers from [000000000000000001f3b9abbf0000ae26af26315dc072e8b70f25ba69ba76c0] through [2000]
2776515 22:53:45.172995 DEBUG [node] Connected block [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] at height [406301] from [75.177.137.134:8333] (126, 4).
2776516 22:53:45.173052 INFO [blockchain] Block [406301] 1116 txs 2144 ins 0 wms 369 vms 172 vµs 5 rµs 7 cµs 102 pµs 3 aµs 55 sµs 80 dµs 0.000000
2776517 22:53:45.206306 DEBUG [network] Fired protocol_block_in timer on [119.106.12.169:8333] success
2776518 22:53:45.207089 DEBUG [node] Ask [119.106.12.169:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776519 22:53:45.207209 DEBUG [network] Fired protocol_block_in timer on [13.55.2.234:8333] success
2776520 22:53:45.207334 DEBUG [node] Ask [13.55.2.234:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776521 22:53:45.207445 DEBUG [network] Fired protocol_block_in timer on [96.255.211.70:8333] success
2776522 22:53:45.207562 DEBUG [node] Ask [96.255.211.70:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776523 22:53:45.207622 DEBUG [network] Fired protocol_block_in timer on [54.202.86.204:8333] success
2776524 22:53:45.207730 DEBUG [node] Ask [54.202.86.204:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776525 22:53:45.207785 DEBUG [network] Fired protocol_block_in timer on [45.32.112.81:8333] success
2776526 22:53:45.207892 DEBUG [node] Ask [45.32.112.81:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776527 22:53:45.207947 DEBUG [network] Fired protocol_block_in timer on [51.15.51.26:8333] success
2776528 22:53:45.208054 DEBUG [node] Ask [51.15.51.26:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776529 22:53:45.208109 DEBUG [network] Fired protocol_block_in timer on [180.200.128.58:8333] success
2776530 22:53:45.208217 DEBUG [node] Ask [180.200.128.58:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776531 22:53:45.644661 DEBUG [network] Fired protocol_block_in timer on [75.177.137.134:8333] success
2776532 22:53:45.644846 DEBUG [node] Ask [75.177.137.134:8333] for headers from [000000000000000004dc16eafc2cc5769cc1e0ffb9d807b8885fec2f61774be6] through [2000]
2776533 22:53:45.760478 DEBUG [network] Storing addresses from [45.32.112.81:8333] (1)
2776534 22:53:45.760553 DEBUG [network] Accepted (1 of 1) host addresses from peer.
2776535 22:53:45.760695 DEBUG [node] Dropped transaction [1da9914b655dcdc05879a6af609299382bb6c15d82f604e322f41fcb00008366] from [45.32.112.81:8333] previous output not found
2776536 22:53:45.760824 DEBUG [node] Dropped transaction [517b5f83602e7113e48dedeff99851dc834377d13a013158fc2f0e39b87e0487] from [45.32.112.81:8333] previous output not found
2776537 22:53:46.207740 DEBUG [network] Fired protocol_block_in timer on [119.106.12.169:8333] success
2776538 22:53:46.208127 DEBUG [network] Fired protocol_block_in timer on [13.55.2.234:8333] success
2776539 22:53:46.208157 DEBUG [network] Fired protocol_block_in timer on [96.255.211.70:8333] success
2776540 22:53:46.208213 DEBUG [network] Fired protocol_block_in timer on [54.202.86.204:8333] success
2776541 22:53:46.208243 DEBUG [network] Fired protocol_block_in timer on [45.32.112.81:8333] success
2776542 22:53:46.208269 DEBUG [network] Fired protocol_block_in timer on [51.15.51.26:8333] success
2776543 22:53:46.208294 DEBUG [network] Fired protocol_block_in timer on [180.200.128.58:8333] success
2776544 22:53:46.501021 DEBUG [node] Dropped transaction [47e4a7ab67dc4d51b222d375c586c4e48cde8dd075392e5a76294c37453ff58e] from [51.15.51.26:8333] previous output not found
2776545 22:53:46.501202 DEBUG [node] Dropped transaction [739ce4901e9ca6055bf886d9a88e1b369eee8ef64b299d1afa88a48be94bfb63] from [51.15.51.26:8333] previous output not found
2776546 22:53:46.501473 DEBUG [node] Dropped transaction [2af64d1a51ad4919b31ce54f5288e59bc43bc528ad6426fb277457fa5f898ef5] from [51.15.51.26:8333] previous output not found
2776547 22:53:46.514104 DEBUG [node] Dropped transaction [68441179d70c6b2ec3e4237294befe3a579a83312c92ac14d2443ca5005ac393] from [51.15.51.26:8333] previous output not found
2776548 22:53:46.514330 DEBUG [node] Dropped transaction [7bbf063b8be917053eb3b31f4d3bdfb115f9c7dfae0e1ec9c59343effbecb8f2] from [51.15.51.26:8333] previous output not found
2776549 22:53:46.514478 DEBUG [node] Dropped transaction [292378217b5dbca23e82f0ea531897e15b9e8939dc315ae292b5738c95e24564] from [51.15.51.26:8333] previous output not found
2776550 22:53:46.514619 DEBUG [node] Dropped transaction [481426839fdcdcaf1c49e8926186ed6112d8ca23710452010c38781cd41c4d8b] from [51.15.51.26:8333] previous output not found
2776551 22:53:46.514792 DEBUG [node] Dropped transaction [98c060c8c2fbe995e6d4a1edb7a71bfefcc19dd40b77af66663a4ca696d5732f] from [51.15.51.26:8333] previous output not found
2776552 22:53:46.570493 DEBUG [node] Captured block [00000000000000000135dc1e5f3a394bf1b2b08d4373b1dba4f143b6230237a4] from [180.200.128.58:8333] duplicate block
2776553 22:53:46.645029 DEBUG [network] Fired protocol_block_in timer on [75.177.137.134:8333] success
2776554 22:53:47.015867 DEBUG [node] Captured block [00000000000000000043f300ac1e64a33435d0bbc2d51c227cd674b4674f5cc5] from [119.106.12.169:8333] duplicate block
2776555 22:53:48.302752 DEBUG [node] Connected block [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] at height [406302] from [75.177.137.134:8333] (126, 4).
2776556 22:53:48.302806 INFO [blockchain] Block [406302] 1392 txs 3588 ins 0 wms 862 vms 240 vµs 4 rµs 5 cµs 117 pµs 3 aµs 112 sµs 77 dµs 0.000000
2776557 22:53:48.322293 DEBUG [network] Fired protocol_block_in timer on [13.55.2.234:8333] success
2776558 22:53:48.327381 DEBUG [node] Ask [13.55.2.234:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776559 22:53:48.327489 DEBUG [network] Fired protocol_block_in timer on [96.255.211.70:8333] success
2776560 22:53:48.327600 DEBUG [node] Ask [96.255.211.70:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776561 22:53:48.327647 DEBUG [network] Fired protocol_block_in timer on [54.202.86.204:8333] success
2776562 22:53:48.327732 DEBUG [node] Ask [54.202.86.204:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776563 22:53:48.327777 DEBUG [network] Fired protocol_block_in timer on [45.32.112.81:8333] success
2776564 22:53:48.327862 DEBUG [node] Ask [45.32.112.81:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776565 22:53:48.327906 DEBUG [network] Fired protocol_block_in timer on [51.15.51.26:8333] success
2776566 22:53:48.328751 DEBUG [node] Ask [51.15.51.26:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776567 22:53:48.328803 DEBUG [network] Fired protocol_block_in timer on [180.200.128.58:8333] success
2776568 22:53:48.328889 DEBUG [node] Ask [180.200.128.58:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776569 22:53:48.328932 DEBUG [network] Fired protocol_block_in timer on [119.106.12.169:8333] success
2776570 22:53:48.329017 DEBUG [node] Ask [119.106.12.169:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776571 22:53:48.329060 DEBUG [network] Fired protocol_block_in timer on [75.177.137.134:8333] success
2776572 22:53:48.329145 DEBUG [node] Ask [75.177.137.134:8333] for headers from [00000000000000000338831e159cd5255b13d92ac8912de5dbb7fb13f62fcbbf] through [2000]
2776573 22:53:48.362832 DEBUG [node] Dropped transaction [b6920278a8788373530f168f3dbef4055b18e747f91ab775ee7923c1988c05a5] from [51.15.51.26:8333] previous output not found
2776574 22:53:48.363040 DEBUG [node] Dropped transaction [cb69fc82583e8d49202290032df7fda4b78089b21262f9a3d1fe2774ddd8c1db] from [51.15.51.26:8333] previous output not found
2776575 22:53:48.363160 DEBUG [node] Dropped transaction [9ed5a37d380a7dbb7010aa8c517a0fc72420a41b4e714ac3a3def7d9c4c8819c] from [51.15.51.26:8333] previous output not found
2776576 22:53:48.369268 DEBUG [node] Dropped transaction [d5a18ca532da981e80994058f3140667a0173c762ebad6aa100b341244bce995] from [51.15.51.26:8333] previous output not found
2776577 22:53:48.369393 DEBUG [node] Dropped transaction [df67af498826be83fd9c52de1ca27d5a5fd3bda22331538590c9abb6aa6ee63f] from [51.15.51.26:8333] previous output not found
2776578 22:53:48.369503 DEBUG [node] Dropped transaction [b4ef56e9430df1cf6d537a5d478c0958a8ede58d9d9fcbb6bdd1faf2fe74dec7] from [51.15.51.26:8333] previous output not found
2776579 22:53:48.369620 DEBUG [node] Dropped transaction [1f1c4b1c0d7a297ec28c99a0116b98b2537ccd505c662179d29f384b64b1c630] from [51.15.51.26:8333] previous output not found
2776580 22:53:48.369732 DEBUG [node] Dropped transaction [57937a7d2cc783429153ac35dcd8378e4a2cfe523a5b701a7fabe9d853b0c82b] from [51.15.51.26:8333] previous output not found
2776581 22:53:48.369835 DEBUG [node] Dropped transaction [60b230f8ddd9ff76627b9432ccdcf1591b565c28801e957921b49c1e3c984c34] from [51.15.51.26:8333] previous output not found
2776582 22:53:48.381879 DEBUG [node] Dropped transaction [dd30d1880d7548458ef0d5b7a359daf1104a97b171937921c204afa5d7777d89] from [51.15.51.26:8333] previous output not found
2776583 22:53:48.382003 DEBUG [node] Dropped transaction [37ad4a8f01c2c4b86912707ae6b0ed2b8daf3cc894d598bced5cb1f1b08610de] from [51.15.51.26:8333] previous output not found
```
ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug storing addresses from debug accepted of host addresses from peer debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug fired protocol block in timer on success debug fired protocol block in timer on success debug fired protocol block in timer on success debug fired protocol block in timer on success debug fired protocol block in timer on success debug fired protocol block in timer on success debug fired protocol block in timer on success debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug captured block from duplicate block debug fired protocol block in timer on success debug captured block from duplicate block debug connected block at height from info block txs ins wms vms vµs rµs cµs pµs aµs sµs dµs debug fired protocol block in timer on success debug ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug fired protocol block in timer on success debug ask for headers from through debug dropped transaction from 
previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found debug dropped transaction from previous output not found | 1 |
14,096 | 5,554,908,484 | IssuesEvent | 2017-03-24 02:14:16 | docker/docker | https://api.github.com/repos/docker/docker | closed | Proposal: Multi-stage builds | area/builder kind/feature | #resurrects #7149
We've been going back and forth among some maintainers on a way to give users the capability to produce sleek images without the cruft of the intermediate build artifacts.
We see a lot of requests from the community for this feature and different ways people try to work around it, most commonly with `docker cp` and re-tarring a new context, or trying to combine the whole build into a single `RUN` instruction.
Among the things we discussed were rebasing to a different rootfs path, mounting or copying data from other images, using cache storage between images, squashing, subblocks inside dockerfile, invoking builder inside of dockerfile etc.
Eventually, we ended up on the #7149 proposal, which allows switching the context of a build to a directory from an existing image. The benefit of this proposal is that it conflicts least with the current design principles of Dockerfile (self-consistency, build cache, returning a single target, etc.) while elegantly solving the small-images problem.
While this proposal can be considered a "chained build" and has some limitations for describing complicated build graphs with multiple branches, we have concluded that it would be best to solve that problem at a higher level, and we continue to investigate possible improvements.
#### The proposal:
edit: this has been updated to new syntax
edit2: `s/--context/--from/`
The `--from=n` flag allows accessing files from the rootfs of a previous build block. Every build block starts with a `FROM` instruction (multiple `FROM` instructions already work in Docker today). `n` specifies an incrementing index for every block. In the future we want to extend it to human-readable labels.
```
FROM ubuntu
RUN apt-get install build-essentials
ADD . /src
RUN cd /src && make
FROM busybox
COPY --from=0 app /usr/local/bin/app
EXPOSE 80
ENTRYPOINT /usr/local/bin/app
```
The benefits of this syntax are that when files from the user context are required both for building some artifact and for the final image, they don't need to be copied to the first environment. That also means it doesn't invalidate the cache for the first environment if the file is not used there. This syntax can also be used for including content from other images with just an extra `FROM` command.
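As a sketch of the "human readable labels" extension mentioned above (hypothetical syntax at the time of this proposal, with `builder` as an illustrative stage name), the same example could read:

```dockerfile
# Hypothetical named-stage variant of the indexed example above.
FROM ubuntu AS builder
RUN apt-get install build-essentials
ADD . /src
RUN cd /src && make

FROM busybox
COPY --from=builder app /usr/local/bin/app
EXPOSE 80
ENTRYPOINT /usr/local/bin/app
```

Referencing a stage by name rather than by index would keep the `COPY --from` line stable even if blocks are reordered or inserted.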
old proposal:
#### The proposal:
`BUILD /path/to/context` instruction in the `Dockerfile` that switches the current build context to `/path/to/context` from the current image's rootfs.
`docker build docker://image-reference[::/subdir]` that invokes a new build using the data from a specified image as a build context.
#### Notes:
- No previous metadata carries over to the new image after `BUILD`. The next instruction after this command needs to be `FROM`.
- The other way to think about the `BUILD` instruction is as `SETCONTEXT`
- The build-from-Docker-reference syntax is useful when the build is described by multiple Dockerfiles and dependencies are controlled by a Makefile-like utility.
- Only the layers after the last `BUILD` instruction end up in the final image.
- `docker build -t` would tag the last image defined at the end of the Dockerfile
- Some features like auto-tagging and specifying/loading a Dockerfile from new context directory have been left out and can be considered as future additions.
#### Example:
```
FROM ubuntu
RUN apt-get install build-essentials
ADD . /src
RUN cd /src && make
BUILD /src/build
FROM busybox
COPY app /usr/local/bin/app
EXPOSE 80
ENTRYPOINT /usr/local/bin/app
```
@icecrime @vikstrous @fermayo | 1.0 | build | 1
63,329 | 15,571,518,131 | IssuesEvent | 2021-03-17 05:09:22 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | Change a1825c95 breaks TFLite for Raspberry Pi | comp:lite type:bug type:build/install | When compiling the TensorFlow Lite Python wheel for Raspberry Pi (as described on https://www.tensorflow.org/lite/guide/build_cmake_pip), the result throws an exception when I try to use it:
~~~~
Traceback (most recent call last):
File ".../venv/lib/python3.7/site-packages/tflite_runtime/interpreter.py", line 45, in <module>
from tensorflow.lite.python import metrics_portable as metrics
ModuleNotFoundError: No module named 'tensorflow'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
...
import tflite_runtime.interpreter as tflite
File ".../venv/lib/python3.7/site-packages/tflite_runtime/interpreter.py", line 47, in <module>
from tensorflow.lite.python import metrics_nonportable as metrics
ModuleNotFoundError: No module named 'tensorflow'
~~~~
The offending lines were added in change a1825c95, in the file `tensorflow/lite/python/interpreter.py`:
~~~~
diff --git a/tensorflow/lite/python/interpreter.py b/tensorflow/lite/python/interpreter.py
index f7ef3b34ba6..5c5898b6d4d 100644
--- a/tensorflow/lite/python/interpreter.py
+++ b/tensorflow/lite/python/interpreter.py
@@ -40,6 +40,13 @@ else:
return lambda x: x
+try:
+ from tensorflow.lite.python import metrics_portable as metrics
+except ImportError:
+ from tensorflow.lite.python import metrics_nonportable as metrics
+# pylint: enable=g-import-not-at-top
+
+
class Delegate(object):
"""Python wrapper class to manage TfLiteDelegate objects.
@@ -321,6 +328,9 @@ class Interpreter(object):
delegate._get_native_delegate_pointer()) # pylint: disable=protected-access
self._signature_defs = self.get_signature_list()
+ self._metrics = metrics.TFLiteMetrics()
+ self._metrics.increase_counter_interpreter_creation()
+
def __del__(self):
# Must make sure the interpreter is destroyed before things that
# are used by it like the delegates. NOTE this only works on CPython
~~~~
I removed these lines from the copy of `interpreter.py` after installing the wheel and the rest of the code works fine.
It appears that these metrics are needed for unit testing, but something needs to be changed so they are not used when the TFLite package is run on a system where TensorFlow itself is not present (e.g. a small platform like Raspberry Pi). | 1.0 | build | 1
70,054 | 18,008,959,198 | IssuesEvent | 2021-09-16 05:57:25 | tesseract-ocr/tesseract | https://api.github.com/repos/tesseract-ocr/tesseract | closed | Build Tesseract from source with Visual Studio | build process |
------------------------
### Environment
* **Tesseract Version**: 5.0.0 alpha
* **Commit Number**: a1a177f
* **Platform**: Windows 10, 64-bit
### Current Behavior:
I cannot build from source.
I downloaded the SW client, saved it at "D:\Essam\Software\SW", and added it to Path.
I can run SW on the command line and see the SW version information, as follows:
D:\Tutorial\Git\tesseract\build>sw --version
sw.client.sw version 1.0.0
git revision 083bb99144549c1f361298e8284daa6b54422965
assembled on 30.01.2020 18:36:29 Egypt Standard Time
Then I ran the following commands to compile from source, as described at the following link:
https://github.com/tesseract-ocr/tesseract/wiki/Compiling
The commands are:
git clone https://github.com/tesseract-ocr/tesseract tesseract
cd tesseract
mkdir build && cd build
cmake .. -G "Visual Studio 15 2017 Win64" -DCMAKE_INSTALL_PREFIX=inst
I receive the following error:
"-- Selecting Windows SDK version 10.0.17763.0 to target Windows 10.0.18363.
Configuring tesseract version 5.0.0-alpha-621-ga1a17...
-- target changed from "auto" to "kaby-lake"
CMake Error at CMakeLists.txt:197 (find_package):
By not providing "FindSW.cmake" in CMAKE_MODULE_PATH this project has asked
CMake to find a package configuration file provided by "SW", but CMake did
not find one.
Could not find a package configuration file provided by "SW" with any of
the following names:
SWConfig.cmake
sw-config.cmake
Add the installation prefix of "SW" to CMAKE_PREFIX_PATH or set "SW_DIR" to
a directory containing one of the above files. If "SW" provides a separate
development package or SDK, be sure it has been installed.
-- Configuring incomplete, errors occurred!
See also "D:/Tutorial/Git/tesseract/build/CMakeFiles/CMakeOutput.log"."
The log file is attached:
[CMakeOutput.log](https://github.com/tesseract-ocr/tesseract/files/4143734/CMakeOutput.log)
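For context, the error means CMake cannot locate an `SWConfig.cmake` package configuration file for the SW client. Tesseract's CMake setup gates this lookup behind a build option; the following is a hedged sketch of the kind of guard involved (illustrative, not the actual tesseract `CMakeLists.txt`):

```cmake
# Illustrative only: look for the SW package config, but let the user
# opt out instead of failing the whole configure step.
option(SW_BUILD "Build with the sw client" ON)
if(SW_BUILD)
  find_package(SW QUIET)
  if(NOT SW_FOUND)
    message(WARNING
      "SW package config not found; configure with -DSW_BUILD=OFF, or set "
      "SW_DIR to the directory containing SWConfig.cmake")
  endif()
endif()
```

Configuring with `cmake .. -G "Visual Studio 15 2017 Win64" -DSW_BUILD=OFF` is a commonly reported workaround when the SW integration is not needed.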
### Expected Behavior:
Build the Tesseract solution.
### Suggested Fix:
| 1.0 | build | 1
35,852 | 12,393,586,217 | IssuesEvent | 2020-05-20 15:38:06 | wallanpsantos/apache-camel-alura | https://api.github.com/repos/wallanpsantos/apache-camel-alura | opened | CVE-2019-10202 (High) detected in jackson-mapper-asl-1.9.2.jar | security vulnerability | ## CVE-2019-10202 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-mapper-asl-1.9.2.jar</b></p></summary>
<p>Data Mapper package is a high-performance data binding package
built on Jackson JSON processor</p>
<p>Path to dependency file: /tmp/ws-scm/apache-camel-alura/camel-alura/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar</p>
<p>
Dependency Hierarchy:
- activemq-camel-5.6.0.jar (Root Library)
- activemq-core-5.6.0.jar
- fusemq-leveldb-1.1.jar
- :x: **jackson-mapper-asl-1.9.2.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/wallanpsantos/apache-camel-alura/commit/1c082812a144fc7b9f16c14f56f305c26ef10ca5">1c082812a144fc7b9f16c14f56f305c26ef10ca5</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A series of deserialization vulnerabilities have been discovered in Codehaus 1.9.x implemented in EAP 7. This CVE fixes CVE-2017-17485, CVE-2017-7525, CVE-2017-15095, CVE-2018-5968, CVE-2018-7489, CVE-2018-1000873, CVE-2019-12086 reported for FasterXML jackson-databind by implementing a whitelist approach that will mitigate these vulnerabilities and future ones alike.
<p>Publish Date: 2019-10-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10202>CVE-2019-10202</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://access.redhat.com/errata/RHSA-2019:2938">https://access.redhat.com/errata/RHSA-2019:2938</a></p>
<p>Release Date: 2019-10-01</p>
<p>Fix Resolution: JBoss Enterprise Application Platform - 7.2.4;com.fasterxml.jackson.core:jackson-databind:2.9.9</p>
</p>
</details>
<p></p>
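For a Maven project that pulls the vulnerable artifact in transitively (here via `activemq-camel`), one common mitigation is to pin the patched Jackson artifact named in the fix resolution. A hedged sketch of a `pom.xml` fragment (version taken from the resolution above; placement illustrative):

```xml
<!-- Illustrative pom.xml fragment: force the patched jackson-databind
     version so transitive resolution does not pull an older one. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.9.9</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Note that the flagged `jackson-mapper-asl` 1.9.x (Codehaus) line is a separate, end-of-life artifact; pinning `jackson-databind` follows the suggested fix, but removing the legacy dependency entirely is the more thorough remedy.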
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-10202 (High) detected in jackson-mapper-asl-1.9.2.jar - ## CVE-2019-10202 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-mapper-asl-1.9.2.jar</b></p></summary>
<p>Data Mapper package is a high-performance data binding package
built on Jackson JSON processor</p>
<p>Path to dependency file: /tmp/ws-scm/apache-camel-alura/camel-alura/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar</p>
<p>
Dependency Hierarchy:
- activemq-camel-5.6.0.jar (Root Library)
- activemq-core-5.6.0.jar
- fusemq-leveldb-1.1.jar
- :x: **jackson-mapper-asl-1.9.2.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/wallanpsantos/apache-camel-alura/commit/1c082812a144fc7b9f16c14f56f305c26ef10ca5">1c082812a144fc7b9f16c14f56f305c26ef10ca5</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A series of deserialization vulnerabilities have been discovered in Codehaus 1.9.x implemented in EAP 7. This CVE fixes CVE-2017-17485, CVE-2017-7525, CVE-2017-15095, CVE-2018-5968, CVE-2018-7489, CVE-2018-1000873, CVE-2019-12086 reported for FasterXML jackson-databind by implementing a whitelist approach that will mitigate these vulnerabilities and future ones alike.
<p>Publish Date: 2019-10-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10202>CVE-2019-10202</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://access.redhat.com/errata/RHSA-2019:2938">https://access.redhat.com/errata/RHSA-2019:2938</a></p>
<p>Release Date: 2019-10-01</p>
<p>Fix Resolution: JBoss Enterprise Application Platform - 7.2.4;com.fasterxml.jackson.core:jackson-databind:2.9.9</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_build | cve high detected in jackson mapper asl jar cve high severity vulnerability vulnerable library jackson mapper asl jar data mapper package is a high performance data binding package built on jackson json processor path to dependency file tmp ws scm apache camel alura camel alura pom xml path to vulnerable library root repository org codehaus jackson jackson mapper asl jackson mapper asl jar dependency hierarchy activemq camel jar root library activemq core jar fusemq leveldb jar x jackson mapper asl jar vulnerable library found in head commit a href vulnerability details a series of deserialization vulnerabilities have been discovered in codehaus x implemented in eap this cve fixes cve cve cve cve cve cve cve reported for fasterxml jackson databind by implementing a whitelist approach that will mitigate these vulnerabilities and future ones alike publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jboss enterprise application platform com fasterxml jackson core jackson databind step up your open source security game with whitesource | 0 |
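The advisory in the record above describes mitigating a family of jackson-databind deserialization CVEs "by implementing a whitelist approach": during polymorphic deserialization, a type name is only resolved if it is explicitly allowed. A minimal sketch of that check follows; the package prefixes and function names here are illustrative assumptions, not the actual EAP/Jackson implementation.

```go
package main

import (
	"fmt"
	"strings"
)

// allowedPrefixes stands in for the whitelist the advisory describes:
// only type names under these package prefixes may be deserialized.
var allowedPrefixes = []string{
	"com.example.model.", // assumption: an application's own package
}

// isAllowed reports whether a polymorphic type name passes the whitelist.
// Everything not explicitly listed is rejected, which is what makes the
// approach robust against future gadget classes, not just known ones.
func isAllowed(className string) bool {
	for _, p := range allowedPrefixes {
		if strings.HasPrefix(className, p) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isAllowed("com.example.model.User"))                    // true
	fmt.Println(isAllowed("org.apache.xalan.xsltc.trax.TemplatesImpl")) // false: a known gadget class
}
```

The deny-by-default direction is the point: a blacklist of known gadget classes has to chase each new CVE, while a whitelist of the application's own types does not.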
101,816 | 31,683,297,280 | IssuesEvent | 2023-09-08 03:02:23 | visualboyadvance-m/visualboyadvance-m | https://api.github.com/repos/visualboyadvance-m/visualboyadvance-m | closed | Feature Request: Flatpak installation of VBA-M. | enhancement build | Make flatpak builds on flathub.org. This will allow for an easily accessible more up to date version for users. | 1.0 | Feature Request: Flatpak installation of VBA-M. - Make flatpak builds on flathub.org. This will allow for an easily accessible more up to date version for users. | build | feature request flatpak installation of vba m make flatpak builds on flathub org this will allow for an easily accessible more up to date version for users | 1

161,044 | 12,530,039,476 | IssuesEvent | 2020-06-04 12:25:25 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | kv/kvserver: TestReplicateQueueUpReplicate failed | C-test-failure O-robot branch-master | [(kv/kvserver).TestReplicateQueueUpReplicate failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1879090&tab=buildLog) on [master@de48c6d13431061a6227697a917e574464b80983](https://github.com/cockroachdb/cockroach/commits/de48c6d13431061a6227697a917e574464b80983):
```
W200416 05:20:41.553399 405544 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
W200416 05:20:41.554218 405592 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 2: rpc error: code = Canceled desc = grpc: the client connection is closing:
I200416 05:20:41.557250 403976 kv/kvserver/queue.go:578 [n1,replicate,s1,r15/1:/Table/{19-20}] rate limited in MaybeAdd (replicate): node unavailable; try another peer
W200416 05:20:41.559170 405353 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 2: rpc error: code = Unavailable desc = transport is closing:
I200416 05:20:41.559921 403976 kv/kvserver/replicate_queue.go:280 [n1,replicate,s1,r15/1:/Table/{19-20}] snapshot failed: rpc error: code = Canceled desc = grpc: the client connection is closing
W200416 05:20:41.560390 405426 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 2: rpc error: code = Canceled desc = grpc: the client connection is closing:
W200416 05:20:41.568014 404801 gossip/gossip.go:1524 [n2] no incoming or outgoing connections
W200416 05:20:41.569510 405661 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
W200416 05:20:41.574628 405359 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.579583 405538 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.582368 405573 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.583202 403833 gossip/gossip.go:1524 [n1] no incoming or outgoing connections
W200416 05:20:41.584205 405398 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.592362 405567 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 3: EOF:
W200416 05:20:41.593718 406041 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 2: EOF:
W200416 05:20:41.596970 405190 gossip/gossip.go:1524 [n3] no incoming or outgoing connections
W200416 05:20:41.614272 405517 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 3: rpc error: code = Unavailable desc = transport is closing:
E200416 05:20:41.619984 403976 kv/kvserver/queue.go:1089 [n1,replicate,s1,r15/1:/Table/{19-20}] change replicas of r15 failed: fetching current range descriptor value: node unavailable; try another peer
I200416 05:20:41.620976 403976 kv/kvserver/queue.go:1189 [n1,replicate] purgatory is now empty
I200416 05:20:41.732957 408942 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n1: failed to connect to n1 at 127.0.0.1:38635: context canceled
I200416 05:20:41.733725 408941 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n2: failed to connect to n2 at 127.0.0.1:33991: context canceled
I200416 05:20:41.737339 408945 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n1: failed to connect to n1 at 127.0.0.1:38635: context canceled
I200416 05:20:41.737770 408944 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n3: failed to connect to n3 at 127.0.0.1:37567: context canceled
I200416 05:20:41.745824 408836 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n2: failed to connect to n2 at 127.0.0.1:33991: context canceled
I200416 05:20:41.748045 408835 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n3: failed to connect to n3 at 127.0.0.1:37567: context canceled
I200416 05:20:41.921148 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (gc): node unavailable; try another peer
I200416 05:20:41.921706 403975 kv/kvclient/kvcoord/transport_race.go:108 transport race promotion: ran 153 iterations on up to 1729 requests
I200416 05:20:41.921872 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (merge): node unavailable; try another peer
I200416 05:20:41.922711 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (split): node unavailable; try another peer
I200416 05:20:41.923138 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (replicate): node unavailable; try another peer
I200416 05:20:41.923762 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (replicaGC): node unavailable; try another peer
I200416 05:20:41.924549 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (raftlog): node unavailable; try another peer
I200416 05:20:41.924868 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (raftsnapshot): node unavailable; try another peer
I200416 05:20:41.925293 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (consistencyChecker): node unavailable; try another peer
I200416 05:20:41.925487 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (timeSeriesMaintenance): node unavailable; try another peer
E200416 05:20:41.937279 408860 vendor/google.golang.org/grpc/pickfirst.go:61 pickfirstBalancer: failed to NewSubConn: rpc error: code = Canceled desc = grpc: the client connection is closing
E200416 05:20:41.938043 408930 vendor/google.golang.org/grpc/pickfirst.go:61 pickfirstBalancer: failed to NewSubConn: rpc error: code = Canceled desc = grpc: the client connection is closing
--- FAIL: TestReplicateQueueUpReplicate (49.56s)
replicate_queue_test.go:218: condition failed to evaluate within 45s: replica count, want 3, current 1
goroutine 403773 [running]:
runtime/debug.Stack(0xc00146f638, 0x70f4820, 0xc000383440)
/usr/local/go/src/runtime/debug/stack.go:24 +0xab
github.com/cockroachdb/cockroach/pkg/testutils.SucceedsSoon(0x720cde0, 0xc0033acd00, 0xc00146f638)
/go/src/github.com/cockroachdb/cockroach/pkg/testutils/soon.go:37 +0x87
github.com/cockroachdb/cockroach/pkg/kv/kvserver_test.TestReplicateQueueUpReplicate(0xc0033acd00)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/replicate_queue_test.go:218 +0x821
testing.tRunner(0xc0033acd00, 0x62a66c8)
/usr/local/go/src/testing/testing.go:909 +0x19a
created by testing.(*T).Run
/usr/local/go/src/testing/testing.go:960 +0x652
```
<details><summary>More</summary><p>
Parameters:
- GOFLAGS=-json
```
make stressrace TESTS=TestReplicateQueueUpReplicate PKG=./pkg/kv/kvserver TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
Related:
- #47318 kv/kvserver: TestReplicateQueueUpReplicate failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestReplicateQueueUpReplicate.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| 1.0 | kv/kvserver: TestReplicateQueueUpReplicate failed - [(kv/kvserver).TestReplicateQueueUpReplicate failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1879090&tab=buildLog) on [master@de48c6d13431061a6227697a917e574464b80983](https://github.com/cockroachdb/cockroach/commits/de48c6d13431061a6227697a917e574464b80983):
```
W200416 05:20:41.553399 405544 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
W200416 05:20:41.554218 405592 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 2: rpc error: code = Canceled desc = grpc: the client connection is closing:
I200416 05:20:41.557250 403976 kv/kvserver/queue.go:578 [n1,replicate,s1,r15/1:/Table/{19-20}] rate limited in MaybeAdd (replicate): node unavailable; try another peer
W200416 05:20:41.559170 405353 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 2: rpc error: code = Unavailable desc = transport is closing:
I200416 05:20:41.559921 403976 kv/kvserver/replicate_queue.go:280 [n1,replicate,s1,r15/1:/Table/{19-20}] snapshot failed: rpc error: code = Canceled desc = grpc: the client connection is closing
W200416 05:20:41.560390 405426 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 2: rpc error: code = Canceled desc = grpc: the client connection is closing:
W200416 05:20:41.568014 404801 gossip/gossip.go:1524 [n2] no incoming or outgoing connections
W200416 05:20:41.569510 405661 kv/kvserver/raft_transport.go:637 [n1] while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
W200416 05:20:41.574628 405359 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.579583 405538 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.582368 405573 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.583202 403833 gossip/gossip.go:1524 [n1] no incoming or outgoing connections
W200416 05:20:41.584205 405398 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 1: EOF:
W200416 05:20:41.592362 405567 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 3: EOF:
W200416 05:20:41.593718 406041 kv/kvserver/raft_transport.go:637 [n3] while processing outgoing Raft queue to node 2: EOF:
W200416 05:20:41.596970 405190 gossip/gossip.go:1524 [n3] no incoming or outgoing connections
W200416 05:20:41.614272 405517 kv/kvserver/raft_transport.go:637 [n2] while processing outgoing Raft queue to node 3: rpc error: code = Unavailable desc = transport is closing:
E200416 05:20:41.619984 403976 kv/kvserver/queue.go:1089 [n1,replicate,s1,r15/1:/Table/{19-20}] change replicas of r15 failed: fetching current range descriptor value: node unavailable; try another peer
I200416 05:20:41.620976 403976 kv/kvserver/queue.go:1189 [n1,replicate] purgatory is now empty
I200416 05:20:41.732957 408942 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n1: failed to connect to n1 at 127.0.0.1:38635: context canceled
I200416 05:20:41.733725 408941 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n2: failed to connect to n2 at 127.0.0.1:33991: context canceled
I200416 05:20:41.737339 408945 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n1: failed to connect to n1 at 127.0.0.1:38635: context canceled
I200416 05:20:41.737770 408944 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n3: failed to connect to n3 at 127.0.0.1:37567: context canceled
I200416 05:20:41.745824 408836 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n2: failed to connect to n2 at 127.0.0.1:33991: context canceled
I200416 05:20:41.748045 408835 rpc/nodedialer/nodedialer.go:160 [ct-client] unable to connect to n3: failed to connect to n3 at 127.0.0.1:37567: context canceled
I200416 05:20:41.921148 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (gc): node unavailable; try another peer
I200416 05:20:41.921706 403975 kv/kvclient/kvcoord/transport_race.go:108 transport race promotion: ran 153 iterations on up to 1729 requests
I200416 05:20:41.921872 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (merge): node unavailable; try another peer
I200416 05:20:41.922711 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (split): node unavailable; try another peer
I200416 05:20:41.923138 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (replicate): node unavailable; try another peer
I200416 05:20:41.923762 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (replicaGC): node unavailable; try another peer
I200416 05:20:41.924549 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (raftlog): node unavailable; try another peer
I200416 05:20:41.924868 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (raftsnapshot): node unavailable; try another peer
I200416 05:20:41.925293 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (consistencyChecker): node unavailable; try another peer
I200416 05:20:41.925487 404968 kv/kvserver/queue.go:578 [n2,s2] rate limited in MaybeAdd (timeSeriesMaintenance): node unavailable; try another peer
E200416 05:20:41.937279 408860 vendor/google.golang.org/grpc/pickfirst.go:61 pickfirstBalancer: failed to NewSubConn: rpc error: code = Canceled desc = grpc: the client connection is closing
E200416 05:20:41.938043 408930 vendor/google.golang.org/grpc/pickfirst.go:61 pickfirstBalancer: failed to NewSubConn: rpc error: code = Canceled desc = grpc: the client connection is closing
--- FAIL: TestReplicateQueueUpReplicate (49.56s)
replicate_queue_test.go:218: condition failed to evaluate within 45s: replica count, want 3, current 1
goroutine 403773 [running]:
runtime/debug.Stack(0xc00146f638, 0x70f4820, 0xc000383440)
/usr/local/go/src/runtime/debug/stack.go:24 +0xab
github.com/cockroachdb/cockroach/pkg/testutils.SucceedsSoon(0x720cde0, 0xc0033acd00, 0xc00146f638)
/go/src/github.com/cockroachdb/cockroach/pkg/testutils/soon.go:37 +0x87
github.com/cockroachdb/cockroach/pkg/kv/kvserver_test.TestReplicateQueueUpReplicate(0xc0033acd00)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/replicate_queue_test.go:218 +0x821
testing.tRunner(0xc0033acd00, 0x62a66c8)
/usr/local/go/src/testing/testing.go:909 +0x19a
created by testing.(*T).Run
/usr/local/go/src/testing/testing.go:960 +0x652
```
<details><summary>More</summary><p>
Parameters:
- GOFLAGS=-json
```
make stressrace TESTS=TestReplicateQueueUpReplicate PKG=./pkg/kv/kvserver TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
Related:
- #47318 kv/kvserver: TestReplicateQueueUpReplicate failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestReplicateQueueUpReplicate.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| non_build | kv kvserver testreplicatequeueupreplicate failed on kv kvserver raft transport go while processing outgoing raft queue to node rpc error code canceled desc grpc the client connection is closing kv kvserver raft transport go while processing outgoing raft queue to node rpc error code canceled desc grpc the client connection is closing kv kvserver queue go rate limited in maybeadd replicate node unavailable try another peer kv kvserver raft transport go while processing outgoing raft queue to node rpc error code unavailable desc transport is closing kv kvserver replicate queue go snapshot failed rpc error code canceled desc grpc the client connection is closing kv kvserver raft transport go while processing outgoing raft queue to node rpc error code canceled desc grpc the client connection is closing gossip gossip go no incoming or outgoing connections kv kvserver raft transport go while processing outgoing raft queue to node rpc error code canceled desc grpc the client connection is closing kv kvserver raft transport go while processing outgoing raft queue to node eof kv kvserver raft transport go while processing outgoing raft queue to node eof kv kvserver raft transport go while processing outgoing raft queue to node eof gossip gossip go no incoming or outgoing connections kv kvserver raft transport go while processing outgoing raft queue to node eof kv kvserver raft transport go while processing outgoing raft queue to node eof kv kvserver raft transport go while processing outgoing raft queue to node eof gossip gossip go no incoming or outgoing connections kv kvserver raft transport go while processing outgoing raft queue to node rpc error code unavailable desc transport is closing kv kvserver queue go change replicas of failed fetching current range descriptor value node unavailable try another peer kv kvserver queue go purgatory is now empty rpc nodedialer nodedialer go unable to connect to failed to connect to at context canceled rpc nodedialer 
nodedialer go unable to connect to failed to connect to at context canceled rpc nodedialer nodedialer go unable to connect to failed to connect to at context canceled rpc nodedialer nodedialer go unable to connect to failed to connect to at context canceled rpc nodedialer nodedialer go unable to connect to failed to connect to at context canceled rpc nodedialer nodedialer go unable to connect to failed to connect to at context canceled kv kvserver queue go rate limited in maybeadd gc node unavailable try another peer kv kvclient kvcoord transport race go transport race promotion ran iterations on up to requests kv kvserver queue go rate limited in maybeadd merge node unavailable try another peer kv kvserver queue go rate limited in maybeadd split node unavailable try another peer kv kvserver queue go rate limited in maybeadd replicate node unavailable try another peer kv kvserver queue go rate limited in maybeadd replicagc node unavailable try another peer kv kvserver queue go rate limited in maybeadd raftlog node unavailable try another peer kv kvserver queue go rate limited in maybeadd raftsnapshot node unavailable try another peer kv kvserver queue go rate limited in maybeadd consistencychecker node unavailable try another peer kv kvserver queue go rate limited in maybeadd timeseriesmaintenance node unavailable try another peer vendor google golang org grpc pickfirst go pickfirstbalancer failed to newsubconn rpc error code canceled desc grpc the client connection is closing vendor google golang org grpc pickfirst go pickfirstbalancer failed to newsubconn rpc error code canceled desc grpc the client connection is closing fail testreplicatequeueupreplicate replicate queue test go condition failed to evaluate within replica count want current goroutine runtime debug stack usr local go src runtime debug stack go github com cockroachdb cockroach pkg testutils succeedssoon go src github com cockroachdb cockroach pkg testutils soon go github com cockroachdb cockroach 
pkg kv kvserver test testreplicatequeueupreplicate go src github com cockroachdb cockroach pkg kv kvserver replicate queue test go testing trunner usr local go src testing testing go created by testing t run usr local go src testing testing go more parameters goflags json make stressrace tests testreplicatequeueupreplicate pkg pkg kv kvserver testtimeout stressflags timeout related kv kvserver testreplicatequeueupreplicate failed powered by | 0 |
13,414 | 5,357,590,247 | IssuesEvent | 2017-02-20 19:02:13 | SEED-platform/seed | https://api.github.com/repos/SEED-platform/seed | closed | Decouple list settings in different views | Building List GUI Matching | **Problem**
Right now, the List Settings from different views are coupled together. So if you change them in one place, such as in the Matching screen, those same changes are carried over into the Building List view.
Users may want to see a very different set of fields in the Matching screen than in the Building List screen
| 1.0 | Decouple list settings in different views - **Problem**
Right now, the List Settings from different views are coupled together. So if you change them in one place, such as in the Matching screen, those same changes are carried over into the Building List view.
Users may want to see a very different set of fields in the Matching screen than in the Building List screen
| build | decouple list settings in different views problem right now the list settings from different views are coupled together so if you change them in one place such as in the matching screen those same changes are carried over into the building list view users may want to see a very different set of fields in the matching screen than in the building list screen | 1 |
64,507 | 15,896,207,649 | IssuesEvent | 2021-04-11 16:37:44 | sandboxie-plus/Sandboxie | https://api.github.com/repos/sandboxie-plus/Sandboxie | closed | Audio and video stutter in Chromium based browsers | fixed in next build | I've noticed that when using Chromium based browsers, such as Brave, to watch videos while Sandboxed the audio and the video will both stutter. This is most prevalent when the video playback is fullscreen. If playback is done outside of the Sandbox the quality is unaffected. | 1.0 | Audio and video stutter in Chromium based browsers - I've noticed that when using Chromium based browsers, such as Brave, to watch videos while Sandboxed the audio and the video will both stutter. This is most prevalent when the video playback is fullscreen. If playback is done outside of the Sandbox the quality is unaffected. | build | audio and video stutter in chromium based browsers i ve noticed that when using chromium based browsers such as brave to watch videos while sandboxed the audio and the video will both stutter this is most prevalent when the video playback is fullscreen if playback is done outside of the sandbox the quality is unaffected | 1 |
469,270 | 13,504,517,483 | IssuesEvent | 2020-09-13 18:24:10 | ixjf/MSIRGB | https://api.github.com/repos/ixjf/MSIRGB | closed | LEDs show colours reversed | bug enhancement priority: high | Since the B450 Tomahawk MAX has been recommended a lot for people who are building a system, it would be a good idea to have explicit support added for it. Right now, it will sometimes work (with inverted colours), cause the motherboard to cycle between colours even when you have every square the same colour or even do nothing at all. | 1.0 | LEDs show colours reversed - Since the B450 Tomahawk MAX has been recommended a lot for people who are building a system, it would be a good idea to have explicit support added for it. Right now, it will sometimes work (with inverted colours), cause the motherboard to cycle between colours even when you have every square the same colour or even do nothing at all. | non_build | leds show colours reversed since the tomahawk max has been recommended a lot for people who are building a system it would be a good idea to have explicit support added for it right now it will sometimes work with inverted colours cause the motherboard to cycle between colours even when you have every square the same colour or even do nothing at all | 0 |
53,589 | 13,183,362,176 | IssuesEvent | 2020-08-12 17:22:42 | bloom-housing/bloom | https://api.github.com/repos/bloom-housing/bloom | opened | Update backend/core auth tests to work without a database | Code Quality / DevOps / Build | `backend/core` auth tests are failing in CI for lack of a database. I guess we can provision one if need be, but I'm hoping there's a reasonable way to do it with a fixture so we can keep the CI configuration simpler for the time being.
Split off from #511 | 1.0 | Update backend/core auth tests to work without a database - `backend/core` auth tests are failing in CI for lack of a database. I guess we can provision one if need be, but I'm hoping there's a reasonable way to do it with a fixture so we can keep the CI configuration simpler for the time being.
Split off from #511 | build | update backend core auth tests to work without a database backend core auth tests are failing in ci for lack of a database i guess we can provision one if need be but i m hoping there s a reasonable way to do it with a fixture so we can keep the ci configuration simpler for the time being split off from | 1 |
15,950 | 3,489,829,487 | IssuesEvent | 2016-01-04 04:06:07 | Test-More/test-more | https://api.github.com/repos/Test-More/test-more | closed | RYBSKEJ/forks-0.36.tar.gz hangs with 1.302013_005 | Blocking Stable Bug Easy HasPatch Test-Stream | I have tried it with several perls and the result always was, with a downgrade to EXODIST/Test-Simple-1.001014.tar.gz I could test forks in under 3 minutes. With 1.302013_005 there was no progress for an hour. | 1.0 | RYBSKEJ/forks-0.36.tar.gz hangs with 1.302013_005 - I have tried it with several perls and the result always was, with a downgrade to EXODIST/Test-Simple-1.001014.tar.gz I could test forks in under 3 minutes. With 1.302013_005 there was no progress for an hour. | non_build | rybskej forks tar gz hangs with i have tried it with several perls and the result always was with a downgrade to exodist test simple tar gz i could test forks in under minutes with there was no progress for an hour | 0 |
49,581 | 12,378,949,347 | IssuesEvent | 2020-05-19 11:37:43 | woocommerce/woocommerce-gutenberg-products-block | https://api.github.com/repos/woocommerce/woocommerce-gutenberg-products-block | closed | Needs to consume all contexts. | type: build | https://github.com/woocommerce/woocommerce-gutenberg-products-block/blob/d74ef667af17e89d3a61d3e5c2d4519c06c33b51/assets/js/base/context/cart-checkout/checkout/processor/index.js#L45-L48
---
###### This issue was generated by [todo](https://todo.jasonet.co) based on a `todo` comment in d74ef667af17e89d3a61d3e5c2d4519c06c33b51 when #2384 was merged. cc @woocommerce. | 1.0 | Needs to consume all contexts. - https://github.com/woocommerce/woocommerce-gutenberg-products-block/blob/d74ef667af17e89d3a61d3e5c2d4519c06c33b51/assets/js/base/context/cart-checkout/checkout/processor/index.js#L45-L48
---
###### This issue was generated by [todo](https://todo.jasonet.co) based on a `todo` comment in d74ef667af17e89d3a61d3e5c2d4519c06c33b51 when #2384 was merged. cc @woocommerce. | build | needs to consume all contexts this issue was generated by based on a todo comment in when was merged cc woocommerce | 1 |
69,752 | 17,839,092,789 | IssuesEvent | 2021-09-03 07:40:37 | inmanta/inmanta-core | https://api.github.com/repos/inmanta/inmanta-core | opened | web-console clean_up_packages fails | build master task | The following indicates an error in `/clean_up_packages.js` that gets triggered during the cleanup phase of the [nightly builds](https://jenkins.inmanta.com/job/releases/job/npm/job/web-console-release/job/master/539/console):
```
$ node clean_up_packages
internal/modules/cjs/loader.js:1102
throw new ERR_REQUIRE_ESM(filename, parentPath, packageJsonPath);
^
Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: /home/jenkins/workspace/s_npm_web-console-release_master/web-console/node_modules/node-fetch/src/index.js
require() of ES modules is not supported.
require() of /home/jenkins/workspace/s_npm_web-console-release_master/web-console/node_modules/node-fetch/src/index.js from /home/jenkins/workspace/s_npm_web-console-release_master/web-console/clean_up_packages.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module" which defines all .js files in that package scope as ES modules.
Instead rename index.js to end in .cjs, change the requiring code to use import(), or remove "type": "module" from /home/jenkins/workspace/s_npm_web-console-release_master/web-console/node_modules/node-fetch/package.json.
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1102:13)
at Module.load (internal/modules/cjs/loader.js:950:32)
at Function.Module._load (internal/modules/cjs/loader.js:790:14)
at Module.require (internal/modules/cjs/loader.js:974:19)
at require (internal/modules/cjs/helpers.js:92:18)
at Object.<anonymous> (/home/jenkins/workspace/s_npm_web-console-release_master/web-console/clean_up_packages.js:1:16)
at Module._compile (internal/modules/cjs/loader.js:1085:14)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1114:10)
at Module.load (internal/modules/cjs/loader.js:950:32)
at Function.Module._load (internal/modules/cjs/loader.js:790:14) {
code: 'ERR_REQUIRE_ESM'
}
``` | 1.0 | web-console clean_up_packages fails - The following indicates an error in `/clean_up_packages.js` that gets triggered during the cleanup phase of the [nightly builds](https://jenkins.inmanta.com/job/releases/job/npm/job/web-console-release/job/master/539/console):
```
$ node clean_up_packages
internal/modules/cjs/loader.js:1102
throw new ERR_REQUIRE_ESM(filename, parentPath, packageJsonPath);
^
Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: /home/jenkins/workspace/s_npm_web-console-release_master/web-console/node_modules/node-fetch/src/index.js
require() of ES modules is not supported.
require() of /home/jenkins/workspace/s_npm_web-console-release_master/web-console/node_modules/node-fetch/src/index.js from /home/jenkins/workspace/s_npm_web-console-release_master/web-console/clean_up_packages.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module" which defines all .js files in that package scope as ES modules.
Instead rename index.js to end in .cjs, change the requiring code to use import(), or remove "type": "module" from /home/jenkins/workspace/s_npm_web-console-release_master/web-console/node_modules/node-fetch/package.json.
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1102:13)
at Module.load (internal/modules/cjs/loader.js:950:32)
at Function.Module._load (internal/modules/cjs/loader.js:790:14)
at Module.require (internal/modules/cjs/loader.js:974:19)
at require (internal/modules/cjs/helpers.js:92:18)
at Object.<anonymous> (/home/jenkins/workspace/s_npm_web-console-release_master/web-console/clean_up_packages.js:1:16)
at Module._compile (internal/modules/cjs/loader.js:1085:14)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1114:10)
at Module.load (internal/modules/cjs/loader.js:950:32)
at Function.Module._load (internal/modules/cjs/loader.js:790:14) {
code: 'ERR_REQUIRE_ESM'
}
``` | build | web console clean up packages fails the following indicates an error in clean up packages js that gets triggered during the cleanup phase of the node clean up packages internal modules cjs loader js throw new err require esm filename parentpath packagejsonpath error must use import to load es module home jenkins workspace s npm web console release master web console node modules node fetch src index js require of es modules is not supported require of home jenkins workspace s npm web console release master web console node modules node fetch src index js from home jenkins workspace s npm web console release master web console clean up packages js is an es module file as it is a js file whose nearest parent package json contains type module which defines all js files in that package scope as es modules instead rename index js to end in cjs change the requiring code to use import or remove type module from home jenkins workspace s npm web console release master web console node modules node fetch package json at object module extensions js internal modules cjs loader js at module load internal modules cjs loader js at function module load internal modules cjs loader js at module require internal modules cjs loader js at require internal modules cjs helpers js at object home jenkins workspace s npm web console release master web console clean up packages js at module compile internal modules cjs loader js at object module extensions js internal modules cjs loader js at module load internal modules cjs loader js at function module load internal modules cjs loader js code err require esm | 1 |
127,092 | 10,451,833,524 | IssuesEvent | 2019-09-19 13:37:28 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | roachtest: disk-stalled/log=false,data=false failed | C-test-failure O-roachtest O-robot | SHA: https://github.com/cockroachdb/cockroach/commits/c6342c90a7fa4ceb1b674faa47a95e1726d05e79
Parameters:
To repro, try:
```
# Don't forget to check out a clean suitable branch and experiment with the
# stress invocation until the desired results present themselves. For example,
# using stress instead of stressrace and passing the '-p' stressflag which
# controls concurrency.
./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh
cd ~/go/src/github.com/cockroachdb/cockroach && \
stdbuf -oL -eL \
make stressrace TESTS=disk-stalled/log=false,data=false PKG=roachtest TESTTIMEOUT=5m STRESSFLAGS='-maxtime 20m -timeout 10m' 2>&1 | tee /tmp/stress.log
```
Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=1496387&tab=artifacts#/disk-stalled/log=false,data=false
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20190919-1496387/disk-stalled/log=false_data=false/run_1
disk_stall.go:68,disk_stall.go:40,test_runner.go:689: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod install teamcity-1568869602-26-n1cpu4:1 charybdefs returned:
stderr:
stdout:
Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/p/pcre3/libpcre32-3_8.38-3.1_amd64.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/p/pkg-config/pkg-config_0.29.1-0ubuntu1_amd64.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/m/manpages/manpages-dev_4.04-2_all.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/p/python-setuptools/python-setuptools_20.7.0-1_all.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/o/ocl-icd/ocl-icd-libopencl1_2.2.8-1_amd64.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
Error: exit status 100
: exit status 1
``` | 2.0 | roachtest: disk-stalled/log=false,data=false failed - SHA: https://github.com/cockroachdb/cockroach/commits/c6342c90a7fa4ceb1b674faa47a95e1726d05e79
Parameters:
To repro, try:
```
# Don't forget to check out a clean suitable branch and experiment with the
# stress invocation until the desired results present themselves. For example,
# using stress instead of stressrace and passing the '-p' stressflag which
# controls concurrency.
./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh
cd ~/go/src/github.com/cockroachdb/cockroach && \
stdbuf -oL -eL \
make stressrace TESTS=disk-stalled/log=false,data=false PKG=roachtest TESTTIMEOUT=5m STRESSFLAGS='-maxtime 20m -timeout 10m' 2>&1 | tee /tmp/stress.log
```
Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=1496387&tab=artifacts#/disk-stalled/log=false,data=false
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20190919-1496387/disk-stalled/log=false_data=false/run_1
disk_stall.go:68,disk_stall.go:40,test_runner.go:689: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod install teamcity-1568869602-26-n1cpu4:1 charybdefs returned:
stderr:
stdout:
Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/p/pcre3/libpcre32-3_8.38-3.1_amd64.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/p/pkg-config/pkg-config_0.29.1-0ubuntu1_amd64.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/m/manpages/manpages-dev_4.04-2_all.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/p/python-setuptools/python-setuptools_20.7.0-1_all.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Failed to fetch http://us-central1.gce.archive.ubuntu.com/ubuntu/pool/main/o/ocl-icd/ocl-icd-libopencl1_2.2.8-1_amd64.deb 503 Service Unavailable [IP: 35.184.34.241 80]
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
Error: exit status 100
: exit status 1
``` | non_build | roachtest disk stalled log false data false failed sha parameters to repro try don t forget to check out a clean suitable branch and experiment with the stress invocation until the desired results present themselves for example using stress instead of stressrace and passing the p stressflag which controls concurrency scripts gceworker sh start scripts gceworker sh mosh cd go src github com cockroachdb cockroach stdbuf ol el make stressrace tests disk stalled log false data false pkg roachtest testtimeout stressflags maxtime timeout tee tmp stress log failed test the test failed on branch master cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts disk stalled log false data false run disk stall go disk stall go test runner go home agent work go src github com cockroachdb cockroach bin roachprod install teamcity charybdefs returned stderr stdout service unavailable e failed to fetch service unavailable e failed to fetch service unavailable e failed to fetch service unavailable e failed to fetch service unavailable e failed to fetch service unavailable e unable to fetch some archives maybe run apt get update or try with fix missing error exit status exit status | 0 |
68,708 | 21,790,255,071 | IssuesEvent | 2022-05-14 19:40:04 | primefaces/primereact | https://api.github.com/repos/primefaces/primereact | opened | Autocomplete: Virtual Scroller incompatible with arrow keys navigation | defect | ### Describe the bug
When navigating through an Autocomplete dropdown with arrow keys, the container fails to scroll when the highlighted item is no longer in view. The `onInputKeyDown` function inside the Autocomplete component uses `overlayRef`, which does not represent the virtual scroller container that should be scrolling, but its parent. See: [down arrow](https://github.com/primefaces/primereact/blob/7121122438de59a687fba079ffe267845a5fc1b6/components/lib/autocomplete/AutoComplete.js#L243) and [up arrow](https://github.com/primefaces/primereact/blob/7121122438de59a687fba079ffe267845a5fc1b6/components/lib/autocomplete/AutoComplete.js#L267)
In V7 I patched the component like this: `DomHandler.scrollInView(this.virtualScrollerRef.current ? this.virtualScrollerRef.current.element : this.overlayRef.current, nextElement);` (same for the down key) and it worked perfectly fine.
But in V8, the `useImperativeHandle` hook from the Virtual Scroller component [here](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/virtualscroller/VirtualScroller.js#L460) overwrites the default ref and the element can no longer be accessed with a `useRef` hook. See this issue: #
Note: The Virtual Scroller component has its own `scrollInView` function. It uses indexes instead of dom elements. [Here](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/virtualscroller/VirtualScroller.js#L54)
### Reproducer
https://codesandbox.io/s/primereact-test-forked-ieie3z?file=/src/index.js
### PrimeReact version
8.1.0
### React version
18.x
### Language
ALL
### Build / Runtime
Create React App (CRA)
### Browser(s)
_No response_
### Steps to reproduce the behavior
1. Create an Autocomplete component with virtualScrollerOptions.
2. Make the dropdown appear.
3. Navigate the dropdown with up and down arrow keys.
4. See no scrolling of the container.
### Expected behavior
I expect the highlighted item to stay in view when navigating with the arrow keys the dropdown of an Autocomplete component using a virtual scroller. | 1.0 | Autocomplete: Virtual Scroller incompatible with arrow keys navigation - ### Describe the bug
When navigating through an Autocomplete dropdown with arrow keys, the container fails to scroll when the highlighted item is no longer in view. The `onInputKeyDown` function inside the Autocomplete component uses `overlayRef`, which does not represent the virtual scroller container that should be scrolling, but its parent. See: [down arrow](https://github.com/primefaces/primereact/blob/7121122438de59a687fba079ffe267845a5fc1b6/components/lib/autocomplete/AutoComplete.js#L243) and [up arrow](https://github.com/primefaces/primereact/blob/7121122438de59a687fba079ffe267845a5fc1b6/components/lib/autocomplete/AutoComplete.js#L267)
In V7 I patched the component like this: `DomHandler.scrollInView(this.virtualScrollerRef.current ? this.virtualScrollerRef.current.element : this.overlayRef.current, nextElement);` (same for the down key) and it worked perfectly fine.
But in V8, the `useImperativeHandle` hook from the Virtual Scroller component [here](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/virtualscroller/VirtualScroller.js#L460) overwrites the default ref and the element can no longer be accessed with a `useRef` hook. See this issue: #
Note: The Virtual Scroller component has its own `scrollInView` function. It uses indexes instead of dom elements. [Here](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/virtualscroller/VirtualScroller.js#L54)
### Reproducer
https://codesandbox.io/s/primereact-test-forked-ieie3z?file=/src/index.js
### PrimeReact version
8.1.0
### React version
18.x
### Language
ALL
### Build / Runtime
Create React App (CRA)
### Browser(s)
_No response_
### Steps to reproduce the behavior
1. Create an Autocomplete component with virtualScrollerOptions.
2. Make the dropdown appear.
3. Navigate the dropdown with up and down arrow keys.
4. See no scrolling of the container.
### Expected behavior
I expect the highlighted item to stay in view when navigating with the arrow keys the dropdown of an Autocomplete component using a virtual scroller. | non_build | autocomplete virtual scroller incompatible with arrow keys navigation describe the bug when navigating through an autocomplete dropdown with arrow keys the container fails to scroll when the highlighted item is no longer in view the oninputkeydown function inside the autocomplete component uses overlayref which does not represent the virtual scroller container that should be scrolling but his parent see and in i patched the component like this domhandler scrollinview this virtualscrollerref current this virtualscrollerref current element this overlayref current nextelement same for the down key and it worked perfectly fine but in the useimperativehandle hook from the virtual scroller component overwrites the default ref and the element can no longer be accessed with a useref hook see this issue note the virtual scroller component has its own scrollinview function it uses indexes instead of dom elements reproducer primereact version react version x language all build runtime create react app cra browser s no response steps to reproduce the behavior create an autocomplete component with virtualscrolleroptions make the dropdown appear navigate the dropdown with up and down arrow keys see no scrolling of the container expected behavior i expect the highlighted item to stay in view when navigating with the arrow keys the dropdown of an autocomplete component using a virtual scroller | 0 |
231,092 | 17,661,945,820 | IssuesEvent | 2021-08-21 17:39:52 | GenericMappingTools/pygmt | https://api.github.com/repos/GenericMappingTools/pygmt | closed | Clarify resolutions of Earth relief grids requiring the region parameter | good first issue documentation help wanted eswn-workshop | **Description of the problem**
See [load_earth_relief](https://www.pygmt.org/latest/api/generated/pygmt.datasets.load_earth_relief.html#pygmt.datasets.load_earth_relief):
> region (str or list) – The subregion of the grid to load, in the forms of a list [xmin, xmax, ymin, ymax] or a string xmin/xmax/ymin/ymax. Required for Earth relief grids with resolutions higher than 5 arc-minute (i.e., 05m).
https://github.com/GenericMappingTools/pygmt/blob/d90b3fc889b53633deab6b4ab77612ac7a247c1b/pygmt/datasets/earth_relief.py#L94-L95
So, `Higher than 5 arc-minute` means "04m", "03m", "02m", "01m", "30s", "15s", "03s", and "01s"? But "05m" is actually high resolution relief data.
| 1.0 | Clarify resolutions of Earth relief grids requiring the region parameter - **Description of the problem**
See [load_earth_relief](https://www.pygmt.org/latest/api/generated/pygmt.datasets.load_earth_relief.html#pygmt.datasets.load_earth_relief):
> region (str or list) – The subregion of the grid to load, in the forms of a list [xmin, xmax, ymin, ymax] or a string xmin/xmax/ymin/ymax. Required for Earth relief grids with resolutions higher than 5 arc-minute (i.e., 05m).
https://github.com/GenericMappingTools/pygmt/blob/d90b3fc889b53633deab6b4ab77612ac7a247c1b/pygmt/datasets/earth_relief.py#L94-L95
So, `Higher than 5 arc-minute` means "04m", "03m", "02m", "01m", "30s", "15s", "03s", and "01s"? But "05m" is actually high resolution relief data.
| non_build | clarify resolutions of earth relief grids requiring the region parameter description of the problem see region str or list – the subregion of the grid to load in the forms of a list or a string xmin xmax ymin ymax required for earth relief grids with resolutions higher than arc minute i e so higher than arc minute means adn but is actually high resolution relief data | 0 |
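The question in the pygmt record above hinges on how "higher than 5 arc-minute" orders the resolution codes. As a standalone sketch (the resolution list is assumed from the pygmt documentation, and the helper name is hypothetical, not part of pygmt's API), the strict reading of the docstring can be made concrete:

```python
# Earth relief resolution codes, coarse -> fine (assumed from the pygmt
# docs for load_earth_relief; not quoted verbatim from this issue).
RESOLUTIONS = ["01d", "30m", "20m", "15m", "10m", "06m", "05m",
               "04m", "03m", "02m", "01m", "30s", "15s", "03s", "01s"]

def finer_than(res: str, threshold: str = "05m") -> bool:
    """Return True when `res` is a higher (finer) resolution than `threshold`.

    Under this strict reading, "higher than 5 arc-minute" covers "04m"
    through "01s" and excludes "05m" itself, which is exactly the reading
    the issue asks the documentation to confirm or correct.
    """
    return RESOLUTIONS.index(res) > RESOLUTIONS.index(threshold)

# The eight codes strictly finer than "05m":
print([r for r in RESOLUTIONS if finer_than(r)])
```

Whether "05m" itself should be on that list is precisely what the issue asks the docs to clarify.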
681 | 3,098,350,123 | IssuesEvent | 2015-08-28 10:22:38 | Yoast/wordpress-seo | https://api.github.com/repos/Yoast/wordpress-seo | closed | Conflict with QuickShare Plugin. | compatibility wait for feedback | Hello, I *love* Yoast SEO and I use it on all of my websites. But I also use the QuickShare plugin because it is a lightweight sharing plugin. I just found out that there is a conflict between Yoast SEO and Quickshare. On the sites where I've been using Yoast SEO and Quickshare for a while... when I update to the latest version of each site, there's no conflict. But I just tried adding the latest version of Yoast SEO to a new site --- along with Quickshare --- and the only way that I can get the Quickshare bar to display at the bottom of my posts is if I disable Yoast SEO. I believe that the problem lies with Yoast SEO because QuickShare hasn't changed in a while... and past versions of Yoast worked nicely with the same version of QuickShare. I wrote to Nick Haley at CelloExpressions.com to see if he can find the conflict. He responded that he also believes the problem is with Yoast. Can you please help? I use both Yoast SEO and QuickShare on all of my websites in order to keep my sites running fast while being shared and getting found. I rave about Yoast SEO to all of my clients and I really, really, really want to be able to continue to use it. Thanks so much in advance. | True | Conflict with QuickShare Plugin. - Hello, I *love* Yoast SEO and I use it on all of my websites. But I also use the QuickShare plugin because it is a lightweight sharing plugin. I just found out that there is a conflict between Yoast SEO and Quickshare. On the sites where I've been using Yoast SEO and Quickshare for a while... when I update to the latest version of each site, there's no conflict. 
But I just tried adding the latest version of Yoast SEO to a new site --- along with Quickshare --- and the only way that I can get the Quickshare bar to display at the bottom of my posts is if I disable Yoast SEO. I believe that the problem lies with Yoast SEO because QuickShare hasn't changed in a while... and past versions of Yoast worked nicely with the same version of QuickShare. I wrote to Nick Haley at CelloExpressions.com to see if he can find the conflict. He responded that he also believes the problem is with Yoast. Can you please help? I use both Yoast SEO and QuickShare on all of my websites in order to keep my sites running fast while being shared and getting found. I rave about Yoast SEO to all of my clients and I really, really, really want to be able to continue to use it. Thanks so much in advance. | non_build | conflict with quickshare plugin hello i love yoast seo and i use it on all of my websites but i also use the quickshare plugin because it is a lightweight sharing plugin i just found out that there is a conflict between yoast seo and quickshare on the sites where i ve been using yoast seo and quickshare for a while when i update to the latest version of each site there s no conflict but i just tried adding the latest version of yoast seo to a new site along with quickshare and the only way that i can get the quickshare bar to display at the bottom of my posts is if i disable yoast seo i believe that the problem lies with yoast seo because quickshare hasn t changed in a while and past versions of yoast worked nicely with the same version of quickshare i wrote to nick haley at celloexpressions com to see if he can find the conflict he responded that he also believes the problem is with yoast can you please help i use both yoast seo and quickshare on all of my websites in order to keep my sites running fast while being shared and getting found i rave about yoast seo to all of my clients and i really really really want to be able to continue to 
use it thanks so much in advance | 0 |
21,159 | 6,988,403,681 | IssuesEvent | 2017-12-14 12:49:43 | HypothesisWorks/hypothesis-python | https://api.github.com/repos/HypothesisWorks/hypothesis-python | closed | Have Travis auto-commit any changes required for 'check-format' | CI-and-build | While I’m stealing ideas from work, we might want to borrow https://github.com/wellcometrust/platform/pull/1255
Because Travis knows enough to auto-format code to pass "make check-format", and is already able to push to our repo, we should enhance it so that if it detects formatting issues on a PR, it auto-commits the necessary changes and pushes them to GitHub.
This saves the PR author a bit of work.
It also speeds up our builds – as soon as Travis pushes the new commit, it will auto-cancel the old build with the bad formatting. It reduces the amount of time it spends building patches that have a known and easily fixable problem. | 1.0 | Have Travis auto-commit any changes required for 'check-format' - While I’m stealing ideas from work, we might want to borrow https://github.com/wellcometrust/platform/pull/1255
Because Travis knows enough to auto-format code to pass "make check-format", and is already able to push to our repo, we should enhance it so that if it detects formatting issues on a PR, it auto-commits the necessary changes and pushes them to GitHub.
This saves the PR author a bit of work.
It also speeds up our builds – as soon as Travis pushes the new commit, it will auto-cancel the old build with the bad formatting. It reduces the amount of time it spends building patches that have a known and easily fixable problem. | build | have travis auto commit any changes required for check format while i’m stealing ideas from work we might want to borrow because travis know enough to auto format code to pass make check format and is already able to push to our repo we should enhance it so that if it detects formatting issues on a pr it auto commits the necessary changes and pushes them to github this saves the pr author a bit of work it also speeds up our builds – as soon as travis pushes the new commit it will auto cancel the old build with the bad formatting it reduces the amount of time it spends building patches that have a known and easily fixable problem | 1 |
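The workflow proposed in the record above (run the formatter, then commit and push only if files changed) can be sketched roughly as follows; the command names and the injectable `run` hook are hypothetical illustrations, not Hypothesis's actual CI scripts:

```python
import subprocess

def autoformat_and_push(branch: str, run=subprocess.run) -> bool:
    """Format the tree; if anything changed, commit and push it.

    Returns True when a formatting commit was pushed. `run` is injectable
    so the control flow can be exercised without a real git checkout.
    """
    run(["make", "format"], check=True)
    # `git diff --quiet` exits non-zero when the working tree is dirty.
    dirty = run(["git", "diff", "--quiet"]).returncode != 0
    if dirty:
        run(["git", "commit", "-am", "Auto-format code"], check=True)
        run(["git", "push", "origin", branch], check=True)
    return dirty
```

Pushing the fix-up commit is what lets CI auto-cancel the older build with the bad formatting, as the issue notes.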
208,365 | 23,598,284,888 | IssuesEvent | 2022-08-23 21:40:28 | occmundial/occ-atomic | https://api.github.com/repos/occmundial/occ-atomic | closed | CVE-2022-31051 (High) detected in semantic-release-18.0.0.tgz - autoclosed | security vulnerability | ## CVE-2022-31051 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semantic-release-18.0.0.tgz</b></p></summary>
<p>Automated semver compliant package publishing</p>
<p>Library home page: <a href="https://registry.npmjs.org/semantic-release/-/semantic-release-18.0.0.tgz">https://registry.npmjs.org/semantic-release/-/semantic-release-18.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/semantic-release/package.json</p>
<p>
Dependency Hierarchy:
- :x: **semantic-release-18.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/occmundial/occ-atomic/commit/c0bab5cb049fd3e17ceb2f5d6cccf05cd763cfd2">c0bab5cb049fd3e17ceb2f5d6cccf05cd763cfd2</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
semantic-release is an open source npm package for automated version management and package publishing. In affected versions secrets that would normally be masked by semantic-release can be accidentally disclosed if they contain characters that are excluded from uri encoding by `encodeURI`. Occurrence is further limited to execution contexts where push access to the related repository is not available without modifying the repository url to inject credentials. Users are advised to upgrade. Users unable to upgrade should ensure that secrets that do not contain characters that are excluded from encoding with `encodeURI` when included in a URL are already masked properly.
<p>Publish Date: 2022-06-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31051>CVE-2022-31051</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/semantic-release/semantic-release/security/advisories/GHSA-x2pg-mjhr-2m5x">https://github.com/semantic-release/semantic-release/security/advisories/GHSA-x2pg-mjhr-2m5x</a></p>
<p>Release Date: 2022-06-09</p>
<p>Fix Resolution: 19.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-31051 (High) detected in semantic-release-18.0.0.tgz - autoclosed - ## CVE-2022-31051 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semantic-release-18.0.0.tgz</b></p></summary>
<p>Automated semver compliant package publishing</p>
<p>Library home page: <a href="https://registry.npmjs.org/semantic-release/-/semantic-release-18.0.0.tgz">https://registry.npmjs.org/semantic-release/-/semantic-release-18.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/semantic-release/package.json</p>
<p>
Dependency Hierarchy:
- :x: **semantic-release-18.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/occmundial/occ-atomic/commit/c0bab5cb049fd3e17ceb2f5d6cccf05cd763cfd2">c0bab5cb049fd3e17ceb2f5d6cccf05cd763cfd2</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
semantic-release is an open source npm package for automated version management and package publishing. In affected versions secrets that would normally be masked by semantic-release can be accidentally disclosed if they contain characters that are excluded from uri encoding by `encodeURI`. Occurrence is further limited to execution contexts where push access to the related repository is not available without modifying the repository url to inject credentials. Users are advised to upgrade. Users unable to upgrade should ensure that secrets that do not contain characters that are excluded from encoding with `encodeURI` when included in a URL are already masked properly.
<p>Publish Date: 2022-06-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31051>CVE-2022-31051</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/semantic-release/semantic-release/security/advisories/GHSA-x2pg-mjhr-2m5x">https://github.com/semantic-release/semantic-release/security/advisories/GHSA-x2pg-mjhr-2m5x</a></p>
<p>Release Date: 2022-06-09</p>
<p>Fix Resolution: 19.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_build | cve high detected in semantic release tgz autoclosed cve high severity vulnerability vulnerable library semantic release tgz automated semver compliant package publishing library home page a href path to dependency file package json path to vulnerable library node modules semantic release package json dependency hierarchy x semantic release tgz vulnerable library found in head commit a href found in base branch main vulnerability details semantic release is an open source npm package for automated version management and package publishing in affected versions secrets that would normally be masked by semantic release can be accidentally disclosed if they contain characters that are excluded from uri encoding by encodeuri occurrence is further limited to execution contexts where push access to the related repository is not available without modifying the repository url to inject credentials users are advised to upgrade users unable to upgrade should ensure that secrets that do not contain characters that are excluded from encoding with encodeuri when included in a url are already masked properly publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
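The advisory in the record above centers on JavaScript's `encodeURI`, which leaves certain characters untouched when a secret is embedded in a repository URL. A rough Python approximation (the `safe` set below is an assumption based on the ECMAScript description of `encodeURI`; `quote` always passes ASCII letters, digits, and `_.-~` through on its own) shows which strings survive encoding unchanged:

```python
from urllib.parse import quote

# Characters encodeURI does not escape, beyond the letters, digits and
# "_.-~" that quote() always leaves alone (assumed from the ECMAScript spec).
ENCODE_URI_SAFE = "!#$&'()*+,/:;=?@"

def encode_uri_like(s: str) -> str:
    """Approximate JavaScript's encodeURI using urllib.parse.quote."""
    return quote(s, safe=ENCODE_URI_SAFE)

# Hypothetical tokens, not real credentials:
unchanged = "t0ken!(safe-chars)"   # built only from "safe" characters
altered = "t0ken with spaces"      # the spaces get percent-encoded
print(encode_uri_like(unchanged) == unchanged)
print(encode_uri_like(altered))
```

When the raw and encoded forms differ, masking that matches only one form can miss the other; see the linked GHSA advisory for the authoritative description of the fix.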
26,056 | 5,222,046,182 | IssuesEvent | 2017-01-27 05:42:37 | yajra/laravel-datatables | https://api.github.com/repos/yajra/laravel-datatables | closed | How to filter query using services? | documentation question | ### How to filter query using services?
I'm using services and they work fine, but I need to filter the query shown in the DataTable.
Example URL:
```
http://demo.com/provider/[name]
```
So I need to use [name] to create a filter in the query:
```php
public function query()
{
$query = Provider::select('id', 'name')->where('slug', [name]);
return $this->applyScopes($query);
}
```
How do I inject the parameter **[name]**?
### System details
- Ubuntu 14.04
- PHP 5.6
- Laravel 5.3
- Laravel-Datatables 6.0
| 1.0 | How to filter query using services? | non_build | 0
46,204 | 11,798,635,647 | IssuesEvent | 2020-03-18 14:41:53 | spring-cloud/spring-cloud-app-broker | https://api.github.com/repos/spring-cloud/spring-cloud-app-broker | opened | Move concourse pipelines to the public concourse | build chore | At the moment we have the pipelines ready to be shared with the world but they live in a concourse instance that is not available to the public anymore. | 1.0 | build | 1
47,217 | 13,056,060,349 | IssuesEvent | 2020-07-30 03:32:07 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | closed | new singleton implementation does not compile on RHEL4 (gcc 3.4.6)_ (Trac #151) | IceTray Migrated from Trac defect | Compilation of the trunk of offline software ends with an error (error
dump: see "additional information"). According to the error message an
attempt is being made by the Singleton implementation to use the
I3Factory() constructor. This happens both for I3ModuleFactory and for
I3ServiceFactory.
The error does not occur under RHEL5 (gcc 4.1.2).
Migrated from https://code.icecube.wisc.edu/ticket/151
```json
{
"status": "closed",
"changetime": "2009-01-08T14:51:17",
"description": "Compilation of the trunk of offline software ends with an error (error\ndump: see \"additional information\"). According to the error message an\nattempt is being made by the Singleton implementation to use the\nI3Factory() constructor. This happens both for I3ModuleFactory and for\nI3ServiceFactory.\n\nThe error does not occur under RHEL5 (gcc 4.1.2)",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"_ts": "1231426277000000",
"component": "IceTray",
"summary": "new singleton implementation does not compile on RHEL4 (gcc 3.4.6)_",
"priority": "normal",
"keywords": "",
"time": "2008-11-19T02:08:07",
"milestone": "",
"owner": "troy",
"type": "defect"
}
```
| 1.0 | new singleton implementation does not compile on RHEL4 (gcc 3.4.6)_ (Trac #151) | non_build | 0
50,541 | 12,519,965,755 | IssuesEvent | 2020-06-03 15:10:46 | processone/ejabberd | https://api.github.com/repos/processone/ejabberd | closed | Problems compiling ejabberd with Erlang/OTP 23 | Component:Build | ejabberd currently supports and is tested with Erlang/OTP versions 19.3 and 22.3.
This is a list of known problems compiling with 23.0, and where to find more information:
A) **eimp** and **sqlite3** complain about 'erl_interface.h' not found: https://github.com/processone/eimp/issues/12
:heavy_check_mark: Fixed in 1d7e29765e4eae3a5d6e3d594dfb2e9b4f1bd8af
B) Several other dependencies (cache_tab, fast_tls, stringprep,...) complain like this:
```
Compiling c_src/ets_cache.c
/usr/bin/ld: cannot find -lerl_interface
collect2: error: ld returned 1 exit status
ERROR: sh(cc c_src/ets_cache.o $LDFLAGS -shared
-L"/usr/lib/erlang/lib/erl_interface-4.0/lib"
-lerl_interface -lei -o priv/lib/ets_cache.so)
failed with return code 1 and the following output:
/usr/bin/ld: cannot find -lerl_interface
collect2: error: ld returned 1 exit status
```
:heavy_check_mark: Fixed in 21312c79aa1c1c319f2b1859184a5ef3126c7ac9
C) **jiffy** complains about `undef rebar_utils,get_cwd,...` see https://github.com/davisp/jiffy/issues/197
:heavy_check_mark: A workaround was committed in 2ca5712507473578fe00e4a1bce8e25a0d9f1bca
D) **epam** dependency uses erl_interface: https://github.com/processone/epam/issues/9
It uses several functions, including _erl_free, erl_decode, erl_encode, erl_term_len_, so that dependency would require a major update to use the ei_ equivalent functions. On the plus side, Erlang 19.3 already supports the newer ei_ API, so in theory it would be possible to update epam to support everything from Erlang [19.3 erl_interface](http://erlang.org/documentation/doc-8.3/lib/erl_interface-3.9.3/doc/html/index.html) to [23.0 ei](http://erlang.org/documentation/doc-11.0/lib/erl_interface-4.0/doc/html/index.html).
:heavy_check_mark: epam updated to work with Erlang/OTP 23: https://github.com/processone/epam/commit/dc6e01da958b8dd21bb6e261aa185066e427e6f2
E) **xref** reports a few warnings about deprecated functions: https://github.com/processone/ejabberd/issues/3284
:heavy_check_mark: ejabberd code fixed in c0f7008e96b8159367d025c61f5230685a8c5993
:heavy_check_mark: dependencies fixed in 482917348b0890cf656d7972abae0911d512debc | 1.0 | Problems compiling ejabberd with Erlang/OTP 23 | build | 1
173,033 | 27,373,878,961 | IssuesEvent | 2023-02-28 03:10:19 | momocus/sakazuki | https://api.github.com/repos/momocus/sakazuki | closed | "Open" and "Rate": too many buttons, want to consolidate them | ruby design | Related to issue #185.
Having both an "Open" button and a "Rate" button on the index page is clunky.
As Mogami-san pointed out, it now seems fine to open the bottle immediately on "Open" and then show a UI asking something like "Rate it?". | 1.0 | "Open" and "Rate": too many buttons, want to consolidate them | non_build | 0
108,955 | 23,685,476,070 | IssuesEvent | 2022-08-29 05:40:37 | sast-automation-dev/openmrs-core-20 | https://api.github.com/repos/sast-automation-dev/openmrs-core-20 | opened | Code Security Report: 23 high severity findings, 151 total findings | code security findings | # Code Security Report
**Latest Scan:** 2022-08-29 05:32am
**Total Findings:** 151
**Tested Project Files:** 821
**Detected Programming Languages:** 2
<!-- SAST-MANUAL-SCAN-START -->
- [ ] Check this box to manually trigger a scan
<!-- SAST-MANUAL-SCAN-END -->
## Language: Java
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-89](https://cwe.mitre.org/data/definitions/89.html)|SQL Injection|6|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-22](https://cwe.mitre.org/data/definitions/22.html)|Path/Directory Traversal|10|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-73](https://cwe.mitre.org/data/definitions/73.html)|File Manipulation|2|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-79](https://cwe.mitre.org/data/definitions/79.html)|Cross-Site Scripting|5|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-798](https://cwe.mitre.org/data/definitions/798.html)|Hardcoded Password/Credentials|21|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-611](https://cwe.mitre.org/data/definitions/611.html)|XML External Entity (XXE) Injection|6|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-338](https://cwe.mitre.org/data/definitions/338.html)|Weak Pseudo-Random|5|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-244](https://cwe.mitre.org/data/definitions/244.html)|Heap Inspection|40|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-501](https://cwe.mitre.org/data/definitions/501.html)|Trust Boundary Violation|2|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-209](https://cwe.mitre.org/data/definitions/209.html)|Error Messages Information Exposure|36|
|<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-434](https://cwe.mitre.org/data/definitions/434.html)|File Upload|1|
|<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-117](https://cwe.mitre.org/data/definitions/117.html)|Log Forging|15|
### Details
> The below list presents the 20 most relevant findings that need your attention. To view information on the remaining findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/60593651-a9e9-49f6-a8a5-103a67cd032f/details).
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>SQL Injection (CWE-89) : 6</summary>
#### Findings
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L86
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L94
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L98
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L105
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L106
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L115
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L117
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L103
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L104
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L107
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L96
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L101
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L108
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L109
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L116
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L103
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L104
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L107
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L89
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L97
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L102
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L110
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L118
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L103
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L104
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L107
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L96
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L101
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L108
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L109
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L116
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L87
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L92
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L95
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L86
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L94
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L98
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L105
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L106
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L115
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L117
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L87
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L92
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L95
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L89
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L97
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L102
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L110
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L118
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L87
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L92
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L95
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
</details>
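All six SQL injection findings above terminate in `util/DatabaseUtil.java:131`, where query text appears to be executed after string concatenation. As a hedged illustration of the standard CWE-89 remediation — not code from the scanned repository; the class, table, and column names below are hypothetical — the fix is to keep the SQL text constant and bind untrusted values as parameters:

```java
// Hypothetical sketch of the CWE-89 remediation pattern; names are
// illustrative and not taken from the scanned openmrs-core sources.
public class SqlInjectionFixSketch {

    // Vulnerable shape flagged by the report: untrusted input concatenated
    // directly into the SQL text.
    public static String unsafeQuery(String name) {
        return "SELECT encounter_type_id FROM encounter_type WHERE name = '" + name + "'";
    }

    // Remediated shape: the SQL text is a constant with a placeholder; the
    // value travels separately as a bind parameter.
    public static String safeQueryTemplate() {
        return "SELECT encounter_type_id FROM encounter_type WHERE name = ?";
    }

    public static void main(String[] args) {
        String payload = "x' OR '1'='1";
        // The payload rewrites the unsafe query's WHERE clause...
        System.out.println(unsafeQuery(payload).contains("OR '1'='1"));   // true
        // ...but can never appear inside the parameterized template.
        System.out.println(safeQueryTemplate().contains(payload));        // false
    }
}
```

With JDBC, the template would be passed to `Connection.prepareStatement` and the value bound via `PreparedStatement.setString`, so quote characters in the input can never alter the statement's structure.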
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Path/Directory Traversal (CWE-22) : 10</summary>
#### Findings
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L208
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L257
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>module/ModuleUtil.java:187</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/module/ModuleUtil.java#L182-L187
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/web/src/main/java/org/openmrs/web/filter/startuperror/StartupErrorFilter.java#L85
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/web/src/main/java/org/openmrs/web/filter/startuperror/StartupErrorFilter.java#L86
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/web/src/main/java/org/openmrs/web/filter/startuperror/StartupErrorFilter.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/module/ModuleUtil.java#L179
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/module/ModuleUtil.java#L187
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L317
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L230
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L144
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:996</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L991-L996
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L994
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L996
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1006</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1001-L1006
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L994
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1003
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1006
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:2041</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L2036-L2041
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L2038
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L2041
</details>
</details>
</details>
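The path traversal findings above involve file paths assembled from externally influenced input in `OpenmrsUtil` and `ModuleUtil`. A common CWE-22 remediation is a canonical-path containment check; the sketch below is hypothetical (base directory and method names are assumptions, not code from the scanned sources):

```java
import java.io.File;
import java.io.IOException;

// Hypothetical sketch of the canonical-path containment check used to
// remediate CWE-22; names are illustrative, not taken from OpenmrsUtil.
public class PathTraversalCheckSketch {

    // Returns the resolved file only if it remains inside baseDir after
    // canonicalization; "../" segments and symlinks are collapsed by
    // getCanonicalPath(), so escape attempts are detected reliably.
    public static File resolveInside(File baseDir, String userSuppliedName) {
        try {
            File candidate = new File(baseDir, userSuppliedName);
            String basePath = baseDir.getCanonicalPath() + File.separator;
            if (candidate.getCanonicalPath().startsWith(basePath)) {
                return candidate;
            }
        } catch (IOException e) {
            // Fall through: a path that cannot be canonicalized is rejected.
        }
        return null; // rejected
    }

    // String convenience wrapper.
    public static boolean isInside(String baseDir, String name) {
        return resolveInside(new File(baseDir), name) != null;
    }

    public static void main(String[] args) {
        System.out.println(isInside("/tmp/appdata", "messages_en.properties")); // true
        System.out.println(isInside("/tmp/appdata", "../../etc/passwd"));       // false
    }
}
```

Checking the canonicalized candidate against the canonicalized base (plus a trailing separator) closes both `../` traversal and sibling-prefix tricks such as `/tmp/appdata-evil`.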
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>File Manipulation (CWE-73) : 2</summary>
#### Findings
<details>
<summary>handler/BinaryDataHandler.java:119</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L114-L119
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L242
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L68
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L42
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L44
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L70
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L71
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L117
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L119
</details>
</details>
<details>
<summary>handler/TextHandler.java:136</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/TextHandler.java#L131-L136
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/TextHandler.java#L132
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/TextHandler.java#L136
</details>
</details>
</details>
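Both file manipulation findings sit in complex-obs handlers (`BinaryDataHandler`, `TextHandler`) where an externally influenced title feeds a file write. One common CWE-73 mitigation is to reduce such input to a single safe filename component before any `File` API sees it; the allowlist below is an assumption for illustration, not taken from the scanned handlers:

```java
// Hypothetical sketch of filename sanitization for CWE-73: an externally
// supplied title is reduced to a safe single path component. The character
// allowlist is an assumption, not code from the scanned handlers.
public class FilenameSanitizerSketch {

    public static String toSafeComponent(String title) {
        // Replace anything outside a conservative allowlist, so path
        // separators cannot redirect the write to another directory.
        String cleaned = title.replaceAll("[^A-Za-z0-9._-]", "_");
        // Neutralize leading dots so "." or ".." cannot survive as a name.
        return cleaned.replaceAll("^\\.+", "_");
    }

    public static void main(String[] args) {
        System.out.println(toSafeComponent("visit note 2022.txt")); // visit_note_2022.txt
        // Traversal payloads lose their separators entirely:
        System.out.println(toSafeComponent("../../etc/passwd").contains("/")); // false
    }
}
```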
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Cross-Site Scripting (CWE-79) : 2</summary>
#### Findings
<details>
<summary>util/DatabaseUpdater.java:745</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L740-L745
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L507
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L511
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L518
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L348
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L349
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L358
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L732
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L745
</details>
</details>
<details>
<summary>util/DatabaseUpdater.java:745</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L740-L745
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L584
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L592
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/ConceptName.java#L132
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/ConceptName.java#L137
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L143
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L184
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L425
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L427
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L354
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L355
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L358
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L732
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L745
</details>
</details>
</details>
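The XSS findings above trace concept-validation error messages into `DatabaseUpdater` output. The standard CWE-79 remediation is contextual output encoding of any untrusted text before it reaches a page. The escaper below is hand-rolled only to keep the sketch self-contained; production code should prefer a vetted library such as the OWASP Java Encoder:

```java
// Hypothetical sketch of HTML output encoding for CWE-79; hand-rolled only
// for self-containment, not a recommendation over vetted encoders.
public class HtmlEscapeSketch {

    public static String escapeHtml(String s) {
        StringBuilder out = new StringBuilder(s.length());
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            switch (c) {
                case '&':  out.append("&amp;");  break;
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#x27;"); break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String tainted = "<script>alert(1)</script>";
        System.out.println(escapeHtml(tainted)); // &lt;script&gt;alert(1)&lt;/script&gt;
    }
}
```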
## Language: JavaScript / Node.js
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-400](https://cwe.mitre.org/data/definitions/400.html)|Regex Denial of Service (ReDoS)|2|
### Details
> No high vulnerability findings detected. To view information on the remaining findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/60593651-a9e9-49f6-a8a5-103a67cd032f/details).
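For the two ReDoS findings in the JavaScript table above: CWE-400 regex denial of service typically stems from ambiguous, nested quantifiers that backtrack exponentially on crafted non-matching input. The sketch below uses a textbook pattern (not one taken from the scanned sources) and is written in Java for consistency with the other examples; the identical rewrite applies to JavaScript's `RegExp`:

```java
import java.util.regex.Pattern;

// Hypothetical sketch for CWE-400 (ReDoS). Nested quantifiers such as
// (a+)+ are ambiguous and backtrack exponentially on crafted non-matching
// input; the unambiguous a+ accepts the same strings in linear time.
public class RedosSketch {

    public static final Pattern VULNERABLE = Pattern.compile("^(a+)+$"); // ambiguous
    public static final Pattern SAFE       = Pattern.compile("^a+$");    // unambiguous

    public static void main(String[] args) {
        // Both patterns accept the same language on well-formed input:
        System.out.println(SAFE.matcher("aaaa").matches()); // true
        System.out.println(SAFE.matcher("aaab").matches()); // false
        // Matching a long "aaaa...b" string against VULNERABLE can take
        // exponential time, so the pathological case is deliberately not
        // executed here.
    }
}
```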
Code Security Report: 23 high severity findings, 151 total findings

# Code Security Report
**Latest Scan:** 2022-08-29 05:32am
**Total Findings:** 151
**Tested Project Files:** 821
**Detected Programming Languages:** 2
<!-- SAST-MANUAL-SCAN-START -->
- [ ] Check this box to manually trigger a scan
<!-- SAST-MANUAL-SCAN-END -->
## Language: Java
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-89](https://cwe.mitre.org/data/definitions/89.html)|SQL Injection|6|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-22](https://cwe.mitre.org/data/definitions/22.html)|Path/Directory Traversal|10|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-73](https://cwe.mitre.org/data/definitions/73.html)|File Manipulation|2|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-79](https://cwe.mitre.org/data/definitions/79.html)|Cross-Site Scripting|5|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-798](https://cwe.mitre.org/data/definitions/798.html)|Hardcoded Password/Credentials|21|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-611](https://cwe.mitre.org/data/definitions/611.html)|XML External Entity (XXE) Injection|6|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-338](https://cwe.mitre.org/data/definitions/338.html)|Weak Pseudo-Random|5|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-244](https://cwe.mitre.org/data/definitions/244.html)|Heap Inspection|40|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-501](https://cwe.mitre.org/data/definitions/501.html)|Trust Boundary Violation|2|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-209](https://cwe.mitre.org/data/definitions/209.html)|Error Messages Information Exposure|36|
|<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-434](https://cwe.mitre.org/data/definitions/434.html)|File Upload|1|
|<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-117](https://cwe.mitre.org/data/definitions/117.html)|Log Forging|15|
### Details
> The list below presents the 20 most relevant findings that need your attention. To view information on the remaining findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/60593651-a9e9-49f6-a8a5-103a67cd032f/details).
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>SQL Injection (CWE-89) : 6</summary>
#### Findings
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L86
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L94
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L98
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L105
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L106
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L115
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L117
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L103
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L104
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L107
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L96
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L101
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L108
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L109
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L116
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L103
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L104
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L107
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L89
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L97
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L102
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L110
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L118
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L103
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L104
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L107
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L96
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L101
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L108
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L109
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L116
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterTypeNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L87
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L92
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L95
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L86
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L94
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L98
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L105
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L106
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L115
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateLocationAttributeTypeNameChangeSet.java#L117
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L87
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L92
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L95
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
<details>
<summary>util/DatabaseUtil.java:131</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L126-L131
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L89
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L97
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L102
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L110
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L111
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L118
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L122
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/DuplicateEncounterRoleNameChangeSet.java#L124
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L87
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L92
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L95
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L128
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUtil.java#L131
</details>
</details>
</details>
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Path/Directory Traversal (CWE-22) : 10</summary>
#### Findings
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L208
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L257
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>module/ModuleUtil.java:187</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/module/ModuleUtil.java#L182-L187
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/web/src/main/java/org/openmrs/web/filter/startuperror/StartupErrorFilter.java#L85
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/web/src/main/java/org/openmrs/web/filter/startuperror/StartupErrorFilter.java#L86
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/web/src/main/java/org/openmrs/web/filter/startuperror/StartupErrorFilter.java#L88
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/module/ModuleUtil.java#L179
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/module/ModuleUtil.java#L187
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L317
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L230
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/messagesource/impl/MutableResourceBundleMessageSource.java#L144
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1664</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1659-L1664
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1662
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1664
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:996</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L991-L996
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L994
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L996
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:1006</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1001-L1006
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L994
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1003
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L1006
</details>
</details>
<details>
<summary>util/OpenmrsUtil.java:2041</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L2036-L2041
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L2038
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L2041
</details>
</details>
</details>
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>File Manipulation (CWE-73) : 2</summary>
#### Findings
<details>
<summary>handler/BinaryDataHandler.java:119</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L114-L119
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/OpenmrsUtil.java#L242
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L68
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L42
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L44
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L70
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/ComplexData.java#L71
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L117
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/BinaryDataHandler.java#L119
</details>
</details>
<details>
<summary>handler/TextHandler.java:136</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/TextHandler.java#L131-L136
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/TextHandler.java#L132
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/obs/handler/TextHandler.java#L136
</details>
</details>
</details>
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Cross-Site Scripting (CWE-79) : 2</summary>
#### Findings
<details>
<summary>util/DatabaseUpdater.java:745</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L740-L745
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L507
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L511
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L518
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L348
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L349
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L358
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L732
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L745
</details>
</details>
<details>
<summary>util/DatabaseUpdater.java:745</summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L740-L745
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L584
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L592
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/ConceptName.java#L132
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/ConceptName.java#L137
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L143
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L184
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L425
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L427
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L354
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L355
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/databasechange/ConceptValidatorChangeSet.java#L358
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L732
https://github.com/sast-automation-dev/openmrs-core-20/blob/8ac97f80e1be122a004a0eecebc194ea21a9bc1f/openmrs-core-20/api/src/main/java/org/openmrs/util/DatabaseUpdater.java#L745
</details>
</details>
</details>
## Language: JavaScript / Node.js
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-400](https://cwe.mitre.org/data/definitions/400.html)|Regex Denial of Service (ReDoS)|2|
### Details
> No high vulnerability findings detected. To view information on the remaining findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/60593651-a9e9-49f6-a8a5-103a67cd032f/details).
| non_build | code security report high severity findings total findings code security report latest scan total findings tested project files detected programming languages check this box to manually trigger a scan language java severity cwe vulnerability type count high injection high traversal high manipulation high scripting medium password credentials medium external entity xxe injection medium pseudo random medium inspection medium boundary violation medium messages information exposure low upload low forging details the below list presents the most relevant findings that need your attention to view information on the remaining findings navigate to the sql injection cwe findings util databaseutil java trace util databaseutil java trace util databaseutil java trace util databaseutil java trace util databaseutil java trace util databaseutil java trace path directory traversal cwe findings util openmrsutil java trace util openmrsutil java trace module moduleutil java trace util openmrsutil java trace util openmrsutil java trace util openmrsutil java trace util openmrsutil java trace util openmrsutil java trace util openmrsutil java trace util openmrsutil java trace file manipulation cwe findings handler binarydatahandler java trace handler texthandler java trace cross site scripting cwe findings util databaseupdater java trace util databaseupdater java trace language javascript node js severity cwe vulnerability type count medium denial of service redos details no high vulnerability findings detected to view information on the remaining findings navigate to the | 0 |
19,522 | 3,775,000,507 | IssuesEvent | 2016-03-17 11:41:33 | sass/libsass | https://api.github.com/repos/sass/libsass | closed | 3.3.3 Regression: Segfaults using @extend with unknown selector | Bug - Confirmed Bug - Should Error Dev - PR Ready Dev - Test Written Dev - WIP | There seems to be an issue with unfound selectors when using `@extend`. A segfault occurs when entering a selector which cannot be found elsewhere in the code. For example:
```scss
.z-depth-1 {
box-shadow: 0 2px 5px 0 rgba(0, 0, 0, 0.16), 0 2px 10px 0 rgba(0, 0, 0, 0.12);
}
.z-depth-1-half {
box-shadow: 0 5px 11px 0 rgba(0, 0, 0, 0.18), 0 4px 15px 0 rgba(0, 0, 0, 0.15);
}
.btn {
@extend .z-depth-1;
&:hover {
@extend .z-depth-half-1;
}
}
```
While I think that it should report an error (even though Ruby Sass does not), currently anything using libsass 3.3.3 crashes with a segfault. With libsass 3.3.2 it is compiled into:
```css
.z-depth-1, .btn {
box-shadow: 0 2px 5px 0 rgba(0, 0, 0, 0.16), 0 2px 10px 0 rgba(0, 0, 0, 0.12);
}
.z-depth-1-half {
box-shadow: 0 5px 11px 0 rgba(0, 0, 0, 0.18), 0 4px 15px 0 rgba(0, 0, 0, 0.15);
}
```
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/30845965-3-3-3-regression-segfaults-using-extend-with-unknown-selector?utm_campaign=plugin&utm_content=tracker%2F283068&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F283068&utm_medium=issues&utm_source=github).
</bountysource-plugin> | 1.0 | 3.3.3 Regression: Segfaults using @extend with unknown selector - There seems to be an issue with unfound selectors when using `@extend`. A segfault occurs when entering a selector which cannot be found elsewhere in the code. For example:
```scss
.z-depth-1 {
box-shadow: 0 2px 5px 0 rgba(0, 0, 0, 0.16), 0 2px 10px 0 rgba(0, 0, 0, 0.12);
}
.z-depth-1-half {
box-shadow: 0 5px 11px 0 rgba(0, 0, 0, 0.18), 0 4px 15px 0 rgba(0, 0, 0, 0.15);
}
.btn {
@extend .z-depth-1;
&:hover {
@extend .z-depth-half-1;
}
}
```
While I think that it should report an error (even though Ruby Sass does not), currently anything using libsass 3.3.3 crashes with a segfault. With libsass 3.3.2 it is compiled into:
```css
.z-depth-1, .btn {
box-shadow: 0 2px 5px 0 rgba(0, 0, 0, 0.16), 0 2px 10px 0 rgba(0, 0, 0, 0.12);
}
.z-depth-1-half {
box-shadow: 0 5px 11px 0 rgba(0, 0, 0, 0.18), 0 4px 15px 0 rgba(0, 0, 0, 0.15);
}
```
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/30845965-3-3-3-regression-segfaults-using-extend-with-unknown-selector?utm_campaign=plugin&utm_content=tracker%2F283068&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F283068&utm_medium=issues&utm_source=github).
</bountysource-plugin> | non_build | regression segfaults using extend with unknown selector there seems to be an issue with unfound selectors when using extend a segfault occurs when entering a selector which cannot be found elsewhere in the code for example scss z depth box shadow rgba rgba z depth half box shadow rgba rgba btn extend z depth hover extend z depth half while i think that it should report an error even though ruby sass does not currently anything using libsass crashes with a segfault using libsass it will be compiled into css z depth btn box shadow rgba rgba z depth half box shadow rgba rgba want to back this issue we accept bounties via | 0 |
33,582 | 9,186,929,073 | IssuesEvent | 2019-03-06 00:40:57 | aws-amplify/amplify-js | https://api.github.com/repos/aws-amplify/amplify-js | closed | Module not found: Can't resolve 'react-native' | Build bug review | Since upgrading to `@aws-amplify/auth` from `1.2.15` to `1.2.17`, I'm getting the following error from `eslint`:
```
./node_modules/@aws-amplify/auth/lib/Auth.js
Module not found: Can't resolve 'react-native' in '/opt/atlassian/pipelines/agent/build/node_modules/@aws-amplify/auth/lib'
``` | 1.0 | Module not found: Can't resolve 'react-native' - Since upgrading to `@aws-amplify/auth` from `1.2.15` to `1.2.17`, I'm getting the following error from `eslint`:
```
./node_modules/@aws-amplify/auth/lib/Auth.js
Module not found: Can't resolve 'react-native' in '/opt/atlassian/pipelines/agent/build/node_modules/@aws-amplify/auth/lib'
``` | build | module not found can t resolve react native since upgrading to aws amplify auth from to i m getting the following error from eslint node modules aws amplify auth lib auth js module not found can t resolve react native in opt atlassian pipelines agent build node modules aws amplify auth lib | 1 |
34,359 | 9,350,040,715 | IssuesEvent | 2019-04-01 00:49:24 | kubeflow/kubeflow | https://api.github.com/repos/kubeflow/kubeflow | closed | Trigger docker build images on post-submit | area/build-release priority/p2 | We'd like to trigger building our docker images on every post-submit.
In #666 we created a cron job to regularly run Argo workflows to build our docker images.
This issue goes one step further and triggers these workflows on each postsubmit.
We can use Prow to trigger jobs on post submit. The problem is securing this so that we don't give unvetted code access to our release artifacts (including staging).
Prow supports running the test pods in a separate [cluster](https://github.com/kubernetes/test-infra/blob/master/prow/getting_started.md#run-test-pods-in-a-different-namespace-or-a-different-cluster).
This works as follows
- In the Prow services cluster we have a secret with a map of cluster name to credentials to talk to that cluster (AKA a kubeconfig)
  - one of these clusters is "default"
- In your job you can optionally list a non-default cluster by setting "cluster: foo"; Prow will then schedule that job there instead of the default cluster
- For a sensitive cluster like the "security" cluster for the kubernetes-security org, we can have presubmit tests in test-infra that restrict which jobs may set that cluster, if need be, as well as allowing only the test-infra maintainers permission to approve configuration changes.
So I think we could do the following
- Set up a GCR repository for our postsubmit builds
  - This should be separate from our public release repository, for security reasons and also to avoid polluting that repository with lots of images
- Set up a separate cluster/GCP project from our release cluster that can push to staging but not public
- Configure Prow to trigger postsubmit jobs in this cluster
- Restrict prow jobs that can use this cluster to our post submit jobs.
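
A hedged sketch of what such a postsubmit entry could look like (the job name, cluster key, branch, image, and command below are placeholders I am assuming, not our actual configuration):

```yaml
# Hypothetical Prow postsubmit entry. "build-cluster" is assumed to be a key in
# the secret mapping cluster names to credentials; everything else is a placeholder.
postsubmits:
  kubeflow/kubeflow:
    - name: postsubmit-build-images
      cluster: build-cluster   # schedule this job's pod in the restricted build cluster
      branches:
        - master
      spec:
        containers:
          - image: gcr.io/example-releasing/worker:latest   # placeholder builder image
            command: ["./build_images.sh"]                  # placeholder build entrypoint
```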
| 1.0 | Trigger docker build images on post-submit - We'd like to trigger building our docker images on every post-submit.
In #666 we created a cron job to regularly run Argo workflows to build our docker images.
This issue goes one step further and triggers these workflows on each postsubmit.
We can use Prow to trigger jobs on post submit. The problem is securing this so that we don't give unvetted code access to our release artifacts (including staging).
Prow supports running the test pods in a separate [cluster](https://github.com/kubernetes/test-infra/blob/master/prow/getting_started.md#run-test-pods-in-a-different-namespace-or-a-different-cluster).
This works as follows
- In the prow services cluster we have a secret with a map of cluster name to credentials to talk to that cluster (AKA kubeconfig)
-- one of these clusters is "default"
- In your job you can optionally list a non-default cluster by setting "cluster: foo", prow will then schedule that job there instead of the default cluster
- for a sensitive cluster like the "security" cluster for kubernetes-security org we can have presubmit tests in test-infra that restrict which jobs can set that cluster, if need be, as well as only the test-infra maintainers having permission to approve configuration changes.
So I think we could do the following
- Setup a GCR repository for our postsubmit builds
- This should be separate from our public release repository for security reasons and also to avoid
polluting that repository with lots of images
- Setup a separate cluster/GCP project from our release cluster that can push to staging but not public
- Configure Prow to trigger postsubmits jobs in this cluster
- Restrict prow jobs that can use this cluster to our post submit jobs.
| build | trigger docker build images on post submit we d like to trigger building our docker images on every post submit in we created a cron job to regularly run argo workflows to build our docker images this issue goes one step further and triggers these workflows on each postsubmit we can use prow to trigger jobs on post submit the problem is securing this so that we don t give unvetted code access to our release artifacts including staging prow supports running the test pods in a separate this works as follows in the prow services cluster we have a secret with a map of cluster name to credentials to talk to that cluster aka kubeconfig one of these clusters is default in your job you can optionally list a non default cluster by setting cluster foo prow will then schedule that job there instead of the default cluster for a sensitive cluster like the security cluster for kubernetes security org we can have a presubmit tests in test infra that restricts which jobs set that cluster it if need be as well as only the test infra maintainers having permission to approve configuration changes so i think we could do the following setup a gcr repository for our postsubmit builds this should be separate from our public release repository for security reasons and also to avoid polluting that repository with lots of images setup a separate cluster gcp project from our release cluster that can push to staging but not public configure prow to trigger postsubmits jobs in this cluster restrict prow jobs that can use this cluster to our post submit jobs | 1 |
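The Prow routing described in the issue above — a job opting into a non-default cluster, with an allowlist guarding sensitive clusters — can be sketched as a small validation helper. The job and cluster names below are hypothetical, not taken from any real Kubeflow or test-infra config:

```python
# Sketch: decide which build cluster a Prow job config runs in.
# A job may set an optional "cluster" field; restricted clusters are only
# allowed for an explicit allowlist of postsubmit jobs (names are hypothetical).

RESTRICTED_CLUSTERS = {"release-build"}
ALLOWED_RESTRICTED_JOBS = {"kubeflow-postsubmit-build-images"}

def resolve_cluster(job):
    """Return the cluster a job runs in, defaulting to 'default'."""
    return job.get("cluster", "default")

def validate_job(job):
    """Reject jobs that claim a restricted cluster without being allowlisted."""
    cluster = resolve_cluster(job)
    if cluster in RESTRICTED_CLUSTERS and job["name"] not in ALLOWED_RESTRICTED_JOBS:
        raise ValueError(f"job {job['name']!r} may not use cluster {cluster!r}")
    return cluster
```

In a real setup this check would run as a presubmit over the Prow config repo, so unvetted jobs cannot quietly claim the release cluster.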
270,691 | 20,605,607,597 | IssuesEvent | 2022-03-06 22:58:07 | apj520/ENG1-Team-13-Assessment-2 | https://api.github.com/repos/apj520/ENG1-Team-13-Assessment-2 | opened | 5.a Implementation of github actions configuration and basic testing | documentation | Added git action config file in **ayman_CI_branch** which builds on the testing from **testing_roscoe** branch.
Config file is prone to future updates.
Pls check comments under testing java files before implementing new tests | 1.0 | 5.a Implementation of github actions configuration and basic testing - Added git action config file in **ayman_CI_branch** which builds on the testing from **testing_roscoe** branch.
Config file is prone to future updates.
Pls check comments under testing java files before implementing new tests | non_build | a implementation of github actions configuration and basic testing added git action config file in ayman ci branch which builds on the testing from testing roscoe branch config file is prone to future updates pls check comments under testing java files before implementing new tests | 0 |
37,360 | 9,995,340,310 | IssuesEvent | 2019-07-11 20:00:32 | pravega/flink-connectors | https://api.github.com/repos/pravega/flink-connectors | closed | Update Pravega version in r0.5 branch | area/build area/release-preparation version/0.5.0 | **Problem description**
Update Pravega version in `r0.5` branch to the latest commit `6f8a8203b807a4babbad54f8efe856f3fbef916d` and the corresponding snapshot version is `0.5.0-2269.6f8a820-SNAPSHOT`.
**Problem location**
Connector `r0.5` branch build configurations
**Suggestions for an improvement**
| 1.0 | Update Pravega version in r0.5 branch - **Problem description**
Update Pravega version in `r0.5` branch to the latest commit `6f8a8203b807a4babbad54f8efe856f3fbef916d` and the corresponding snapshot version is `0.5.0-2269.6f8a820-SNAPSHOT`.
**Problem location**
Connector `r0.5` branch build configurations
**Suggestions for an improvement**
| build | update pravega version in branch problem description update pravega version in branch to the latest commit and the corresponding snapshot version is snapshot problem location connector branch build configurations suggestions for an improvement | 1 |
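A version bump like the one requested above is often just a property rewrite in a Gradle-style properties file. The sketch below assumes a hypothetical `pravegaVersion` property (the property names and the second line are illustrative; the snapshot version string is the one named in the issue):

```python
import re

# Sketch: bump a pinned dependency version in a gradle.properties-style file,
# leaving all other properties untouched.

def bump_version(properties_text, key, new_version):
    """Replace the value of `key=...` with new_version, preserving other lines."""
    pattern = re.compile(rf"^({re.escape(key)}=).*$", re.MULTILINE)
    if not pattern.search(properties_text):
        raise KeyError(f"{key} not found")
    return pattern.sub(rf"\g<1>{new_version}", properties_text)

props = "pravegaVersion=0.5.0-2205.0000000-SNAPSHOT\nflinkVersion=1.7.2\n"
updated = bump_version(props, "pravegaVersion", "0.5.0-2269.6f8a820-SNAPSHOT")
```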
1,508 | 2,544,928,770 | IssuesEvent | 2015-01-29 14:04:23 | ipython/ipython | https://api.github.com/repos/ipython/ipython | opened | Dashboard : Number of selected items | design-review | I think a number of selected items near the buttons would be nice.
Or at least an indication of one/many. I tend to scroll and switch windows before going back to top to click on the action, and I'm not sure how many items are selected; or if something was selected from a previous time I was on the windows, and I did nothing. | 1.0 | Dashboard : Number of selected items - I think a number of selected items near the buttons would be nice.
Or at least an indication of one/many. I tend to scroll and switch windows before going back to top to click on the action, and I'm not sure how many items are selected; or if something was selected from a previous time I was on the windows, and I did nothing. | non_build | dashboard number of selected items i think a number of selected items near the buttons would be nice or at least a indication one many i tend to scroll and switch windows before going back to top to click on the action and i m not sure how many items are selected or if something was selected from a previous time i was on the windows and i did nothing | 0
59,456 | 14,592,802,807 | IssuesEvent | 2020-12-19 19:23:49 | NBCommunity/Issues | https://api.github.com/repos/NBCommunity/Issues | opened | Reference commands missing | ❗ High priority 🧟 Zombie Survival 🧰 Map Build | The following commands are missing from ZS and MB:
- **/reference [player]** used by Operator+ to reference a prospective Moderator applicant.
- **/references [player]** used by Operator+ to see a player's references.
- **/myreferences** used by players to see their references. | 1.0 | Reference commands missing - The following commands are missing from ZS and MB:
- **/reference [player]** used by Operator+ to reference a prospective Moderator applicant.
- **/references [player]** used by Operator+ to see a player's references.
- **/myreferences** used by players to see their references. | build | reference commands missing the following commands are missing from zs and mb reference used by operator to reference a prospective moderator applicant references used by operator to see a player s references myreferences used by players to see their references | 1 |
46,775 | 11,885,640,580 | IssuesEvent | 2020-03-27 20:02:37 | flutter/flutter | https://api.github.com/repos/flutter/flutter | closed | Unable to find included file "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig" | a: build p: tooling t: xcode tool waiting for customer response ⌺ platform-ios | Hi, I'm getting this error when building the Flutter app for iOS simulator. Here's the complete log
```bash
Launching lib/main_learner.dart on iPhone Xs in debug mode...
Running Xcode build...
Xcode build done. 24.1s
Failed to build iOS app
Error output from Xcode build:
↳
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0018:'CpResource Generated.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0025:'CpResource Release_instructor-live.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0023:'CpResource Debug_learner.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0026:'CpResource Release_learner-live.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
** BUILD FAILED **
```
```bash
Xcode's output:
↳
=== BUILD TARGET Runner OF PROJECT Runner WITH CONFIGURATION Debug-Learner ===
Debug_learner.xcconfig line 2: Unable to find included file "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig"
Debug_learner.xcconfig line 2: Unable to find included file "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig"
The use of Swift 3 @objc inference in Swift 4 mode is deprecated. Please address deprecated @objc inference warnings, test your code with “Use of deprecated Swift 3 @objc inference” logging enabled, and then disable inference by changing the "Swift 3 @objc Inference" build setting to "Default" for the "Runner" target.
=== BUILD TARGET Runner OF PROJECT Runner WITH CONFIGURATION Debug-Learner ===
error: /Users/arthuryessayan/FlutterProjects/mobile/ios/Pods/Target Support Files/Pods-Runner/Pods-Runner.release.xcconfig: No such file or directory
Could not build the application for the simulator.
Error launching application on iPhone Xs.
```
Output for flutter doctor -v
```bash
[✓] Flutter (Channel stable, v1.9.1+hotfix.6, on Mac OS X 10.15.1 19B88, locale en-AM)
• Flutter version 1.9.1+hotfix.6 at /Users/arthuryessayan/flutter
• Framework revision 68587a0916 (3 months ago), 2019-09-13 19:46:58 -0700
• Engine revision b863200c37
• Dart version 2.5.0
[✓] Android toolchain - develop for Android devices (Android SDK version 29.0.2)
• Android SDK at /Users/arthuryessayan/Library/Android/sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-29, build-tools 29.0.2
• Java binary at: /Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b49-5587405)
• All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 11.2.1)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Xcode 11.2.1, Build version 11B500
• CocoaPods version 1.8.4
[✓] Android Studio (version 3.5)
• Android Studio at /Applications/Android Studio.app/Contents
• Flutter plugin version 42.0.1
• Dart plugin version 191.8593
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b49-5587405)
[✓] VS Code (version 1.40.1)
• VS Code at /Applications/Visual Studio Code.app/Contents
• Flutter extension version 3.4.1
[✓] Connected device (3 available)
• AOSP on IA Emulator • emulator-5554 • android-x86 • Android 9 (API 28) (emulator)
• Arthur’s iPhone • 5b4664bd78df5d4cfccc682d6916bbxxxxxxxxxx • ios • iOS 12.4.1
• iPhone Xs • 18659A3D-4493-47A9-A9AF-0970B35C0235 • ios • com.apple.CoreSimulator.SimRuntime.iOS-12-2 (simulator)
• No issues found!
```
I also tried to move the .plist files in the "google" folder to "Runner" folder, but still no luck
<img width="720" alt="Screen Shot 2019-12-09 at 21 53 22" src="https://user-images.githubusercontent.com/4170315/70459821-6b0fbc80-1ace-11ea-8453-40bd25acbe1c.png">
| 1.0 | Unable to find included file "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig" - Hi, I'm getting this error when building the Flutter app for iOS simulator. Here's the complete log
```bash
Launching lib/main_learner.dart on iPhone Xs in debug mode...
Running Xcode build...
Xcode build done. 24.1s
Failed to build iOS app
Error output from Xcode build:
↳
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0018:'CpResource Generated.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0025:'CpResource Release_instructor-live.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0023:'CpResource Debug_learner.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
2019-12-09 21:44:32.774 xcodebuild[37612:6496433] DVTAssertions: Warning in /Library/Caches/com.apple.xbs/Sources/IDEXcode3ProjectSupport/IDEXcode3ProjectSupport-15508/Xcode3Core/LegacyProjects/Frameworks/DevToolsCore/DevToolsCore/BuildSystem/Runtime/PBXTargetBuildContext.mm:759
Details: unexpected successful exit code from cancelled command <C0026:'CpResource Release_learner-live.xcconfig':P14>
Object: <PBXTargetBuildContext: 0x7fbf3a459b80>
Method: -createCommandInvocationRecordFromInvocation:
Thread: <NSThread: 0x7fbf3bc57e40>{number = 23, name = (null)}
Please file a bug at https://feedbackassistant.apple.com with this warning message and any useful information you can provide.
** BUILD FAILED **
```
```bash
Xcode's output:
↳
=== BUILD TARGET Runner OF PROJECT Runner WITH CONFIGURATION Debug-Learner ===
Debug_learner.xcconfig line 2: Unable to find included file "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig"
Debug_learner.xcconfig line 2: Unable to find included file "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig"
The use of Swift 3 @objc inference in Swift 4 mode is deprecated. Please address deprecated @objc inference warnings, test your code with “Use of deprecated Swift 3 @objc inference” logging enabled, and then disable inference by changing the "Swift 3 @objc Inference" build setting to "Default" for the "Runner" target.
=== BUILD TARGET Runner OF PROJECT Runner WITH CONFIGURATION Debug-Learner ===
error: /Users/arthuryessayan/FlutterProjects/mobile/ios/Pods/Target Support Files/Pods-Runner/Pods-Runner.release.xcconfig: No such file or directory
Could not build the application for the simulator.
Error launching application on iPhone Xs.
```
Output for flutter doctor -v
```bash
[✓] Flutter (Channel stable, v1.9.1+hotfix.6, on Mac OS X 10.15.1 19B88, locale en-AM)
• Flutter version 1.9.1+hotfix.6 at /Users/arthuryessayan/flutter
• Framework revision 68587a0916 (3 months ago), 2019-09-13 19:46:58 -0700
• Engine revision b863200c37
• Dart version 2.5.0
[✓] Android toolchain - develop for Android devices (Android SDK version 29.0.2)
• Android SDK at /Users/arthuryessayan/Library/Android/sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-29, build-tools 29.0.2
• Java binary at: /Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b49-5587405)
• All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 11.2.1)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Xcode 11.2.1, Build version 11B500
• CocoaPods version 1.8.4
[✓] Android Studio (version 3.5)
• Android Studio at /Applications/Android Studio.app/Contents
• Flutter plugin version 42.0.1
• Dart plugin version 191.8593
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b49-5587405)
[✓] VS Code (version 1.40.1)
• VS Code at /Applications/Visual Studio Code.app/Contents
• Flutter extension version 3.4.1
[✓] Connected device (3 available)
• AOSP on IA Emulator • emulator-5554 • android-x86 • Android 9 (API 28) (emulator)
• Arthur’s iPhone • 5b4664bd78df5d4cfccc682d6916bbxxxxxxxxxx • ios • iOS 12.4.1
• iPhone Xs • 18659A3D-4493-47A9-A9AF-0970B35C0235 • ios • com.apple.CoreSimulator.SimRuntime.iOS-12-2 (simulator)
• No issues found!
```
I also tried to move the .plist files in the "google" folder to "Runner" folder, but still no luck
<img width="720" alt="Screen Shot 2019-12-09 at 21 53 22" src="https://user-images.githubusercontent.com/4170315/70459821-6b0fbc80-1ace-11ea-8453-40bd25acbe1c.png">
| build | unable to find included file pods target support files pods runner pods runner debug xcconfig hi i m getting this error when building the flutter app for ios simulator here s the complete log bash launching lib main learner dart on iphone xs in debug mode running xcode build xcode build done failed to build ios app error output from xcode build ↳ xcodebuild dvtassertions warning in library caches com apple xbs sources legacyprojects frameworks devtoolscore devtoolscore buildsystem runtime pbxtargetbuildcontext mm details unexpected successful exit code from cancelled command object method createcommandinvocationrecordfrominvocation thread number name null please file a bug at with this warning message and any useful information you can provide xcodebuild dvtassertions warning in library caches com apple xbs sources legacyprojects frameworks devtoolscore devtoolscore buildsystem runtime pbxtargetbuildcontext mm details unexpected successful exit code from cancelled command object method createcommandinvocationrecordfrominvocation thread number name null please file a bug at with this warning message and any useful information you can provide xcodebuild dvtassertions warning in library caches com apple xbs sources legacyprojects frameworks devtoolscore devtoolscore buildsystem runtime pbxtargetbuildcontext mm details unexpected successful exit code from cancelled command object method createcommandinvocationrecordfrominvocation thread number name null please file a bug at with this warning message and any useful information you can provide xcodebuild dvtassertions warning in library caches com apple xbs sources legacyprojects frameworks devtoolscore devtoolscore buildsystem runtime pbxtargetbuildcontext mm details unexpected successful exit code from cancelled command object method createcommandinvocationrecordfrominvocation thread number name null please file a bug at with this warning message and any useful information you can provide build failed bash 
xcode s output ↳ build target runner of project runner with configuration debug learner debug learner xcconfig line unable to find included file pods target support files pods runner pods runner debug xcconfig debug learner xcconfig line unable to find included file pods target support files pods runner pods runner debug xcconfig the use of swift objc inference in swift mode is deprecated please address deprecated objc inference warnings test your code with “use of deprecated swift objc inference” logging enabled and then disable inference by changing the swift objc inference build setting to default for the runner target build target runner of project runner with configuration debug learner error users arthuryessayan flutterprojects mobile ios pods target support files pods runner pods runner release xcconfig no such file or directory could not build the application for the simulator error launching application on iphone xs output for flutter doctor v bash flutter channel stable hotfix on mac os x locale en am • flutter version hotfix at users arthuryessayan flutter • framework revision months ago • engine revision • dart version android toolchain develop for android devices android sdk version • android sdk at users arthuryessayan library android sdk • android ndk location not configured optional useful for native profiling support • platform android build tools • java binary at applications android studio app contents jre jdk contents home bin java • java version openjdk runtime environment build release • all android licenses accepted xcode develop for ios and macos xcode • xcode at applications xcode app contents developer • xcode build version • cocoapods version android studio version • android studio at applications android studio app contents • flutter plugin version • dart plugin version • java version openjdk runtime environment build release vs code version • vs code at applications visual studio code app contents • flutter extension version connected 
device available • aosp on ia emulator • emulator • android • android api emulator • arthur’s iphone • • ios • ios • iphone xs • • ios • com apple coresimulator simruntime ios simulator • no issues found i also tried to move the plist files in the google folder to runner folder but still no luck img width alt screen shot at src | 1 |
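The `Unable to find included file` lines in the log above come from `#include` directives in an .xcconfig pointing at CocoaPods support files that have not been generated yet (i.e. before `pod install` runs). Below is a simplified sketch of that resolution check — the file names mirror the log, but the parsing is only a rough approximation of real xcconfig semantics:

```python
import re

# Sketch: list the files an .xcconfig pulls in via `#include "..."` and
# report which are missing from a set of files known to exist.

INCLUDE_RE = re.compile(r'#include\s+"([^"]+)"')

def missing_includes(xcconfig_text, existing_files):
    """Return include targets that are not in the set of files on disk."""
    wanted = INCLUDE_RE.findall(xcconfig_text)
    return [path for path in wanted if path not in existing_files]

debug_learner = '''
#include "Generated.xcconfig"
#include "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig"
'''
on_disk = {"Generated.xcconfig"}  # state before CocoaPods has run
```

Once the Pods support file is generated, the missing list goes empty — which matches why re-running `pod install` (or `flutter build ios`, which triggers it) usually clears this error.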
86,275 | 8,030,542,291 | IssuesEvent | 2018-07-27 20:03:33 | owncloud/client | https://api.github.com/repos/owncloud/client | reopened | Re-shares not shown in client UI | PR available ReadyToTest Sharing p2-high | ### Expected behaviour
Owner of a share should also see recipients of re-shares, same as in web UI.
### Actual behaviour
Re-shares not shown in client UI.
### Steps to reproduce
1. User A shares folder with User B
2. User B shares received folder with User C
### Server UI
| User A | User B | User C |
|---|---|---|
|  |  |  |
### Client UI
| User A | User B | User C |
|---|---|---|
| <img width="443" alt="owncloud_sharing_and_ownclouds_and_mstingl_oc_and_rocket_chat__and_inbox__8_677_messages_" src="https://user-images.githubusercontent.com/214010/43006671-96b71cca-8c36-11e8-842d-bc8869de41da.png"> | <img width="446" alt="owncloud_sharing_and_ownclouds_and_mstingl_oc_and_rocket_chat__and_inbox__8_677_messages_" src="https://user-images.githubusercontent.com/214010/43006726-be7d73a8-8c36-11e8-80b6-9f6990b714bc.png"> | <img width="406" alt="owncloud_sharing_and_ownclouds_and_mstingl_oc_and_rocket_chat__and_inbox__8_677_messages_" src="https://user-images.githubusercontent.com/214010/43006765-db95c382-8c36-11e8-81ca-b1e67b37d3cb.png"> |
### Server configuration
ownCloud version:
10.0.9
### Client configuration
Client version:
2.4.2
Operating system:
macOS 10.13
OS language:
EN
Qt version used by client package (Linux only, see also Settings dialog):
Qt 5.6.2
## Solution
PR from @jnweiger https://github.com/owncloud/client/pull/6665 | 1.0 | Re-shares not shown in client UI - ### Expected behaviour
Owner of a share should also see recipients of re-shares, same as in web UI.
### Actual behaviour
Re-shares not shown in client UI.
### Steps to reproduce
1. User A shares folder with User B
2. User B shares received folder with User C
### Server UI
| User A | User B | User C |
|---|---|---|
|  |  |  |
### Client UI
| User A | User B | User C |
|---|---|---|
| <img width="443" alt="owncloud_sharing_and_ownclouds_and_mstingl_oc_and_rocket_chat__and_inbox__8_677_messages_" src="https://user-images.githubusercontent.com/214010/43006671-96b71cca-8c36-11e8-842d-bc8869de41da.png"> | <img width="446" alt="owncloud_sharing_and_ownclouds_and_mstingl_oc_and_rocket_chat__and_inbox__8_677_messages_" src="https://user-images.githubusercontent.com/214010/43006726-be7d73a8-8c36-11e8-80b6-9f6990b714bc.png"> | <img width="406" alt="owncloud_sharing_and_ownclouds_and_mstingl_oc_and_rocket_chat__and_inbox__8_677_messages_" src="https://user-images.githubusercontent.com/214010/43006765-db95c382-8c36-11e8-81ca-b1e67b37d3cb.png"> |
### Server configuration
ownCloud version:
10.0.9
### Client configuration
Client version:
2.4.2
Operating system:
macOS 10.13
OS language:
EN
Qt version used by client package (Linux only, see also Settings dialog):
Qt 5.6.2
## Solution
PR from @jnweiger https://github.com/owncloud/client/pull/6665 | non_build | re shares not shown in client ui expected behaviour owner of a share should also see recipients of re shares same as in web ui actual behaviour re shares not shown in client ui steps to reproduce user a shares folder with user b user b shares received folder with user c server ui user a user b user c client ui user a user b user c img width alt owncloud sharing and ownclouds and mstingl oc and rocket chat and inbox messages src img width alt owncloud sharing and ownclouds and mstingl oc and rocket chat and inbox messages src img width alt owncloud sharing and ownclouds and mstingl oc and rocket chat and inbox messages src server configuration owncloud version client configuration client version operating system macos os language en qt version used by client package linux only see also settings dialog qt solution pr from jnweiger | 0 |
44,618 | 11,472,034,807 | IssuesEvent | 2020-02-09 15:00:00 | apache/celix | https://api.github.com/repos/apache/celix | closed | Update build environment to newer ubuntu LTS | build/environment kind/improvement | The current travis env uses ubuntu 14.04. This should be moved to a newer ubuntu LTS (18.04)
Also enable some test from celix_context_bundles_tests.cc
| 1.0 | Update build environment to newer ubuntu LTS - The current travis env uses ubuntu 14.04. This should be moved to a newer ubuntu LTS (18.04)
Also enable some test from celix_context_bundles_tests.cc
| build | update build environment to newer ubuntu lts the current travis env uses ubuntu this should be moved to a newer ubuntu lts also enable some test from celix context bundles tests cc | 1 |
94,172 | 27,134,955,206 | IssuesEvent | 2023-02-16 12:33:54 | UCLH-Foundry/FlowEHR | https://api.github.com/repos/UCLH-Foundry/FlowEHR | opened | Deploy per-app infrastructure | Building Clinical Product Feature | The code defining the individual app infrastructure (such as deploying individual Apps on an app service, creating a new Cosmos DB collection for state storage, and so on), as well as the setup of gold store connectivity, will be defined within the FlowEHR repo (in `/serve`). More info in [design doc](https://github.com/UCLH-Foundry/Book-of-FlowEHR/pull/148/files)
## Acceptance criteria
- `make apps` or similar will deploy all hosting services required for running a sample containerised application with DB connectivity
- There will be no manual sharing of connection strings for the app to consume
- All required app configuration will be available to the container via App Service config injection
## Tasks
- [ ] New repository created within Serve App Container Registry
- [ ] App Service with custom container enabled, with auto-push from ACR registry
- [ ] Cosmos DB database provisioned
- [ ] MSI connectivity to Feature store
- [ ] MSI connectivity to Cosmos DB Database | 1.0 | Deploy per-app infrastructure - The code defining the individual app infrastructure (such as deploying individual Apps on an app service, creating a new Cosmos DB collection for state storage, and so on), as well as the setup of gold store connectivity, will be defined within the FlowEHR repo (in `/serve`). More info in [design doc](https://github.com/UCLH-Foundry/Book-of-FlowEHR/pull/148/files)
## Acceptance criteria
- `make apps` or similar will deploy all hosting services required for running a sample containerised application with DB connectivity
- There will be no manual sharing of connection strings for the app to consume
- All required app configuration will be available to the container via App Service config injection
## Tasks
- [ ] New repository created within Serve App Container Registry
- [ ] App Service with custom container enabled, with auto-push from ACR registry
- [ ] Cosmos DB database provisioned
- [ ] MSI connectivity to Feature store
- [ ] MSI connectivity to Cosmos DB Database | build | deploy per app infrastructure create the code defining the individual app infrastructure such as deploying individual apps on an app service creating a new cosmos db collection for state storage and so on as well as setting up gold store connectivity will be defined within the flowehr repo in serve more info in acceptance criteria make apps or similar will deploy all hosting services required for running a sample containerised application with db connectivity there will be no manual sharing of connection strings for the app to consume all required app configuration will be available to the container via app service config injection tasks new repository created within serve app container registry app service with custom container enabled with auto push from acr registry cosmos db database provisioned msi connectivity to feature store msi connectivity to cosmos db database | 1 |
61,332 | 14,970,660,881 | IssuesEvent | 2021-01-27 19:57:07 | awslabs/amazon-kinesis-video-streams-producer-sdk-cpp | https://api.github.com/repos/awslabs/amazon-kinesis-video-streams-producer-sdk-cpp | closed | [QUESTION] How do I build on Mac OS without building dependencies? | build documentation macOS question | I am trying to build the SDK and example on MacOS 10.15.7 without building the dependencies, but have run into issues.
I have run the following:
```
brew install pkg-config openssl cmake gstreamer gst-plugins-base gst-plugins-good gst-plugins-bad gst-plugins-ugly log4cplus gst-libav
```
```
cmake .. -DBUILD_GSTREAMER_PLUGIN=ON -DBUILD_DEPENDENCIES=OFF
```
^ this `cmake` fails with the following error:
```
...
-- Found PkgConfig: /usr/local/bin/pkg-config (found version "0.29.2")
-- Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the system variable OPENSSL_ROOT_DIR (missing: OPENSSL_INCLUDE_DIR)
CMake Error at dependency/libkvscproducer/kvscproducer-src/CMakeLists.txt:118 (message):
OpenSSL is not found. Make sure to export PKG_CONFIG_PATH to where
OpenSSL's pc file is
-- Configuring incomplete, errors occurred!
See also "/Users/mrl/workspaces/amazon-kinesis-video-streams-producer-sdk-cpp/build/CMakeFiles/CMakeOutput.log".
```
So I ran the following command which I found in the console output when brew installed openssl (note that I am using the fish shell):
```
set -gx PKG_CONFIG_PATH "/usr/local/opt/openssl@1.1/lib/pkgconfig"
```
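For anyone on bash or zsh rather than fish, the equivalent export (using the same Homebrew default path as above, which may differ on other machines) would be:

```sh
# bash/zsh equivalent of the fish command above; the path is Homebrew's
# default location for openssl@1.1 and may differ on other setups
export PKG_CONFIG_PATH="/usr/local/opt/openssl@1.1/lib/pkgconfig"
echo "$PKG_CONFIG_PATH"
```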
I was then able to successfully re-run `cmake`. **I think that the documentation or `cmake` configuration could use some enhancement to help avoid this issue.**
When I then ran `make`, I got the following error and I am not sure how to get around it:
```
...
[ 79%] Linking C shared library libcproducer.dylib
ld: cannot link directly with dylib/framework, your binary is not an allowed client of /usr/lib/libcrypto.dylib for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[2]: *** [dependency/libkvscproducer/kvscproducer-src/libcproducer.dylib] Error 1
make[1]: *** [dependency/libkvscproducer/kvscproducer-src/CMakeFiles/cproducer.dir/all] Error 2
make: *** [all] Error 2
```
**How can I resolve this issue? And what can be done to prevent others from running into this in the future?** | 1.0 | build | 1
258,408 | 19,559,752,945 | IssuesEvent | 2022-01-03 14:43:11 | RTUITLab/ITTV | https://api.github.com/repos/RTUITLab/ITTV | closed | Write requirements | documentation | Describe in the README the SDKs that need to be installed to get started: the Kinect SDK, Visual Studio and its components, and so on. | 1.0 | non_build | 0
11,916 | 5,108,204,628 | IssuesEvent | 2017-01-05 17:00:51 | semihalf-berestovskyy-andriy/test2 | https://api.github.com/repos/semihalf-berestovskyy-andriy/test2 | closed | Tools - Sikuli | build & test enhancement | Note: the issue was imported automatically from Bugzilla with bugzilla2issues.py tool
# Bugzilla Bug ID: 23
Date: 2015-06-03 08:46:37 +0200
From: Bogdan Pricope <bogdan.pricope@enea.com>
To: Sorin Vultureanu <sorin.vultureanu@enea.com>
CC: jose.pekkarinen@nokia.com
Last updated: 2015-11-24 11:53:50 +0100
## Bugzilla Comment ID: 28
Date: 2015-06-03 08:46:37 +0200
From: Bogdan Pricope <bogdan.pricope@enea.com>
Tools - Sikuli
## Bugzilla Comment ID: 131
Date: 2015-11-24 11:39:14 +0100
From: José Pekkarinen <jose.pekkarinen@nokia.com>
I think this should be closed unless anyone is interested in working in the integration of this tool.
## Bugzilla Comment ID: 137
Date: 2015-11-24 11:53:50 +0100
From: Sorin Vultureanu <sorin.vultureanu@enea.com>
This tool has not come up for discussion since the start of the incubation project.
It can be closed, as this is not used in ODP and we don't plan to use it.
| 1.0 | build | 1
51,848 | 12,819,384,937 | IssuesEvent | 2020-07-06 01:55:58 | ballerina-platform/ballerina-lang | https://api.github.com/repos/ballerina-platform/ballerina-lang | closed | Constraint of the map not shown in function API docs | Area/BuildTools Component/Docerina Type/Bug Type/Docs | 
In the above it just says `map env`. It's missing the constraint of the `env` map. | 1.0 | build | 1
13,160 | 5,306,314,584 | IssuesEvent | 2017-02-11 00:26:02 | rust-lang/rust | https://api.github.com/repos/rust-lang/rust | closed | tests.mk/runtest.rs splits RUSTFLAGS/host_rustcflags,target_rustcflags inappropriately | A-build | edit: **TL;DR:** tests.mk and runtest.rs respectively split `RUSTFLAGS` and `{host,target}_rustcflags` inappropriately; reproduce this in rust upstream (1.8 and earlier) by:
- test.mk bug: set `RUSTFLAGS="-C link-args=\"-Wl,-Bsymbolic-functions -Wl,-z,relro\""` when building tests
- runtest.rs bug: set `RUSTFLAGS="-C link-args='-Wl,-Bsymbolic-functions -Wl,-z,relro'"` when running tests.
**These are two separate and independent bugs.**
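To illustrate the bug class (this is a sketch of naive whitespace splitting, not the actual tests.mk/runtest.rs code): splitting the flags string on every space tears the quoted `-C link-args` value apart, while quote-aware re-parsing keeps it as a single argument:

```sh
# sketch only: compare naive whitespace splitting with quote-aware
# parsing of a RUSTFLAGS-style value containing a quoted link-args
flags="-C link-args='-Wl,-Bsymbolic-functions -Wl,-z,relro'"

# naive: the space inside the single quotes becomes a separator
naive_count=$(printf '%s\n' $flags | wc -l | tr -d ' ')

# quote-aware: the shell re-parses the quotes, keeping link-args whole
eval "set -- $flags"
aware_count=$#

echo "naive=$naive_count aware=$aware_count"   # prints: naive=3 aware=2
```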
---
Hi, I'm having a hard time debugging a build failure and was wondering if anyone could help. We have multiple identical build failures for Ubuntu across different versions of rustc e.g. see [1.5 on amd64-xenial](https://launchpadlibrarian.net/232314146/buildlog_ubuntu-xenial-amd64.rustc_1.5.0+dfsg1-1_BUILDING.txt.gz), [1.8-nightly-20160209 on i386-xenial](https://launchpadlibrarian.net/237257461/buildlog_ubuntu-xenial-i386.rustc_1.8.0~~nightly.20160209+dfsg1-1_BUILDING.txt.gz), [1.8-nightly-20160209 on amd64-xenial](https://launchpadlibrarian.net/237383512/buildlog_ubuntu-xenial-amd64.rustc_1.8.0~~nightly.20160209+dfsg1-1_BUILDING.txt.gz). The packages are based on the [working](https://buildd.debian.org/status/package.php?p=rustc) Debian packages.
The failure looks related to the command line being run; however they are very similar on Debian and Ubuntu (newlines manually inserted for clarity):
```
--- debian (from my local testing)
+++ ubuntu (from the 1.8 amd64-xenial link above)
@@ -1,41 +1,41 @@
LD_LIBRARY_PATH=
/«BUILDDIR»/rustc-1.8.0~~nightly.20160209+dfsg1/x86_64-unknown-linux-gnu/stage2/lib:
/usr/lib/llvm-3.7/lib:
$LD_LIBRARY_PATH
x86_64-unknown-linux-gnu/stage2/bin/compiletest
--compile-lib-path x86_64-unknown-linux-gnu/stage2/lib
--run-lib-path x86_64-unknown-linux-gnu/stage2/lib/rustlib/x86_64-unknown-linux-gnu/lib
--rustc-path x86_64-unknown-linux-gnu/stage2/bin/rustc
--rustdoc-path x86_64-unknown-linux-gnu/stage2/bin/rustdoc
--llvm-bin-path /usr/lib/llvm-3.7/bin
--aux-base /«BUILDDIR»/rustc-1.8.0~~nightly.20160209+dfsg1/src/test/auxiliary/
--stage-id stage2-x86_64-unknown-linux-gnu
--target x86_64-unknown-linux-gnu
--host x86_64-unknown-linux-gnu
--python "/usr/bin/python2.7"
- --gdb-version="GNU gdb (Debian 7.10-1+b1) 7.10"
+ --gdb-version="GNU gdb (Ubuntu 7.10.1-0ubuntu1) 7.10.1"
--lldb-version=""
--android-cross-path=/opt/ndk_standalone
--adb-path=
--adb-test-dir=
- --host-rustcflags " -C link-args="-Wl,-z,relro" --cfg rtopt -C rpath -O -L x86_64-unknown-linux-gnu/rt"
+ --host-rustcflags " -C link-args="-Wl,-Bsymbolic-functions -Wl,-z,relro" --cfg rtopt -C rpath -O -L x86_64-unknown-linux-gnu/rt"
--lldb-python-dir=
- --target-rustcflags " -C link-args="-Wl,-z,relro" --cfg rtopt -C rpath -O -L x86_64-unknown-linux-gnu/rt"
+ --target-rustcflags " -C link-args="-Wl,-Bsymbolic-functions -Wl,-z,relro" --cfg rtopt -C rpath -O -L x86_64-unknown-linux-gnu/rt"
--verbose
--valgrind-path ""/usr/bin/valgrind"
--error-exitcode=100
--fair-sched=try
--quiet
--soname-synonyms=somalloc=NONE
--suppressions=/«BUILDDIR»/rustc-1.8.0~~nightly.20160209+dfsg1/src/etc/x86.supp
--tool=memcheck
--leak-check=full"
--force-valgrind
--src-base /«BUILDDIR»/rustc-1.8.0~~nightly.20160209+dfsg1/src/test/rustdoc/
--build-base x86_64-unknown-linux-gnu/test/rustdoc/
--mode rustdoc
--logfile tmp/check-stage2-T-x86_64-unknown-linux-gnu-H-x86_64-unknown-linux-gnu-rustdocck.log &&
touch -r
tmp/check-stage2-T-x86_64-unknown-linux-gnu-H-x86_64-unknown-linux-gnu-rustdocck.ok.start_time
tmp/check-stage2-T-x86_64-unknown-linux-gnu-H-x86_64-unknown-linux-gnu-rustdocck.ok &&
rm tmp/check-stage2-T-x86_64-unknown-linux-gnu-H-x86_64-unknown-linux-gnu-rustdocck.ok.start_time
```
The failure on Ubuntu looks like this:
```
run rustdocck [i686-unknown-linux-gnu]: i686-unknown-linux-gnu/stage2/bin/compiletest
thread '<main>' panicked at 'UnrecognizedOption("W")', /«BUILDDIR»/rustc-1.8.0~~nightly.20160209+dfsg1/src/compiletest/compiletest.rs:101
stack backtrace:
1: 0xf737b154 - sys::backtrace::tracing::imp::write::h59a57150de078a41Btu
2: 0xf7383891 - panicking::default_handler::_$u7b$$u7b$closure$u7d$$u7d$::closure.43171
3: 0xf7383412 - panicking::default_handler::he7b491197c6c99803Wy
4: 0xf7347e85 - sys_common::unwind::begin_unwind_inner::h0ac7a886eb1f5e41qit
5: 0xf7348937 - sys_common::unwind::begin_unwind_fmt::h5539aae7d4657c99wht
6: 0xf76e23cd - parse_config::hd63e903e447ce232uCd
7: 0xf76da65c - main::hcc56979a8470e836GBd
8: 0xf7382e8a - sys_common::unwind::try::try_fn::h1480467219455099781
9: 0xf7378a57 - __rust_try
10: 0xf7382b5c - rt::lang_start::hb58bd81a27354c059Oy
11: 0xf76e635d - main
12: 0xf710271d - __libc_start_main
13: 0xf76d9d10 - <unknown>
/«BUILDDIR»/rustc-1.8.0~~nightly.20160209+dfsg1/mk/tests.mk:743: recipe for target 'tmp/check-stage2-T-i686-unknown-linux-gnu-H-i686-unknown-linux-gnu-rustdocck.ok' failed
make[2]: *** [tmp/check-stage2-T-i686-unknown-linux-gnu-H-i686-unknown-linux-gnu-rustdocck.ok] Error 101
```
| 1.0 | build | 1
98,012 | 29,180,862,790 | IssuesEvent | 2023-05-19 11:47:32 | atc0005/brick | https://api.github.com/repos/atc0005/brick | closed | Makefile: Refresh recipes to add "standard" set, new package-related options | enhancement dependencies builds packages | ## Overview
Update/sync this project's Makefile against recent changes to the atc0005/check-cert project Makefile to provide the same functionality.
## TODO
- arch and OS-specific builds
- `release-build`
- used to generate just the binaries provided by a specific project (e.g., skip Windows (all), skip Linux x86)
- `links` (and arch-specific variations)
- used to generate download links for easier bulk retrieval of release assets
- `quick`
- `go build` without any custom settings applied
- it is a *quick* build to test/prototype binaries without waiting for release-level optimizations to be applied
- `depsinstall`
- install build related dependencies
- package generation
- `packages-stable`
- `packages-dev`
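For illustration, an arch/OS-specific `release-build` recipe like the one above essentially expands to a loop over GOOS/GOARCH pairs; a dry-run sketch (binary names and output paths here are made up, not taken from the actual check-cert Makefile):

```sh
# dry-run sketch of an OS/arch matrix release build; it only prints the
# go build invocations such a recipe could generate (illustrative names)
cmds=$(for os in linux windows darwin; do
  echo "GOOS=$os GOARCH=amd64 go build -o release/app-$os-amd64 ./cmd/..."
done)
echo "$cmds"
```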
| 1.0 | build | 1
7,054 | 3,934,163,290 | IssuesEvent | 2016-04-25 21:36:23 | jens-maus/yam | https://api.github.com/repos/jens-maus/yam | closed | 'Download messages larger than' value doesn't get saved (stays 1024) | #major @undecided bug Configuration fixed nightly build | **Originally by _javierdlr@euskalnet.net_ on 2012-07-13 00:06:01 +0200**
___
## Summary
MSG_CO_DOWNLOAD_LARGE_MAILS1
_Download messages larger than
value is 1024KB, if I change it to 128KB and restart YAM it doesn't get saved in config.
## Steps to reproduce
1. Change 'Download messages larger than' value to 128
2. Save settings
## Expected results
128KB
## Actual results
Always 1024KB
## Regression
## Notes | 1.0 | build | 1
89,619 | 25,856,374,537 | IssuesEvent | 2022-12-13 14:03:36 | IBM-Blockchain/blockchain-vscode-extension | https://api.github.com/repos/IBM-Blockchain/blockchain-vscode-extension | closed | Improve tests | help wanted wontfix test build | ## Description
The tests are currently fairly unreliable in the merge builds and the vscode-test extension has also changed... https://code.visualstudio.com/api/working-with-extensions/testing-extension#migrating-from-vscode
## Possible Fix
- Migrate tests to [@vscode/test-electron](https://github.com/microsoft/vscode-test) from the the old `vscode` tests
- Fix flakes | 1.0 | build | 1
40,558 | 10,549,748,713 | IssuesEvent | 2019-10-03 09:26:38 | godotengine/godot | https://api.github.com/repos/godotengine/godot | closed | Many compilation warnings | bug topic:buildsystem | **Godot version:**
Latest master #7b64a24
**OS version:**
MacOSX Mojave
Scons v3.0.1
**Issue description:**
There are many compilation warnings, for example:
```
thirdparty/bullet/LinearMath/btVector3.h:335:7: warning: argument value 10880 is outside the valid range [0, 255] [-Wargument-outside-range]
y = bt_splat_ps(y, 0x80);
^~~~~~~~~~~~~~~~~~~~
thirdparty/bullet/LinearMath/btVector3.h:43:29: note: expanded from macro 'bt_splat_ps'
#define bt_splat_ps(_a, _i) bt_pshufd_ps((_a), BT_SHUFFLE(_i, _i, _i, _i))
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
thirdparty/bullet/LinearMath/btVector3.h:41:33: note: expanded from macro 'bt_pshufd_ps'
#define bt_pshufd_ps(_a, _mask) _mm_shuffle_ps((_a), (_a), (_mask))
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
```
**Steps to reproduce:**
Try to compile the source.
| 1.0 | build | 1
39,008 | 10,276,823,116 | IssuesEvent | 2019-08-24 21:04:52 | BotBuilderCommunity/botbuilder-community-js | https://api.github.com/repos/BotBuilderCommunity/botbuilder-community-js | closed | Middleware structure and name changes | build question | As referenced in #70 and originally discussed in #64, we need to break apart "collective" packages to give users individual choice, but also have an overall "suite" install still in place. While looking into this for my recent Watson integration, I realized that we ended middleware packages with "middleware", and likely put that in the wrong place.
As per my PR, we probably want to do something like this:
* botbuilder-middleware
* botbuilder-middleware-engines
* botbuilder-middleware-text-analytics
* botbuilder-middleware-watson-nlu
...which would mean changing something like "botbuilder-text-analytics-middleware" to "botbuilder-middleware-text-analytics". This is _better_ in my opinion because of tab completion and search.
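The rename itself is mechanical; as a throwaway illustration of the old-to-new mapping (plain shell string manipulation, not an actual migration script):

```sh
# derive the proposed new name from an old "*-middleware" name
# (string manipulation only; the names come from the proposal above)
old="botbuilder-text-analytics-middleware"
new="botbuilder-middleware-${old#botbuilder-}"   # move the middleware prefix forward
new="${new%-middleware}"                         # drop the trailing old suffix
echo "$new"   # prints: botbuilder-middleware-text-analytics
```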
If implemented, however, it would require changes to the folder structure and NPM package (deprecating old ones) for the above, as well as the spell check and recognizers middleware to bring everything into alignment. | 1.0 | Middleware structure and name changes - As referenced in #70 and originally discussed in #64, we need to break apart "collective" packages to give users individual choice, but also have an overall "suite" install still in place. While looking into this for my recent Watson integration, I realized that we ended middleware packages with "middleware", and likely put that in the wrong place.
As per my PR, we probably want to do something like this:
* botbuilder-middleware
* botbuilder-middleware-engines
* botbuilder-middleware-text-analytics
* botbuilder-middleware-watson-nlu
...which would mean changing something like "botbuilder-text-analytics-middleware" to "botbuilder-middleware-text-analytics". This is _better_ in my opinion because of tab completion and search.
If implemented, however, it would require changes to the folder structure and NPM package (deprecating old ones) for the above, as well as the spell check and recognizers middleware to bring everything into alignment. | build | middleware structure and name changes as referenced in and originally discussed in we need to break apart collective packages to give users individual choice but also have an overall suite install still in place while looking into this for my recent watson integration i realized that we ended middleware packages with middleware and likely put that in the wrong place as per my pr we probably want to do something like this botbuilder middleware botbuilder middleware engines botbuilder middleware text analytics botbuilder middleware watson nlu which would mean changing something like botbuilder text analytics middleware to botbuilder middleware text analytics this is better in my opinion because of tab completion and search if implemented however it would require changes to the folder structure and npm package deprecating old ones for the above as well as the spell check and recognizers middleware to bring everything into alignment | 1 |
9,525 | 13,508,682,829 | IssuesEvent | 2020-09-14 08:06:08 | hajke-gu/text-mod | https://api.github.com/repos/hajke-gu/text-mod | opened | The text visualization must work with real-time and external data. | Requirement 8 | # Requirement 8
## Description
"The text visualization must work with real-time and external data."
## Acceptance criteria
- Acceptance criteria 1
- Acceptance criteria 2
| 1.0 | The text visualization must work with real-time and external data. - # Requirement 8
## Description
"The text visualization must work with real-time and external data."
## Acceptance criteria
- Acceptance criteria 1
- Acceptance criteria 2
| non_build | the text visualization must work with real time and external data requirement description the text visualization must work with real time and external data acceptance criteria acceptance criteria acceptance criteria | 0 |
157,799 | 6,016,007,257 | IssuesEvent | 2017-06-07 05:03:12 | lightertu/Groupify | https://api.github.com/repos/lightertu/Groupify | closed | New Participants Group Assignments after group capacity is updated | Activities Backend bug Medium Priority | we need to consider the fact that, say, when the user changed the group size from 10 to 5, some groups will disappear, and we need to handle this case. | 1.0 | New Participants Group Assignments after group capacity is updated - we need to consider the fact that, say, when the user changed the group size from 10 to 5, some groups will disappear, and we need to handle this case. | non_build | new participants group assignments after group capacity is updated we need to consider the fact that say when the user changed the group size from to some groups will disappear and we need to handle this case | 0 |
238,401 | 26,099,722,490 | IssuesEvent | 2022-12-27 04:41:30 | nidhi7598/expat_python | https://api.github.com/repos/nidhi7598/expat_python | opened | CVE-2021-46143 (High) detected in buffalo-gplexpat-2.1.0 | security vulnerability | ## CVE-2021-46143 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>buffalo-gplexpat-2.1.0</b></p></summary>
<p>
<p>Educational Linux Distribution</p>
<p>Library home page: <a href=https://sourceforge.net/projects/buffalo-gpl/>https://sourceforge.net/projects/buffalo-gpl/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nidhi7598/expat_python/commit/9e3040c46c4afb4e320ca14b4008d7f2701f0776">9e3040c46c4afb4e320ca14b4008d7f2701f0776</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/lib/xmlparse.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In doProlog in xmlparse.c in Expat (aka libexpat) before 2.4.3, an integer overflow exists for m_groupSize.
<p>Publish Date: 2022-01-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-46143>CVE-2021-46143</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-46143">https://nvd.nist.gov/vuln/detail/CVE-2021-46143</a></p>
<p>Release Date: 2022-01-06</p>
<p>Fix Resolution: expat - 2.2.6-2+deb10u2,2.2.10-2+deb11u1,2.2.0-2+deb9u4,2.4.3-1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-46143 (High) detected in buffalo-gplexpat-2.1.0 - ## CVE-2021-46143 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>buffalo-gplexpat-2.1.0</b></p></summary>
<p>
<p>Educational Linux Distribution</p>
<p>Library home page: <a href=https://sourceforge.net/projects/buffalo-gpl/>https://sourceforge.net/projects/buffalo-gpl/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nidhi7598/expat_python/commit/9e3040c46c4afb4e320ca14b4008d7f2701f0776">9e3040c46c4afb4e320ca14b4008d7f2701f0776</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/lib/xmlparse.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In doProlog in xmlparse.c in Expat (aka libexpat) before 2.4.3, an integer overflow exists for m_groupSize.
<p>Publish Date: 2022-01-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-46143>CVE-2021-46143</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-46143">https://nvd.nist.gov/vuln/detail/CVE-2021-46143</a></p>
<p>Release Date: 2022-01-06</p>
<p>Fix Resolution: expat - 2.2.6-2+deb10u2,2.2.10-2+deb11u1,2.2.0-2+deb9u4,2.4.3-1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_build | cve high detected in buffalo gplexpat cve high severity vulnerability vulnerable library buffalo gplexpat educational linux distribution library home page a href found in head commit a href found in base branch master vulnerable source files lib xmlparse c vulnerability details in doprolog in xmlparse c in expat aka libexpat before an integer overflow exists for m groupsize publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution expat step up your open source security game with mend | 0 |
53,683 | 13,194,724,654 | IssuesEvent | 2020-08-13 17:20:20 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | closed | Typo in Taskill in reaper service for Windows | :Core/Infra/Build >bug Team:Core/Infra | There is a typo on this line of the reaper service in the build infrastructure:
https://github.com/elastic/elasticsearch/blob/a72760e55befd53c4be049712a8cca9690a34bc2/buildSrc/src/main/java/org/elasticsearch/gradle/ReaperService.java#L62
`Taskill` should be `Taskkill`.
Obviously it's trivial to fix the typo but I don't understand what effect this will have on Windows test runs. It would be good if somebody who understands how the class is used could assess the impact of not running the correct command in the past and switching to running the correct command in the future. | 1.0 | Typo in Taskill in reaper service for Windows - There is a typo on this line of the reaper service in the build infrastructure:
https://github.com/elastic/elasticsearch/blob/a72760e55befd53c4be049712a8cca9690a34bc2/buildSrc/src/main/java/org/elasticsearch/gradle/ReaperService.java#L62
`Taskill` should be `Taskkill`.
Obviously it's trivial to fix the typo but I don't understand what effect this will have on Windows test runs. It would be good if somebody who understands how the class is used could assess the impact of not running the correct command in the past and switching to running the correct command in the future. | build | typo in taskill in reaper service for windows there is a typo on this line of the reaper service in the build infrastructure taskill should be taskkill obviously it s trivial to fix the typo but i don t understand what effect this will have on windows test runs it would be good if somebody who understands how the class is used could assess the impact of not running the correct command in the past and switching to running the correct command in the future | 1 |
92,881 | 26,793,000,501 | IssuesEvent | 2023-02-01 09:53:10 | CosmosOS/Cosmos | https://api.github.com/repos/CosmosOS/Cosmos | closed | Cosmos is broken on macOS | Area: Visual Studio Integration Area: Build | **Have you checked Github Issues for similar errors?**
Yes. Didn't find anything
**Exception**
error MSB6003: Impossible d'exécuter la tâche exécutable spécifiée "IL2CPU". System.ComponentModel.Win32Exception (8): An error occurred trying to start process '/opt/cosmos/Build/IL2CPU/IL2CPU' with working directory '/Users/raphaelm/Projects/OpenKernel/OpenKernel'. Exec format error
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.ForkAndExecProcess(ProcessStartInfo startInfo, String resolvedFilename, String[] argv, String[] envp, String cwd, Boolean setCredentials, UInt32 userId, UInt32 groupId, UInt32[] groups, Int32& stdinFd, Int32& stdoutFd, Int32& stderrFd, Boolean usesTerminal, Boolean throwOnNoExec)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.Execute()
**Visual Studio Output Logs**
Génération de la solution OpenKernel (Debug)
La génération a démarré 02/01/2023 13:21:30.
__________________________________________________
Projet "/Users/raphaelm/Projects/OpenKernel/OpenKernel/OpenKernel.csproj" (Build cibles) :
GenerateTargetFrameworkMonikerAttribute cible :
La cible est ignorée "GenerateTargetFrameworkMonikerAttribute", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
CoreGenerateAssemblyInfo cible :
La cible est ignorée "CoreGenerateAssemblyInfo", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
CoreCompile cible :
La cible est ignorée "CoreCompile", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
GenerateBuildDependencyFile cible :
La cible est ignorée "GenerateBuildDependencyFile", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
CopyFilesToOutputDirectory cible :
OpenKernel -> /Users/raphaelm/Projects/OpenKernel/OpenKernel/bin/Debug/net6.0/OpenKernel.dll
IL2CPU cible :
/opt/cosmos/Build/IL2CPU/IL2CPU KernelPkg:
EnableDebug:True
EnableStackCorruptionDetection:True
StackCorruptionDetectionLevel:MethodFooters
DebugMode:Source
TraceAssemblies:
DebugCom:1
TargetAssembly:/Users/raphaelm/Projects/OpenKernel/OpenKernel/bin/Debug/net6.0/OpenKernel.dll
OutputFilename:/Users/raphaelm/Projects/OpenKernel/OpenKernel/bin/Debug/net6.0/OpenKernel.asm
EnableLogging:True
EmitDebugSymbols:True
IgnoreDebugStubAttribute:False
CompileVBEMultiboot:False
VBEResolution:800x600x32
RemoveBootDebugOutput:False
AllowComments:False
References:/Users/raphaelm/Projects/OpenKernel/OpenKernel/obj/Debug/net6.0/OpenKernel.dll
References:/Users/raphaelm/.nuget/packages/cosmos.common/0.1.0-localbuild/lib/net6.0/Cosmos.Common.dll
References:/Users/raphaelm/.nuget/packages/cosmos.core/0.1.0-localbuild/lib/net6.0/Cosmos.Core.dll
References:/Users/raphaelm/.nuget/packages/cosmos.debug.kernel/0.1.0-localbuild/lib/netstandard2.0/Cosmos.Debug.Kernel.dll
References:/Users/raphaelm/.nuget/packages/cosmos.hal2/0.1.0-localbuild/lib/net6.0/Cosmos.HAL2.dll
References:/Users/raphaelm/.nuget/packages/cosmos.system2/0.1.0-localbuild/lib/net6.0/Cosmos.System2.dll
References:/Users/raphaelm/.nuget/packages/il2cpu.api/0.1.0-localbuild/lib/netstandard2.0/IL2CPU.API.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.Core_Asm.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.Core_Plugs.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.Debug.Kernel.Plugs.Asm.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.System2_Plugs.dll
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: Impossible d'exécuter la tâche exécutable spécifiée "IL2CPU". System.ComponentModel.Win32Exception (8): An error occurred trying to start process '/opt/cosmos/Build/IL2CPU/IL2CPU' with working directory '/Users/raphaelm/Projects/OpenKernel/OpenKernel'. Exec format error
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.ForkAndExecProcess(ProcessStartInfo startInfo, String resolvedFilename, String[] argv, String[] envp, String cwd, Boolean setCredentials, UInt32 userId, UInt32 groupId, UInt32[] groups, Int32& stdinFd, Int32& stdoutFd, Int32& stderrFd, Boolean usesTerminal, Boolean throwOnNoExec)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.Execute()
IL2CPU task took 00:00:00.2939811
Génération de la cible "IL2CPU" terminée dans le projet "OpenKernel.csproj" -- ÉCHEC.
Génération du projet "OpenKernel.csproj" terminée -- ÉCHEC.
ÉCHEC de la build.
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: Impossible d'exécuter la tâche exécutable spécifiée "IL2CPU". System.ComponentModel.Win32Exception (8): An error occurred trying to start process '/opt/cosmos/Build/IL2CPU/IL2CPU' with working directory '/Users/raphaelm/Projects/OpenKernel/OpenKernel'. Exec format error
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.ForkAndExecProcess(ProcessStartInfo startInfo, String resolvedFilename, String[] argv, String[] envp, String cwd, Boolean setCredentials, UInt32 userId, UInt32 groupId, UInt32[] groups, Int32& stdinFd, Int32& stdoutFd, Int32& stderrFd, Boolean usesTerminal, Boolean throwOnNoExec)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.Execute()
0 Avertissement(s)
1 Erreur(s)
Temps écoulé 00:00:00.60
========== Build : 0 réussite(s), 1 échec(s), 0 à jour, 0 ignorée(s) ==========
Version : 1 erreurs, 0 avertissements
**How To Reproduce**
Install Cosmos on a macOS host/guest with these commands:
make
make install
make nuget-install
And make a new library project and add Cosmos NuGet packages.
Make a kernel class and build.
**Screenshots**

**Context**
Before posting please confirm that the following are in order
[ ] Both Cosmos VS Extensions are installed
[*] In the NuGet Package Manager "Include prerelease" is selected
[*] The Cosmos NuGet package store is selected (NOT nuget.org) in 'Manage NuGet Packages'
[*] The Cosmos NuGet packages are installed
**macOS Version**
macOS 11 (Big Sur)
Darwin Kernel 20.6.0
| 1.0 | Cosmos is broken on macOS - **Have you checked Github Issues for similar errors?**
Yes. Didn't find anything
**Exception**
error MSB6003: Impossible d'exécuter la tâche exécutable spécifiée "IL2CPU". System.ComponentModel.Win32Exception (8): An error occurred trying to start process '/opt/cosmos/Build/IL2CPU/IL2CPU' with working directory '/Users/raphaelm/Projects/OpenKernel/OpenKernel'. Exec format error
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.ForkAndExecProcess(ProcessStartInfo startInfo, String resolvedFilename, String[] argv, String[] envp, String cwd, Boolean setCredentials, UInt32 userId, UInt32 groupId, UInt32[] groups, Int32& stdinFd, Int32& stdoutFd, Int32& stderrFd, Boolean usesTerminal, Boolean throwOnNoExec)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.Execute()
**Visual Studio Output Logs**
Génération de la solution OpenKernel (Debug)
La génération a démarré 02/01/2023 13:21:30.
__________________________________________________
Projet "/Users/raphaelm/Projects/OpenKernel/OpenKernel/OpenKernel.csproj" (Build cibles) :
GenerateTargetFrameworkMonikerAttribute cible :
La cible est ignorée "GenerateTargetFrameworkMonikerAttribute", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
CoreGenerateAssemblyInfo cible :
La cible est ignorée "CoreGenerateAssemblyInfo", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
CoreCompile cible :
La cible est ignorée "CoreCompile", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
GenerateBuildDependencyFile cible :
La cible est ignorée "GenerateBuildDependencyFile", car tous les fichiers de sortie sont à jour par rapport aux fichiers d'entrée.
CopyFilesToOutputDirectory cible :
OpenKernel -> /Users/raphaelm/Projects/OpenKernel/OpenKernel/bin/Debug/net6.0/OpenKernel.dll
IL2CPU cible :
/opt/cosmos/Build/IL2CPU/IL2CPU KernelPkg:
EnableDebug:True
EnableStackCorruptionDetection:True
StackCorruptionDetectionLevel:MethodFooters
DebugMode:Source
TraceAssemblies:
DebugCom:1
TargetAssembly:/Users/raphaelm/Projects/OpenKernel/OpenKernel/bin/Debug/net6.0/OpenKernel.dll
OutputFilename:/Users/raphaelm/Projects/OpenKernel/OpenKernel/bin/Debug/net6.0/OpenKernel.asm
EnableLogging:True
EmitDebugSymbols:True
IgnoreDebugStubAttribute:False
CompileVBEMultiboot:False
VBEResolution:800x600x32
RemoveBootDebugOutput:False
AllowComments:False
References:/Users/raphaelm/Projects/OpenKernel/OpenKernel/obj/Debug/net6.0/OpenKernel.dll
References:/Users/raphaelm/.nuget/packages/cosmos.common/0.1.0-localbuild/lib/net6.0/Cosmos.Common.dll
References:/Users/raphaelm/.nuget/packages/cosmos.core/0.1.0-localbuild/lib/net6.0/Cosmos.Core.dll
References:/Users/raphaelm/.nuget/packages/cosmos.debug.kernel/0.1.0-localbuild/lib/netstandard2.0/Cosmos.Debug.Kernel.dll
References:/Users/raphaelm/.nuget/packages/cosmos.hal2/0.1.0-localbuild/lib/net6.0/Cosmos.HAL2.dll
References:/Users/raphaelm/.nuget/packages/cosmos.system2/0.1.0-localbuild/lib/net6.0/Cosmos.System2.dll
References:/Users/raphaelm/.nuget/packages/il2cpu.api/0.1.0-localbuild/lib/netstandard2.0/IL2CPU.API.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.Core_Asm.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.Core_Plugs.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.Debug.Kernel.Plugs.Asm.dll
PlugsReferences:/opt/cosmos/Kernel/Cosmos.System2_Plugs.dll
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: Impossible d'exécuter la tâche exécutable spécifiée "IL2CPU". System.ComponentModel.Win32Exception (8): An error occurred trying to start process '/opt/cosmos/Build/IL2CPU/IL2CPU' with working directory '/Users/raphaelm/Projects/OpenKernel/OpenKernel'. Exec format error
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.ForkAndExecProcess(ProcessStartInfo startInfo, String resolvedFilename, String[] argv, String[] envp, String cwd, Boolean setCredentials, UInt32 userId, UInt32 groupId, UInt32[] groups, Int32& stdinFd, Int32& stdoutFd, Int32& stderrFd, Boolean usesTerminal, Boolean throwOnNoExec)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.Execute()
IL2CPU task took 00:00:00.2939811
Génération de la cible "IL2CPU" terminée dans le projet "OpenKernel.csproj" -- ÉCHEC.
Génération du projet "OpenKernel.csproj" terminée -- ÉCHEC.
ÉCHEC de la build.
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: Impossible d'exécuter la tâche exécutable spécifiée "IL2CPU". System.ComponentModel.Win32Exception (8): An error occurred trying to start process '/opt/cosmos/Build/IL2CPU/IL2CPU' with working directory '/Users/raphaelm/Projects/OpenKernel/OpenKernel'. Exec format error
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.ForkAndExecProcess(ProcessStartInfo startInfo, String resolvedFilename, String[] argv, String[] envp, String cwd, Boolean setCredentials, UInt32 userId, UInt32 groupId, UInt32[] groups, Int32& stdinFd, Int32& stdoutFd, Int32& stderrFd, Boolean usesTerminal, Boolean throwOnNoExec)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands)
/Users/raphaelm/.nuget/packages/cosmos.build/0.1.0-localbuild/build/Cosmos.Build.targets(191,9): error MSB6003: at Microsoft.Build.Utilities.ToolTask.Execute()
0 Avertissement(s)
1 Erreur(s)
Temps écoulé 00:00:00.60
========== Build : 0 réussite(s), 1 échec(s), 0 à jour, 0 ignorée(s) ==========
Version : 1 erreurs, 0 avertissements
**How To Reproduce**
Install Cosmos on a macOS host/guest with these commands:
make
make install
make nuget-install
And make a new library project and add Cosmos NuGet packages.
Make a kernel class and build.
**Screenshots**

**Context**
Before posting please confirm that the following are in order
[ ] Both Cosmos VS Extensions are installed
[*] In the NuGet Package Manager "Include prerelease" is selected
[*] The Cosmos NuGet package store is selected (NOT nuget.org) in 'Manage NuGet Packages'
[*] The Cosmos NuGet packages are installed
**macOS Version**
macOS 11 (Big Sur)
Darwin Kernel 20.6.0
| build | cosmos is broken on macos have you checked github issues for similar errors yes didn t found anything exception error impossible d exécuter la tâche exécutable spécifiée system componentmodel an error occurred trying to start process opt cosmos build with working directory users raphaelm projects openkernel openkernel exec format error users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at system diagnostics process forkandexecprocess processstartinfo startinfo string resolvedfilename string argv string envp string cwd boolean setcredentials userid groupid groups stdinfd stdoutfd stderrfd boolean usesterminal boolean throwonnoexec users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at system diagnostics process startcore processstartinfo startinfo users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at microsoft build utilities tooltask executetool string pathtotool string responsefilecommands string commandlinecommands users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at microsoft build utilities tooltask execute visual studio output logs génération de la solution openkernel debug la génération a démarré projet users raphaelm projects openkernel openkernel openkernel csproj build cibles generatetargetframeworkmonikerattribute cible la cible est ignorée generatetargetframeworkmonikerattribute car tous les fichiers de sortie sont à jour par rapport aux fichiers d entrée coregenerateassemblyinfo cible la cible est ignorée coregenerateassemblyinfo car tous les fichiers de sortie sont à jour par rapport aux fichiers d entrée corecompile cible la cible est ignorée corecompile car tous les fichiers de sortie sont à jour par rapport aux fichiers d entrée generatebuilddependencyfile cible la cible est ignorée generatebuilddependencyfile car tous les fichiers de sortie sont à jour par rapport aux fichiers d entrée 
copyfilestooutputdirectory cible openkernel users raphaelm projects openkernel openkernel bin debug openkernel dll cible opt cosmos build kernelpkg enabledebug true enablestackcorruptiondetection true stackcorruptiondetectionlevel methodfooters debugmode source traceassemblies debugcom targetassembly users raphaelm projects openkernel openkernel bin debug openkernel dll outputfilename users raphaelm projects openkernel openkernel bin debug openkernel asm enablelogging true emitdebugsymbols true ignoredebugstubattribute false compilevbemultiboot false vberesolution removebootdebugoutput false allowcomments false references users raphaelm projects openkernel openkernel obj debug openkernel dll references users raphaelm nuget packages cosmos common localbuild lib cosmos common dll references users raphaelm nuget packages cosmos core localbuild lib cosmos core dll references users raphaelm nuget packages cosmos debug kernel localbuild lib cosmos debug kernel dll references users raphaelm nuget packages cosmos localbuild lib cosmos dll references users raphaelm nuget packages cosmos localbuild lib cosmos dll references users raphaelm nuget packages api localbuild lib api dll plugsreferences opt cosmos kernel cosmos core asm dll plugsreferences opt cosmos kernel cosmos core plugs dll plugsreferences opt cosmos kernel cosmos debug kernel plugs asm dll plugsreferences opt cosmos kernel cosmos plugs dll users raphaelm nuget packages cosmos build localbuild build cosmos build targets error impossible d exécuter la tâche exécutable spécifiée system componentmodel an error occurred trying to start process opt cosmos build with working directory users raphaelm projects openkernel openkernel exec format error users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at system diagnostics process forkandexecprocess processstartinfo startinfo string resolvedfilename string argv string envp string cwd boolean setcredentials userid groupid groups stdinfd 
stdoutfd stderrfd boolean usesterminal boolean throwonnoexec users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at system diagnostics process startcore processstartinfo startinfo users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at microsoft build utilities tooltask executetool string pathtotool string responsefilecommands string commandlinecommands users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at microsoft build utilities tooltask execute task took génération de la cible terminée dans le projet openkernel csproj échec génération du projet openkernel csproj terminée échec échec de la build users raphaelm nuget packages cosmos build localbuild build cosmos build targets error impossible d exécuter la tâche exécutable spécifiée system componentmodel an error occurred trying to start process opt cosmos build with working directory users raphaelm projects openkernel openkernel exec format error users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at system diagnostics process forkandexecprocess processstartinfo startinfo string resolvedfilename string argv string envp string cwd boolean setcredentials userid groupid groups stdinfd stdoutfd stderrfd boolean usesterminal boolean throwonnoexec users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at system diagnostics process startcore processstartinfo startinfo users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at microsoft build utilities tooltask executetool string pathtotool string responsefilecommands string commandlinecommands users raphaelm nuget packages cosmos build localbuild build cosmos build targets error at microsoft build utilities tooltask execute avertissement s erreur s temps écoulé build réussite s échec s à jour ignorée s version erreurs avertissements how to reproduce install cosmos on a 
macos host guest with this commands make make install make nuget install and make a new library project and add cosmos nuget packages make a kernel class and build screenshots context before posting please confirm that the following are in order both cosmos vs extensions are installed in the nuget package manager include prerelease is selected the cosmos nuget package store is selected not nuget org in manage nuget packages the cosmos nuget packages are installed macos version macos big sur darwin kernel | 1 |
48,440 | 2,998,169,660 | IssuesEvent | 2015-07-23 12:40:42 | jayway/powermock | https://api.github.com/repos/jayway/powermock | closed | set/getInternalState should be able to set state depending on the field type or annotation | enhancement imported Milestone-Release1.0 Priority-Medium | _From [johan.ha...@gmail.com](https://code.google.com/u/105676376875942041029/) on November 07, 2008 10:58:09_
This would be a nice feature because then you can refactor the field name
without alterting the test. E.g.
setInternalState(instance, Service.class, serviceInstance);
would set the serviceInstance to the first field of type Service.class.
_Original issue: http://code.google.com/p/powermock/issues/detail?id=63_ | 1.0 | set/getInternalState should be able to set state depending on the field type or annotation - _From [johan.ha...@gmail.com](https://code.google.com/u/105676376875942041029/) on November 07, 2008 10:58:09_
This would be a nice feature because then you can refactor the field name
without alterting the test. E.g.
setInternalState(instance, Service.class, serviceInstance);
would set the serviceInstance to the first field of type Service.class.
_Original issue: http://code.google.com/p/powermock/issues/detail?id=63_ | non_build | set getinternalstate should be able to set state depending on the field type or annotation from on november this would be a nice feature because then you can refactor the field name without alterting the test e g setinternalstate instance service class serviceinstance would set the serviceinstance to the first field of type service class original issue | 0 |
57,173 | 14,016,463,822 | IssuesEvent | 2020-10-29 14:33:16 | notepad-plus-plus/notepad-plus-plus | https://api.github.com/repos/notepad-plus-plus/notepad-plus-plus | closed | Update Visual Studio project file, mainly for missing items | accepted build / code | I noticed that the Visual Studio project [FILE](https://github.com/notepad-plus-plus/notepad-plus-plus/blob/0689a9445343a6c99f11959d8b4a91d598bc8495/PowerEditor/visual.net/notepadPlus.vcxproj#L1) is a bit out-of-date; it doesn't include certain files as part of the project, thus potentially causing identifiers to be missed when a "Find in Files" search (a Visual Studio one, NOT a Notepad++ one) is conducted with scope "Entire Solution".
What made me notice this is that I searched on an identifier defined in `FindReplaceDlg_rc.h` and Visual Studio did not tell me where it was defined.
I see that some other `*_rc.h` files are not part of the "project". Is there any reason for this?
Otherwise, I plan to go through the project file and update it for missing items and other problems... | 1.0 | Update Visual Studio project file, mainly for missing items - I noticed that the Visual Studio project [FILE](https://github.com/notepad-plus-plus/notepad-plus-plus/blob/0689a9445343a6c99f11959d8b4a91d598bc8495/PowerEditor/visual.net/notepadPlus.vcxproj#L1) is a bit out-of-date; it doesn't include certain files as part of the project, thus potentially causing identifiers to be missed when a "Find in Files" search (a Visual Studio one, NOT a Notepad++ one) is conducted with scope "Entire Solution".
What made me notice this is that I searched on an identifier defined in `FindReplaceDlg_rc.h` and Visual Studio did not tell me where it was defined.
I see that some other `*_rc.h` files are not part of the "project". Is there any reason for this?
Otherwise, I plan to go through the project file and update it for missing items and other problems... | build | update visual studio project file mainly for missing items i noticed that the visual studio project is a bit out of date it doesn t include certain files as part of the project thus potentially causing identifiers to be missed when a find in files search a visual studio one not a notepad one is conducted with scope entire solution what made me notice this is that i searched on an identifier defined in findreplacedlg rc h and visual studio did not tell me where it was defined i see that some other rc h files are not part of the project is there any reason for this otherwise i plan to go through the project file and update it for missing items and other problems | 1 |
100,878 | 30,801,821,326 | IssuesEvent | 2023-08-01 02:27:52 | osquery/osquery | https://api.github.com/repos/osquery/osquery | closed | how can I compile osquery for Red Hat on ppc64le | question build | <!-- Thank you for contributing to osquery! -->
How can I compile or get rpm for Red Hat ppc64le architecture
<!--
I want to test osquery on RedHat ppc64le arch, is there a build instructions support i can use ?
-->
### What new feature do you want?
<!-- Please describe with as much detail as possible. Include examples. -->
### How is this new feature useful?
Red Hat on ppc64le is becoming popular os due to high memory support, apps such as SAP using it now.
### How can this be implemented?
Provide rpm or build instructions or support to build issues
I can try build it , but facing some issues building osquery toolchain
If you can help debug i can try build
| 1.0 | how can I compile osquery for Red Hat on ppc64le - <!-- Thank you for contributing to osquery! -->
How can I compile or get rpm for Red Hat ppc64le architecture
<!--
I want to test osquery on RedHat ppc64le arch, is there a build instructions support i can use ?
-->
### What new feature do you want?
<!-- Please describe with as much detail as possible. Include examples. -->
### How is this new feature useful?
Red Hat on ppc64le is becoming popular os due to high memory support, apps such as SAP using it now.
### How can this be implemented?
Provide rpm or build instructions or support to build issues
I can try build it , but facing some issues building osquery toolchain
If you can help debug i can try build
| build | how can i compile osquery for red hat on how can i compile or get rpm for red hat architecture i want to test osquery on redhat arch is there a build instructions support i can use what new feature do you want how is this new feature useful red hat on is becoming popular os due to high memory support apps such as sap using it now how can this be implemented provide rpm or build instructions or support to build issues i can try build it but facing some issues building osquery toolchain if you can help debug i can try build | 1 |
87,329 | 25,087,521,829 | IssuesEvent | 2022-11-08 01:47:54 | alandefreitas/matplotplusplus | https://api.github.com/repos/alandefreitas/matplotplusplus | closed | Great library, but too much CMake Garbage in my directory. | enhancement - build system | **Feature category**
- [x] *enhancement - build system*
- [ ] *enhancement - backends*
- [ ] *enhancement - build system*
- [ ] *enhancement - documentation*
- [ ] *enhancement - plot categories*
**The problem**
<!--Please be civil. This is an environment for collaboration.-->
Hello I'm new to c++, the library is amazing, it works great,
but I just want to run "g++ main.cpp -o main -lmatplot" and for it to work without the garbage cmake files that I have no clue what is doing.
**The solution I'd like**
Make it work as a dynamic library, and make it possible to install it with "sudo apt install matplot-dev"
**Alternatives I've considered**
**Additional context**
<!--optional-->
| 1.0 | Great library, but too much CMake Garbage in my directory. - **Feature category**
- [x] *enhancement - build system*
- [ ] *enhancement - backends*
- [ ] *enhancement - build system*
- [ ] *enhancement - documentation*
- [ ] *enhancement - plot categories*
**The problem**
<!--Please be civil. This is an environment for collaboration.-->
Hello I'm new to c++, the library is amazing, it works great,
but I just want to run "g++ main.cpp -o main -lmatplot" and for it to work without the garbage cmake files that I have no clue what is doing.
**The solution I'd like**
Make it work as a dynamic library, and make it possible to install it with "sudo apt install matplot-dev"
**Alternatives I've considered**
**Additional context**
<!--optional-->
| build | great library but too much cmake garbage in my directory feature category enhancement build system enhancement backends enhancement build system enhancement documentation enhancement plot categories the problem hello i m new to c the library is amazing it works great but i just want to run g main cpp o main lmatplot and for it to work without the garbage cmake files that i have no clue what is doing the solution i d like make it work as a dynamic library and make it possible to install it with sudo apt install matplot dev alternatives i ve considered additional context | 1 |
214,762 | 7,276,619,289 | IssuesEvent | 2018-02-21 16:52:24 | phetsims/rosetta | https://api.github.com/repos/phetsims/rosetta | closed | 'Save' translation is not saving strings | chipper:2.0 priority:3-medium | Originally reported by a translator on Jan 8, 2018.
> The first thing that I've figured out that I cannot save the work.- on first image you see that the feedback is showing that the job is saved, but actually, it has been disappeared as soon as I've left the page.
I have been able to reproduce the problem by translating a string, clicking Save, navigating away and back to the saved translation and the translated string is gone.


| 1.0 | 'Save' translation is not saving strings - Originally reported by a translator on Jan 8, 2018.
> The first thing that I've figured out that I cannot save the work.- on first image you see that the feedback is showing that the job is saved, but actually, it has been disappeared as soon as I've left the page.
I have been able to reproduce the problem by translating a string, clicking Save, navigating away and back to the saved translation and the translated string is gone.


| non_build | save translation is not saving strings originally reported by a translator on jan the first thing that i ve figured out that i cannot save the work on first image you see that the feedback is showing that the job is saved but actually it has been disappeared as soon as i ve left the page i have been able to reproduce the problem by translating a string clicking save navigating away and back to the saved translation and the translated string is gone | 0 |
187,574 | 14,428,378,709 | IssuesEvent | 2020-12-06 09:30:40 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | srinandan/edgemicroctl: vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go; 5 LoC | fresh test tiny vendored |
Found a possible issue in [srinandan/edgemicroctl](https://www.github.com/srinandan/edgemicroctl) at [vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go](https://github.com/srinandan/edgemicroctl/blob/b0e68f943122d83e4040f5c37d76a203e47d76ba/vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go#L684-L688)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to a at line 685 may start a goroutine
[Click here to see the code in its original context.](https://github.com/srinandan/edgemicroctl/blob/b0e68f943122d83e4040f5c37d76a203e47d76ba/vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go#L684-L688)
<details>
<summary>Click here to show the 5 line(s) of Go which triggered the analyzer.</summary>
```go
for _, a := range createdList.(*unstructured.UnstructuredList).Items {
if e := instances[a.GetNamespace()]; !reflect.DeepEqual(e, &a) {
t.Errorf("expected %v, got %v", e, a)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: b0e68f943122d83e4040f5c37d76a203e47d76ba
| 1.0 | srinandan/edgemicroctl: vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go; 5 LoC -
Found a possible issue in [srinandan/edgemicroctl](https://www.github.com/srinandan/edgemicroctl) at [vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go](https://github.com/srinandan/edgemicroctl/blob/b0e68f943122d83e4040f5c37d76a203e47d76ba/vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go#L684-L688)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to a at line 685 may start a goroutine
[Click here to see the code in its original context.](https://github.com/srinandan/edgemicroctl/blob/b0e68f943122d83e4040f5c37d76a203e47d76ba/vendor/k8s.io/kubernetes/staging/src/k8s.io/apiextensions-apiserver/test/integration/basic_test.go#L684-L688)
<details>
<summary>Click here to show the 5 line(s) of Go which triggered the analyzer.</summary>
```go
for _, a := range createdList.(*unstructured.UnstructuredList).Items {
if e := instances[a.GetNamespace()]; !reflect.DeepEqual(e, &a) {
t.Errorf("expected %v, got %v", e, a)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: b0e68f943122d83e4040f5c37d76a203e47d76ba
| non_build | srinandan edgemicroctl vendor io kubernetes staging src io apiextensions apiserver test integration basic test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to a at line may start a goroutine click here to show the line s of go which triggered the analyzer go for a range createdlist unstructured unstructuredlist items if e instances reflect deepequal e a t errorf expected v got v e a leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
26,223 | 7,803,449,745 | IssuesEvent | 2018-06-11 00:06:25 | haskell/cabal | https://api.github.com/repos/haskell/cabal | closed | [nix-local-build] Need new-style clean | cabal-install: nix-local-build priority: low type: enhancement | A few bugs:
1. "cabal clean" doesn't clean dist-newtyle. Maybe need a "cabal new-clean"?
2. cabal clean obviously isn't going to respect projects
CC @dcoutts
| 1.0 | [nix-local-build] Need new-style clean - A few bugs:
1. "cabal clean" doesn't clean dist-newtyle. Maybe need a "cabal new-clean"?
2. cabal clean obviously isn't going to respect projects
CC @dcoutts
| build | need new style clean a few bugs cabal clean doesn t clean dist newtyle maybe need a cabal new clean cabal clean obviously isn t going to respect projects cc dcoutts | 1 |
726,728 | 25,009,000,482 | IssuesEvent | 2022-11-03 14:00:43 | Tau-ri-Dev/JSGMod-1.12.2 | https://api.github.com/repos/Tau-ri-Dev/JSGMod-1.12.2 | closed | Bug: FPS drops | Bug/Issue Low priority Known issue Waiting on info Visual bug | **Describe the bug**
FPS issues in latest AUNIS updates.
I can't play the latest versions of Aunis because of this. Before these updates I could play with a modpack containing at least 100 mods with a minimum of 60 fps. Whereas today, even with mods increasing the performance of the game like Optifine or BetterFPS or FoamFix, I cannot play beyond 8fps (with only Anuis and the mods to make it work).
**To Reproduce**
You must already have a rather average config ... (Not horrible, just average)
Then after launching the game on a world (whether flat or normal) open the inventory.
This brings me down from 60 to 1fps in the inventory menu. After taking Aunis items from my hotbar the fps are 8 until I drop those items.
**Expected behavior**
Maybe add a button in the "low config" configurations that would lower the item details? (seems to be related to the items anyway)
Otherwise maybe change something so that the mod is faster or less greedy when you have these items...
**Screenshots**
I don't have a screen to show as the concerns are lack of FPS...
**Mod version**
The latest version of Aunis that I can use is : aunis-1.12.2-4.9.2.1-alpha
Or the one above but not really higher.
| 1.0 | Bug: FPS drops - **Describe the bug**
FPS issues in latest AUNIS updates.
I can't play the latest versions of Aunis because of this. Before these updates I could play with a modpack containing at least 100 mods with a minimum of 60 fps. Whereas today, even with mods increasing the performance of the game like Optifine or BetterFPS or FoamFix, I cannot play beyond 8fps (with only Anuis and the mods to make it work).
**To Reproduce**
You must already have a rather average config ... (Not horrible, just average)
Then after launching the game on a world (whether flat or normal) open the inventory.
This brings me down from 60 to 1fps in the inventory menu. After taking Aunis items from my hotbar the fps are 8 until I drop those items.
**Expected behavior**
Maybe add a button in the "low config" configurations that would lower the item details? (seems to be related to the items anyway)
Otherwise maybe change something so that the mod is faster or less greedy when you have these items...
**Screenshots**
I don't have a screen to show as the concerns are lack of FPS...
**Mod version**
The latest version of Aunis that I can use is : aunis-1.12.2-4.9.2.1-alpha
Or the one above but not really higher.
| non_build | bug fps drops describe the bug fps issues in latest aunis updates i can t play the latest versions of aunis because of this before these updates i could play with a modpack containing at least mods with a minimum of fps whereas today even with mods increasing the performance of the game like optifine or betterfps or foamfix i cannot play beyond with only anuis and the mods to make it work to reproduce you must already have a rather average config not horrible just average then after launching the game on a world whether flat or normal open the inventory this brings me down from to in the inventory menu after taking aunis items from my hotbar the fps are until i drop those items expected behavior maybe add a button in the low config configurations that would lower the item details seems to be related to the items anyway otherwise maybe change something so that the mod is faster or less greedy when you have these items screenshots i don t have a screen to show as the concerns are lack of fps mod version the latest version of aunis that i can use is aunis alpha or the one above but not really higher | 0 |
64,297 | 15,859,447,529 | IssuesEvent | 2021-04-08 08:02:02 | OpenNMT/CTranslate2 | https://api.github.com/repos/OpenNMT/CTranslate2 | closed | Ctranslate2 on NVIDIA Jetson | build gpu | I am trying to get Ctranslate2 on GPU(Nvidia Jetson).
For this I tried to compile
But stuck with this error -
`-- Compiling for multiple CPU ISA and enabling runtime dispatch
-- A library with BLAS API found.
-- Autodetected CUDA architecture(s): 7.2
-- NVCC compilation flags: -std=c++11;-gencode;arch=compute_72,code=sm_72
-- Found CUB include directory: /home/gpu-ctranslate/CTranslate2/third_party/cub
-- Found Thrust include directory: /home/gpu-ctranslate/CTranslate2/third_party/thrust
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
CUDA_cublas_device_LIBRARY (ADVANCED)
linked by target "ctranslate2" in directory /home/gpu-ctranslate/CTranslate2
-- Configuring incomplete, errors occurred!
See also "/home/gpu-ctranslate/CTranslate2/CMakeFiles/CMakeOutput.log".
See also "/home/gpu-ctranslate/CTranslate2/CMakeFiles/CMakeError.log".
what I have to add more.` | 1.0 | Ctranslate2 on NVIDIA Jetson - I am trying to get Ctranslate2 on GPU(Nvidia Jetson).
For this I tried to compile
But stuck with this error -
`-- Compiling for multiple CPU ISA and enabling runtime dispatch
-- A library with BLAS API found.
-- Autodetected CUDA architecture(s): 7.2
-- NVCC compilation flags: -std=c++11;-gencode;arch=compute_72,code=sm_72
-- Found CUB include directory: /home/gpu-ctranslate/CTranslate2/third_party/cub
-- Found Thrust include directory: /home/gpu-ctranslate/CTranslate2/third_party/thrust
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
CUDA_cublas_device_LIBRARY (ADVANCED)
linked by target "ctranslate2" in directory /home/gpu-ctranslate/CTranslate2
-- Configuring incomplete, errors occurred!
See also "/home/gpu-ctranslate/CTranslate2/CMakeFiles/CMakeOutput.log".
See also "/home/gpu-ctranslate/CTranslate2/CMakeFiles/CMakeError.log".
what I have to add more.` | build | on nvidia jetson i am trying to get on gpu nvidia jetson for this i tried to compile but stuck with this error compiling for multiple cpu isa and enabling runtime dispatch a library with blas api found autodetected cuda architecture s nvcc compilation flags std c gencode arch compute code sm found cub include directory home gpu ctranslate third party cub found thrust include directory home gpu ctranslate third party thrust cmake error the following variables are used in this project but they are set to notfound please set them or make sure they are set and tested correctly in the cmake files cuda cublas device library advanced linked by target in directory home gpu ctranslate configuring incomplete errors occurred see also home gpu ctranslate cmakefiles cmakeoutput log see also home gpu ctranslate cmakefiles cmakeerror log what i have to add more | 1 |
66,351 | 16,596,030,158 | IssuesEvent | 2021-06-01 13:36:15 | root-project/root | https://api.github.com/repos/root-project/root | closed | tutorial/math/exampleFunction.py, tutorials/math/multivarGaus.C not disabled without MathMore | bug in:Build System | ### Describe the bug
```
Processing /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/root/tutorials/math/multivarGaus.C...
In file included from input_line_10:1:
/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/root/tutorials/math/multivarGaus.C:12:15: error: no type named 'GSLRandomEngine' in namespace 'ROOT::Math'
ROOT::Math::GSLRandomEngine rnd;
~~~~~~~~~~~~^
CMake Error at /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/RootTestDriver.cmake:237 (message):
error code: 1
```
```
Directory: /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/runtutorials
"tutorial-math-exampleFunction-py" start time: May 11 01:28 CEST
Output:
----------------------------------------------------------
****************************************
Minimizer is Minuit / Migrad
MinFCN = 1.687e-08
NDf = 0
Edm = 3.37793e-08
NCalls = 146
Par_0 = 0.999952 +/- 1.00372
Par_1 = 0.999892 +/- 2.00986
Error in <TUnixSystem::FindDynamicLibrary>: libMathMore[.so | .dll | .dylib | .sl | .dl | .a] does not exist in /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/lib:/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/lib:.:/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/lib:/lib64/tls/haswell/x86_64:/lib64/tls/haswell:/lib64/tls/x86_64:/lib64/tls:/lib64/haswell/x86_64:/lib64/haswell:/lib64/x86_64:/lib64:/usr/lib64/tls/haswell/x86_64:/usr/lib64/tls/haswell:/usr/lib64/tls/x86_64:/usr/lib64/tls:/usr/lib64/haswell/x86_64:/usr/lib64/haswell:/usr/lib64/x86_64:/usr/lib64:/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default
```
### Expected behavior
no libMathMore ==> tutorials disabled for ctest
### To Reproduce
https://lcgapp-services.cern.ch/root-jenkins/job/root-benchmark/2577/BUILDTYPE=Release,COMPILER=default,LABEL=performance-cc8,SPEC=default/parsed_console/
| 1.0 | tutorial/math/exampleFunction.py, tutorials/math/multivarGaus.C not disabled without MathMore - ### Describe the bug
```
Processing /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/root/tutorials/math/multivarGaus.C...
In file included from input_line_10:1:
/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/root/tutorials/math/multivarGaus.C:12:15: error: no type named 'GSLRandomEngine' in namespace 'ROOT::Math'
ROOT::Math::GSLRandomEngine rnd;
~~~~~~~~~~~~^
CMake Error at /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/RootTestDriver.cmake:237 (message):
error code: 1
```
```
Directory: /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/runtutorials
"tutorial-math-exampleFunction-py" start time: May 11 01:28 CEST
Output:
----------------------------------------------------------
****************************************
Minimizer is Minuit / Migrad
MinFCN = 1.687e-08
NDf = 0
Edm = 3.37793e-08
NCalls = 146
Par_0 = 0.999952 +/- 1.00372
Par_1 = 0.999892 +/- 2.00986
Error in <TUnixSystem::FindDynamicLibrary>: libMathMore[.so | .dll | .dylib | .sl | .dl | .a] does not exist in /data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/lib:/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/lib:.:/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default/build/lib:/lib64/tls/haswell/x86_64:/lib64/tls/haswell:/lib64/tls/x86_64:/lib64/tls:/lib64/haswell/x86_64:/lib64/haswell:/lib64/x86_64:/lib64:/usr/lib64/tls/haswell/x86_64:/usr/lib64/tls/haswell:/usr/lib64/tls/x86_64:/usr/lib64/tls:/usr/lib64/haswell/x86_64:/usr/lib64/haswell:/usr/lib64/x86_64:/usr/lib64:/data/sftnight/workspace/root-benchmark/BUILDTYPE/Release/COMPILER/default/LABEL/performance-cc8/SPEC/default
```
### Expected behavior
no libMathMore ==> tutorials disabled for ctest
### To Reproduce
https://lcgapp-services.cern.ch/root-jenkins/job/root-benchmark/2577/BUILDTYPE=Release,COMPILER=default,LABEL=performance-cc8,SPEC=default/parsed_console/
| build | tutorial math examplefunction py tutorials math multivargaus c not disabled without mathmore describe the bug processing data sftnight workspace root benchmark buildtype release compiler default label performance spec default root tutorials math multivargaus c in file included from input line data sftnight workspace root benchmark buildtype release compiler default label performance spec default root tutorials math multivargaus c error no type named gslrandomengine in namespace root math root math gslrandomengine rnd cmake error at data sftnight workspace root benchmark buildtype release compiler default label performance spec default build roottestdriver cmake message error code directory data sftnight workspace root benchmark buildtype release compiler default label performance spec default build runtutorials tutorial math examplefunction py start time may cest output minimizer is minuit migrad minfcn ndf edm ncalls par par error in libmathmore does not exist in data sftnight workspace root benchmark buildtype release compiler default label performance spec default build lib data sftnight workspace root benchmark buildtype release compiler default label performance spec default build lib data sftnight workspace root benchmark buildtype release compiler default label performance spec default build lib tls haswell tls haswell tls tls haswell haswell usr tls haswell usr tls haswell usr tls usr tls usr haswell usr haswell usr usr data sftnight workspace root benchmark buildtype release compiler default label performance spec default expected behavior no libmathmore tutorials disabled for ctest to reproduce | 1 |
48,061 | 12,141,604,373 | IssuesEvent | 2020-04-23 23:00:27 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | Need to update nightly-devel and nightly-devel-gpu docker images | TF 2.2 subtype: ubuntu/linux type:build/install | Currently (as expected):
$ docker run --rm -it tensorflow/tensorflow:nightly bash
\# python -c "import tensorflow as tf; print(tf.__version__)"
2.2.0-dev20200422
However:
$ docker run --rm -it tensorflow/tensorflow:nightly-devel bash
\# python -c "import tensorflow as tf; print(tf.__version__)"
1.12.0-rc0
Same goes for the nightly-devel-gpu docker image. | 1.0 | Need to update nightly-devel and nightly-devel-gpu docker images - Currently (as expected):
$ docker run --rm -it tensorflow/tensorflow:nightly bash
\# python -c "import tensorflow as tf; print(tf.__version__)"
2.2.0-dev20200422
However:
$ docker run --rm -it tensorflow/tensorflow:nightly-devel bash
\# python -c "import tensorflow as tf; print(tf.__version__)"
1.12.0-rc0
Same goes for the nightly-devel-gpu docker image. | build | need to update nightly devel and nightly devel gpu docker images currently as expected docker run rm it tensorflow tensorflow nightly bash python c import tensorflow as tf print tf version however docker run rm it tensorflow tensorflow nightly devel bash python c import tensorflow as tf print tf version same goes for the nightly devel gpu docker image | 1 |
37,401 | 9,998,225,048 | IssuesEvent | 2019-07-12 07:35:45 | ShaikASK/Testing | https://api.github.com/repos/ShaikASK/Testing | closed | Candidate Dashboard : Empty Screen is being displayed upon removing the uploaded attachment | Beta Release #5 Build#1 Candidate Dashboard Candidate Module Defect P1 | Steps To Replicate :
1.Launch the URL
2.Sign in as Candidate
3.Sign the Offer Letter
4.Navigate to “Common Details” screen
5.Fill all the required webforms and click on Next button
6.Navigate to Dashboard screen
7.Click on start button displayed against any Document which has “HR Certification with Upload Documents” settings
8.Navigate to required documents and upload supported document against “Upload Document” section
9.Remove the uploaded document from upload document section
Experienced Behavior : Observed that Empty Screen is being displayed upon removing the uploaded attachment (Refer Screen Shot)
Expected Behavior : Ensure that Empty Screen should not be displayed upon removing the uploaded document

| 1.0 | build | 1 |
617 | 2,594,251,518 | IssuesEvent | 2015-02-20 01:08:13 | BALL-Project/ball | https://api.github.com/repos/BALL-Project/ball | closed | undefined reference when compiled without --enable-gsl | C: Buildsystem P: major R: fixed T: defect | **Reported by mcfrost on 2 Aug 39313996 08:53 UTC**
When ball devel gets configured without --enable-gsl, the library libBALL.so contains an unresolved symbol to 'RMSDMinimizer::computeTransformation' | 1.0 | build | 1 |
186,276 | 14,394,659,934 | IssuesEvent | 2020-12-03 01:49:22 | github-vet/rangeclosure-findings | https://api.github.com/repos/github-vet/rangeclosure-findings | closed | logosapollos/ego: src/pkg/regexp/all_test.go; 3 LoC | fresh test tiny |
Found a possible issue in [logosapollos/ego](https://www.github.com/logosapollos/ego) at [src/pkg/regexp/all_test.go](https://github.com/logosapollos/ego/blob/61fab933d8850b59a60a88ef3456f6a7470e9d02/src/pkg/regexp/all_test.go#L94-L96)
The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements
which capture loop variables.
[Click here to see the code in its original context.](https://github.com/logosapollos/ego/blob/61fab933d8850b59a60a88ef3456f6a7470e9d02/src/pkg/regexp/all_test.go#L94-L96)
<details>
<summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary>
```go
for _, test := range findTests {
matchTest(t, &test)
}
```
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to test at line 95 may start a goroutine
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 61fab933d8850b59a60a88ef3456f6a7470e9d02
| 1.0 | non_build | 0 |
284,804 | 30,913,692,592 | IssuesEvent | 2023-08-05 02:37:58 | Nivaskumark/kernel_v4.19.72_old | https://api.github.com/repos/Nivaskumark/kernel_v4.19.72_old | reopened | WS-2021-0334 (High) detected in linux-yoctov5.4.51 | Mend: dependency security vulnerability | ## WS-2021-0334 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Nivaskumark/kernel_v4.19.72/commit/ce49083a1c14be2d13cb5e878257d293e6c748bc">ce49083a1c14be2d13cb5e878257d293e6c748bc</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/netfilter/nf_synproxy_core.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/netfilter/nf_synproxy_core.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Linux/Kernel in versions v5.13-rc1 to v5.13-rc6 is vulnerable to out of bounds when parsing TCP options
<p>Publish Date: 2021-05-31
<p>URL: <a href=https://github.com/gregkh/linux/commit/6defc77d48eff74075b80ad5925061b2fc010d98>WS-2021-0334</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/UVI-2021-1000919">https://osv.dev/vulnerability/UVI-2021-1000919</a></p>
<p>Release Date: 2021-05-31</p>
<p>Fix Resolution: v5.4.128</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | non_build | 0 |
27,951 | 8,055,637,741 | IssuesEvent | 2018-08-02 09:55:13 | carla-simulator/carla | https://api.github.com/repos/carla-simulator/carla | opened | Building on Windows 0.9.X | build system help wanted in progress | # New and automated instructions
Make sure you have:
```
Unreal Engine 4.19
Visual Studio 2017 (as default version, if not set it)
Windows SDK 10 (should be installed with VS2017)
```
Instructions:
```
1) git clone https://github.com/carla-simulator/carla.git
2) git checkout 0.9-win32
3) Download https://drive.google.com/uc?id=1FtC00CrDb7Kz5StBAwb6vqOGbzZtpROx&export=download
4) Extract in Unreal/CarlaUE4/Content/Carla
5) make launch
```
## Before posting in this issue:
- Search for **similar comments** (just a quick `ctrl+F` search).
- Make sure you use the **correct Unreal Engine version**.
If your questions satisfy the previous points, we will be glad to answer them. :grimacing:
## Discord
Consider joining our Discord channel for further discussion.
We have a Windows specific channel.
[](https://discord.gg/42KJdRj)
- - -
## Manual build
Follow the instructions from 1 to 4; at this point, double-clicking on `Unreal/CarlaUE4/CarlaUE4.uproject` should work. A warning about the Carla plugin being incompatible sometimes appears; it’s an Unreal bug and can be ignored: press `no` to disable the Carla plugin. Then press `yes` when asked to compile the missing modules.
If that doesn’t work, you can open the project with Visual Studio. Right-click `CarlaUE4.uproject` and select `Generate Visual Studio Project Files`. Then open the Visual Studio solution it generates (*.sln). Hopefully this will give some hints on what fails.
- - -
PS: If you are coming from an open issue related to the build, please make reference to it.
| 1.0 | build | 1 |
75,015 | 20,603,291,933 | IssuesEvent | 2022-03-06 15:54:45 | rizinorg/rizin | https://api.github.com/repos/rizinorg/rizin | closed | Show the `mu_*` asserts in the AppVeyor (Windows) logs | good first issue rz-test buildsystem Windows | ### Work environment
| Questions | Answers
|------------------------------------------------------|--------------------
| OS/arch/bits (mandatory) | Windows 64 bits
| File format of the file you reverse (mandatory) | -
| Architecture/bits of the file (mandatory) | -
| `rizin -v` full output, **not truncated** (mandatory) | https://github.com/rizinorg/rizin/commit/a25ef4e9c6d9734679e841d8f4a2a868922761f6
### Expected behavior
The Windows AppVeyor unit test output should print the exact failing assert:
```
Run ninja -C build test
ninja: Entering directory `build'
[0/1] Running all tests.
1/89 addr_interval OK 0.02s
2/89 agraph OK 0.10s
3/89 analysis_cc OK 0.05s
4/89 analysis_class_graph OK 0.10s
5/89 analysis_function OK 0.04s
6/89 analysis_hints OK 0.04s
7/89 analysis_meta OK 0.07s
8/89 analysis_op OK 0.05s
9/89 analysis_var OK 0.04s
10/89 analysis_global_var OK 0.03s
11/89 analysis_xrefs OK 0.03s
12/89 annotated_code OK 0.02s
13/89 analysis_block OK 1.19s
14/89 base64 OK 0.02s
15/89 autocmplt OK 0.96s
16/89 bin OK 0.04s
17/89 bin_lines OK 0.22s
18/89 binheap OK 0.02s
19/89 bitmap OK 0.02s
20/89 buf OK 0.17s
21/89 cmd OK 0.02s
22/89 config OK 0.02s
23/89 cons OK 0.02s
24/89 contrbtree OK 0.02s
25/89 core_bin OK 0.91s
26/89 core_cmd OK 0.31s
27/89 core_seek OK 0.52s
28/89 big OK 2.69s
29/89 debruijn OK 0.02s
30/89 debug OK 0.02s
31/89 debug_session OK 0.02s
32/89 diff OK 0.03s
33/89 core_task OK 0.19s
34/89 dwarf_info OK 0.03s
35/89 dwarf OK 0.05s
36/89 endian OK 0.02s
37/89 event OK 0.02s
38/89 file OK 0.02s
39/89 flags OK 0.02s
40/89 glob OK 0.02s
41/89 graph OK 0.02s
42/89 hash OK 0.02s
43/89 hex OK 0.02s
44/89 id_storage OK 0.02s
45/89 idpool OK 0.02s
46/89 idstorage OK 0.02s
47/89 intervaltree OK 0.30s
48/89 io OK 0.02s
49/89 itv OK 0.02s
50/89 io_ihex OK 0.09s
51/89 json OK 0.02s
52/89 list OK 0.02s
53/89 ovf OK 0.02s
54/89 pdb OK 0.23s
55/89 pj OK 0.02s
56/89 project_migrate OK 0.59s
57/89 queue OK 0.02s
58/89 rbtree OK 0.10s
59/89 reg OK 0.02s
60/89 run OK 0.02s
61/89 rz_test OK 0.02s
62/89 rzpipe OK 0.16s
63/89 serialize_analysis OK 0.16s
64/89 serialize_config OK 0.02s
65/89 serialize_flag OK 0.02s
66/89 serialize_spaces OK 0.02s
67/89 serialize_types OK 0.05s
68/89 dwarf_integration OK 2.23s
69/89 skiplist OK 0.02s
70/89 skyline OK 0.02s
71/89 spaces OK 0.02s
72/89 sparse OK 0.02s
73/89 stack OK 0.02s
74/89 str OK 0.02s
75/89 strbuf OK 0.02s
76/89 subprocess OK 0.04s
77/89 table OK 0.02s
78/89 task OK 0.02s
79/89 tree OK 0.02s
80/89 sign OK 0.38s
81/89 type OK 0.09s
82/89 uleb128 OK 0.02s
83/89 unum OK 0.02s
84/89 util OK 0.02s
85/89 vector OK 0.02s
86/89 inflate_deflate FAIL 0.03s exit status 1
87/89 bin_vfiles OK 0.04s
88/89 cpu_platform_profiles OK 0.70s
89/89 open_analyse_save_load_project OK 2.26s
The output from the failed tests:
86/89 inflate_deflate FAIL 0.03s exit status 1
--- command ---
10:49:49 /home/runner/work/rizin/rizin/build/test/unit/test_inflate_deflate
--- stdout ---
test_rz_inflate ERR
[XX] Fail at line 31: Windows inflate OS bits: expected 11, got 3.
test_rz_deflate OK
test_rz_inflate_buf OK
test_rz_deflate_buf OK
-------
Summary of Failures:
86/89 inflate_deflate FAIL 0.03s exit status 1
Ok: 88
Expected Fail: 0
Fail: 1
Unexpected Pass: 0
Skipped: 0
Timeout: 0
Full log written to /home/runner/work/rizin/rizin/build/meson-logs/testlog.txt
FAILED: meson-test
/usr/local/bin/meson test --no-rebuild --print-errorlogs
ninja: build stopped: subcommand failed.
```
### Actual behavior
```
%PYTHON%\Scripts\meson test -C build -t 10
ninja: Entering directory `C:\projects\rizin\build'
ninja: no work to do.
1/89 addr_interval OK 0.02s
2/89 agraph OK 0.11s
3/89 analysis_cc OK 0.04s
4/89 analysis_class_graph OK 0.11s
5/89 analysis_function OK 0.03s
6/89 analysis_hints OK 0.03s
7/89 analysis_meta OK 0.05s
8/89 analysis_op OK 0.05s
9/89 analysis_var OK 0.03s
10/89 analysis_global_var OK 0.02s
11/89 analysis_xrefs OK 0.02s
12/89 annotated_code OK 0.02s
13/89 analysis_block OK 0.80s
14/89 base64 OK 0.00s
15/89 autocmplt OK 0.93s
16/89 big OK 0.64s
17/89 bin OK 0.06s
18/89 binheap OK 0.02s
19/89 bitmap OK 0.02s
20/89 bin_lines OK 0.20s
21/89 cmd OK 0.02s
22/89 config OK 0.02s
23/89 cons OK 0.02s
24/89 contrbtree OK 0.02s
25/89 buf OK 0.23s
26/89 core_cmd OK 0.34s
27/89 core_seek OK 0.62s
28/89 core_task OK 0.19s
29/89 debruijn OK 0.03s
30/89 debug OK 0.03s
31/89 debug_session OK 0.02s
32/89 diff OK 0.03s
33/89 dwarf OK 0.03s
34/89 dwarf_info OK 0.03s
35/89 dwarf_integration OK 1.98s
36/89 endian OK 0.02s
37/89 event OK 0.02s
38/89 file OK 0.02s
39/89 flags OK 0.02s
40/89 glob OK 0.02s
41/89 graph OK 0.02s
42/89 hash OK 0.02s
43/89 hex OK 0.02s
44/89 id_storage OK 0.02s
45/89 idpool OK 0.02s
46/89 idstorage OK 0.02s
47/89 intervaltree OK 0.19s
48/89 io OK 0.02s
49/89 itv OK 0.02s
50/89 io_ihex OK 0.30s
51/89 json OK 0.02s
52/89 list OK 0.02s
53/89 ovf OK 0.00s
54/89 pdb OK 0.28s
55/89 pj OK 0.02s
56/89 core_bin OK 4.65s
57/89 queue OK 0.02s
58/89 rbtree OK 0.08s
59/89 reg OK 0.02s
60/89 run OK 0.02s
61/89 rz_test OK 0.02s
62/89 rzpipe OK 0.02s
63/89 serialize_analysis OK 0.16s
64/89 serialize_config OK 0.00s
65/89 serialize_flag OK 0.00s
66/89 serialize_spaces OK 0.00s
67/89 serialize_types OK 0.06s
68/89 project_migrate OK 0.73s
69/89 skiplist OK 0.02s
70/89 skyline OK 0.02s
71/89 spaces OK 0.02s
72/89 sparse OK 0.01s
73/89 stack OK 0.00s
74/89 str OK 0.02s
75/89 strbuf OK 0.02s
76/89 subprocess OK 0.12s
77/89 table OK 0.02s
78/89 task OK 0.05s
79/89 tree OK 0.02s
80/89 sign OK 0.41s
81/89 uleb128 OK 0.02s
82/89 unum OK 0.02s
83/89 type OK 0.09s
84/89 util OK 0.02s
85/89 vector OK 0.02s
86/89 inflate_deflate FAIL 0.03s exit status 1
>>> PATH=C:/projects/rizin/build/librz/analysis;C:/projects/rizin/build/librz/asm;C:/projects/rizin/build/librz/bin;C:/projects/rizin/build/librz/bp;C:/projects/rizin/build/librz/config;C:/projects/rizin/build/librz/cons;C:/projects/rizin/build/librz/core;C:/projects/rizin/build/librz/diff;C:/projects/rizin/build/librz/crypto;C:/projects/rizin/build/librz/debug;C:/projects/rizin/build/librz/egg;C:/projects/rizin/build/librz/flag;C:/projects/rizin/build/librz/hash;C:/projects/rizin/build/librz/io;C:/projects/rizin/build/librz/lang;C:/projects/rizin/build/librz/magic;C:/projects/rizin/build/librz/main;C:/projects/rizin/build/librz/parse;C:/projects/rizin/build/librz/reg;C:/projects/rizin/build/librz/search;C:/projects/rizin/build/librz/socket;C:/projects/rizin/build/librz/syscall;C:/projects/rizin/build/librz/type;C:/projects/rizin/build/librz/util;C:\projects\rizin\rizin-vs2019_64-v0.3.0-git\bin;C:\Python38-x64;C:\msys64\mingw64\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\\Extensions\Microsoft\IntelliCode\CLI;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.29.30037\bin\HostX64\x64;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\VC\VCPackages;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\TestWindow;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\bin\Roslyn;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Team Tools\Performance Tools\x64;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Team Tools\Performance Tools;C:\Program Files (x86)\Microsoft Visual Studio\Shared\Common\VSPerfCollectionTools\vs2019\\x64;C:\Program Files (x86)\Microsoft Visual Studio\Shared\Common\VSPerfCollectionTools\vs2019\;C:\Program Files (x86)\Microsoft 
SDKs\Windows\v10.0A\bin\NETFX 4.8 Tools\x64\;C:\Program Files (x86)\HTML Help Workshop;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\FSharp\Tools;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\Tools\devinit;C:\Program Files (x86)\Windows Kits\10\bin\10.0.19041.0\x64;C:\Program Files (x86)\Windows Kits\10\bin\x64;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\\MSBuild\Current\Bin;C:\Windows\Microsoft.NET\Framework64\v4.0.30319;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\Tools\;C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\wbin;C:\Program Files\Git\cmd;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files\PowerShell\7\;C:\Program Files\7-Zip;C:\Program Files\Microsoft\Web Platform Installer\;C:\Tools\NuGet;C:\Tools\PsTools;C:\Program Files\Git\usr\bin;C:\Program Files\Git LFS;C:\Program Files\Mercurial\;C:\Program Files (x86)\Subversion\bin;C:\Program Files\Docker\Docker\resources\bin;C:\ProgramData\DockerDesktop\version-bin;C:\Program Files\dotnet\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn\;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\150;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\Bin;C:\Tools\xUnit;C:\Tools\xUnit20;C:\Tools\NUnit\bin;C:\Tools\NUnit3\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\Extensions\TestPlatform;C:\Ruby193\bin;C:\Tools\WebDriver;C:\Python27;C:\Python27\Scripts;C:\Program Files (x86)\nodejs\;C:\Program Files\nodejs;C:\Program Files (x86)\iojs;C:\Program Files\iojs;C:\Users\appveyor\AppData\Roaming\npm;C:\Program Files (x86)\Yarn\bin\;C:\go\bin;C:\Program 
Files\Java\jdk1.8.0\bin;C:\Program Files\erl10.7\bin;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\150\Tools\Binn\;C:\Program Files\Microsoft SQL Server\150\Tools\Binn\;C:\Program Files\Microsoft SQL Server\150\DTS\Binn\;C:\Program Files\Amazon\AWSCLI\;C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C:\Program Files (x86)\Windows Kits\8.1\Windows Performance Toolkit\;C:\Program Files (x86)\Microsoft DirectX SDK;C:\Program Files\Microsoft Service Fabric\bin\Fabric\Fabric.Code;C:\Program Files\Microsoft SDKs\Service Fabric\Tools\ServiceFabricLocalClusterManager;C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\150\DTS\Binn\;C:\Tools\Doxygen;C:\Tools\Graphviz\bin;C:\Program Files\CMake\bin;C:\ProgramData\chocolatey\bin;C:\Program Files\LLVM\bin;C:\Tools\vcpkg;C:\Tools\Coverity\bin;C:\Program Files (x86)\NSIS;C:\Tools\Octopus;C:\Program Files\Meson\;C:\Program Files (x86)\Apache\Maven\bin;C:\Tools\GitVersion;C:\Users\appveyor\AppData\Local\Microsoft\WindowsApps;C:\Users\appveyor\.dotnet\tools;C:\Users\appveyor\AppData\Roaming\npm;C:\Users\appveyor\AppData\Local\Yarn\bin;C:\Program Files\AppVeyor\BuildAgent\;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\Llvm\x64\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\CMake\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\Ninja;C:\Program Files (x86)\Microsoft Visual 
Studio\2019\Community\Common7\IDE\VC\Linux\bin\ConnectionManagerExe MALLOC_PERTURB_=21 C:\projects\rizin\build\test/unit\test_inflate_deflate.exe
87/89 bin_vfiles OK 0.05s
88/89 cpu_platform_profiles OK 0.86s
89/89 open_analyse_save_load_project OK 1.90s
Ok: 88
Expected Fail: 0
Fail: 1
Unexpected Pass: 0
Skipped: 0
Timeout: 0
```
Note the missing lines:
```
The output from the failed tests:
86/89 inflate_deflate FAIL 0.03s exit status 1
--- command ---
10:49:49 /home/runner/work/rizin/rizin/build/test/unit/test_inflate_deflate
--- stdout ---
test_rz_inflate ERR
[XX] Fail at line 31: Windows inflate OS bits: expected 11, got 3.
test_rz_deflate OK
test_rz_inflate_buf OK
test_rz_deflate_buf OK
-------
```
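For reference, the expected log above was produced by a meson invocation that passes `--print-errorlogs`; a sketch of the corresponding AppVeyor step (the surrounding `appveyor.yml` keys are assumptions, since the real file is not shown in this issue) would add the same flag:

```yaml
# Sketch only: assumes the existing test step shown in "Actual behavior".
# Adding --print-errorlogs makes meson echo the failing mu_* assert output
# into the CI log instead of only the pass/fail summary.
test_script:
  - cmd: '%PYTHON%\Scripts\meson test -C build -t 10 --print-errorlogs'
```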
18/89 binheap OK 0.02s
19/89 bitmap OK 0.02s
20/89 bin_lines OK 0.20s
21/89 cmd OK 0.02s
22/89 config OK 0.02s
23/89 cons OK 0.02s
24/89 contrbtree OK 0.02s
25/89 buf OK 0.23s
26/89 core_cmd OK 0.34s
27/89 core_seek OK 0.62s
28/89 core_task OK 0.19s
29/89 debruijn OK 0.03s
30/89 debug OK 0.03s
31/89 debug_session OK 0.02s
32/89 diff OK 0.03s
33/89 dwarf OK 0.03s
34/89 dwarf_info OK 0.03s
35/89 dwarf_integration OK 1.98s
36/89 endian OK 0.02s
37/89 event OK 0.02s
38/89 file OK 0.02s
39/89 flags OK 0.02s
40/89 glob OK 0.02s
41/89 graph OK 0.02s
42/89 hash OK 0.02s
43/89 hex OK 0.02s
44/89 id_storage OK 0.02s
45/89 idpool OK 0.02s
46/89 idstorage OK 0.02s
47/89 intervaltree OK 0.19s
48/89 io OK 0.02s
49/89 itv OK 0.02s
50/89 io_ihex OK 0.30s
51/89 json OK 0.02s
52/89 list OK 0.02s
53/89 ovf OK 0.00s
54/89 pdb OK 0.28s
55/89 pj OK 0.02s
56/89 core_bin OK 4.65s
57/89 queue OK 0.02s
58/89 rbtree OK 0.08s
59/89 reg OK 0.02s
60/89 run OK 0.02s
61/89 rz_test OK 0.02s
62/89 rzpipe OK 0.02s
63/89 serialize_analysis OK 0.16s
64/89 serialize_config OK 0.00s
65/89 serialize_flag OK 0.00s
66/89 serialize_spaces OK 0.00s
67/89 serialize_types OK 0.06s
68/89 project_migrate OK 0.73s
69/89 skiplist OK 0.02s
70/89 skyline OK 0.02s
71/89 spaces OK 0.02s
72/89 sparse OK 0.01s
73/89 stack OK 0.00s
74/89 str OK 0.02s
75/89 strbuf OK 0.02s
76/89 subprocess OK 0.12s
77/89 table OK 0.02s
78/89 task OK 0.05s
79/89 tree OK 0.02s
80/89 sign OK 0.41s
81/89 uleb128 OK 0.02s
82/89 unum OK 0.02s
83/89 type OK 0.09s
84/89 util OK 0.02s
85/89 vector OK 0.02s
86/89 inflate_deflate FAIL 0.03s exit status 1
>>> PATH=C:/projects/rizin/build/librz/analysis;C:/projects/rizin/build/librz/asm;C:/projects/rizin/build/librz/bin;C:/projects/rizin/build/librz/bp;C:/projects/rizin/build/librz/config;C:/projects/rizin/build/librz/cons;C:/projects/rizin/build/librz/core;C:/projects/rizin/build/librz/diff;C:/projects/rizin/build/librz/crypto;C:/projects/rizin/build/librz/debug;C:/projects/rizin/build/librz/egg;C:/projects/rizin/build/librz/flag;C:/projects/rizin/build/librz/hash;C:/projects/rizin/build/librz/io;C:/projects/rizin/build/librz/lang;C:/projects/rizin/build/librz/magic;C:/projects/rizin/build/librz/main;C:/projects/rizin/build/librz/parse;C:/projects/rizin/build/librz/reg;C:/projects/rizin/build/librz/search;C:/projects/rizin/build/librz/socket;C:/projects/rizin/build/librz/syscall;C:/projects/rizin/build/librz/type;C:/projects/rizin/build/librz/util;C:\projects\rizin\rizin-vs2019_64-v0.3.0-git\bin;C:\Python38-x64;C:\msys64\mingw64\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\\Extensions\Microsoft\IntelliCode\CLI;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.29.30037\bin\HostX64\x64;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\VC\VCPackages;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\TestWindow;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\bin\Roslyn;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Team Tools\Performance Tools\x64;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Team Tools\Performance Tools;C:\Program Files (x86)\Microsoft Visual Studio\Shared\Common\VSPerfCollectionTools\vs2019\\x64;C:\Program Files (x86)\Microsoft Visual Studio\Shared\Common\VSPerfCollectionTools\vs2019\;C:\Program Files (x86)\Microsoft 
SDKs\Windows\v10.0A\bin\NETFX 4.8 Tools\x64\;C:\Program Files (x86)\HTML Help Workshop;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\FSharp\Tools;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\Tools\devinit;C:\Program Files (x86)\Windows Kits\10\bin\10.0.19041.0\x64;C:\Program Files (x86)\Windows Kits\10\bin\x64;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\\MSBuild\Current\Bin;C:\Windows\Microsoft.NET\Framework64\v4.0.30319;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\Tools\;C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\wbin;C:\Program Files\Git\cmd;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files\PowerShell\7\;C:\Program Files\7-Zip;C:\Program Files\Microsoft\Web Platform Installer\;C:\Tools\NuGet;C:\Tools\PsTools;C:\Program Files\Git\usr\bin;C:\Program Files\Git LFS;C:\Program Files\Mercurial\;C:\Program Files (x86)\Subversion\bin;C:\Program Files\Docker\Docker\resources\bin;C:\ProgramData\DockerDesktop\version-bin;C:\Program Files\dotnet\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn\;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\150;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\Bin;C:\Tools\xUnit;C:\Tools\xUnit20;C:\Tools\NUnit\bin;C:\Tools\NUnit3\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\Extensions\TestPlatform;C:\Ruby193\bin;C:\Tools\WebDriver;C:\Python27;C:\Python27\Scripts;C:\Program Files (x86)\nodejs\;C:\Program Files\nodejs;C:\Program Files (x86)\iojs;C:\Program Files\iojs;C:\Users\appveyor\AppData\Roaming\npm;C:\Program Files (x86)\Yarn\bin\;C:\go\bin;C:\Program 
Files\Java\jdk1.8.0\bin;C:\Program Files\erl10.7\bin;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\150\Tools\Binn\;C:\Program Files\Microsoft SQL Server\150\Tools\Binn\;C:\Program Files\Microsoft SQL Server\150\DTS\Binn\;C:\Program Files\Amazon\AWSCLI\;C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C:\Program Files (x86)\Windows Kits\8.1\Windows Performance Toolkit\;C:\Program Files (x86)\Microsoft DirectX SDK;C:\Program Files\Microsoft Service Fabric\bin\Fabric\Fabric.Code;C:\Program Files\Microsoft SDKs\Service Fabric\Tools\ServiceFabricLocalClusterManager;C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\150\DTS\Binn\;C:\Tools\Doxygen;C:\Tools\Graphviz\bin;C:\Program Files\CMake\bin;C:\ProgramData\chocolatey\bin;C:\Program Files\LLVM\bin;C:\Tools\vcpkg;C:\Tools\Coverity\bin;C:\Program Files (x86)\NSIS;C:\Tools\Octopus;C:\Program Files\Meson\;C:\Program Files (x86)\Apache\Maven\bin;C:\Tools\GitVersion;C:\Users\appveyor\AppData\Local\Microsoft\WindowsApps;C:\Users\appveyor\.dotnet\tools;C:\Users\appveyor\AppData\Roaming\npm;C:\Users\appveyor\AppData\Local\Yarn\bin;C:\Program Files\AppVeyor\BuildAgent\;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\Llvm\x64\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\CMake\bin;C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\Ninja;C:\Program Files (x86)\Microsoft Visual 
Studio\2019\Community\Common7\IDE\VC\Linux\bin\ConnectionManagerExe MALLOC_PERTURB_=21 C:\projects\rizin\build\test/unit\test_inflate_deflate.exe
87/89 bin_vfiles OK 0.05s
88/89 cpu_platform_profiles OK 0.86s
89/89 open_analyse_save_load_project OK 1.90s
Ok: 88
Expected Fail: 0
Fail: 1
Unexpected Pass: 0
Skipped: 0
Timeout: 0
```
Note the missing lines:
```
The output from the failed tests:
86/89 inflate_deflate FAIL 0.03s exit status 1
--- command ---
10:49:49 /home/runner/work/rizin/rizin/build/test/unit/test_inflate_deflate
--- stdout ---
test_rz_inflate ERR
[XX] Fail at line 31: Windows inflate OS bits: expected 11, got 3.
test_rz_deflate OK
test_rz_inflate_buf OK
test_rz_deflate_buf OK
-------
``` | build | show the mu asserts in the appveyor windows logs | 1 |
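One difference visible between the two logs above: the run that prints the failure details invokes `meson test --no-rebuild --print-errorlogs`, while the AppVeyor step runs `meson test -C build -t 10` without `--print-errorlogs`. A minimal sketch of the adjusted AppVeyor command — assuming the flag behaves the same on Windows, which is plausible since it is a standard meson option — would be:

```shell
# Candidate AppVeyor test step: --print-errorlogs makes meson echo the
# stdout of failed tests (the "[XX] Fail at line 31" detail) instead
# of only the pass/fail summary line.
CMD="meson test -C build -t 10 --print-errorlogs"
echo "$CMD"
```

That flag is what produces the "The output from the failed tests" section in the first log.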
35,058 | 7,541,689,780 | IssuesEvent | 2018-04-17 10:33:35 | cakephp/cakephp | https://api.github.com/repos/cakephp/cakephp | closed | Redirects broken after 3.6.0 for applications in sub-directories | Defect Need more information deprecations | This is a (multiple allowed):
* [x] bug
* [ ] enhancement
* [ ] feature-discussion (RFC)
* CakePHP Version: 3.6.0
* Platform and Target: Apache 2.4 // PHP 7.2.4
- https://example.com/subfolder
- /var/www/html/subfolder
### What you did
upgraded to 3.6.0
### What happened
Redirects (FriendsOfCake/search [PRG], `loginRedirect`, FormValidation errors) stopped working.
Parts of the URL are missing after dispatching `Controller.beforeRedirect`:

### What you expected to happen
Redirects should work as they did before the upgrade.
| 1.0 | Redirects broken after 3.6.0 for applications in sub-directories | non_build | 0 |
58,975 | 14,517,335,128 | IssuesEvent | 2020-12-13 19:12:33 | X0rg/CPU-X | https://api.github.com/repos/X0rg/CPU-X | closed | Disabling gettext support causes compilation to fail on FreeBSD | bug build | Disabling gettext support (ncurses variant) via ports tree causes the following:
```
===> Performing out-of-source build
/bin/mkdir -p /usr/ports/sysutils/cpu-x/work/.build
-- The C compiler identification is Clang 11.0.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Performing Test HAS_NO_PIE
-- Performing Test HAS_NO_PIE - Success
-- Found PkgConfig: pkgconf (found version "1.7.3")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Found Threads: TRUE
-- Looking for backtrace
-- Looking for backtrace - not found
-- Found Backtrace: /usr/lib/libexecinfo.so
-- Checking for module 'libcpuid>=0.4.0'
-- Found libcpuid, version 0.5.0
-- Looking for msr_serialize_raw_data
-- Looking for msr_serialize_raw_data - not found
-- Performing Test HAVE_CPU_ID_L1I_INFO
-- Performing Test HAVE_CPU_ID_L1I_INFO - Failed
-- Checking for module 'libpci'
-- Found libpci, version 3.7.0
-- Checking for module 'libstatgrab'
-- Found libstatgrab, version 0.92
-- Using built-in dmidecode, version 3.2.20200417
-- The ASM_NASM compiler identification is NASM
-- Found assembler: /usr/local/bin/nasm
-- Using built-in bandwidth, version 1.5.1
** cpu-x 4.0.1 configuration **
** GTK support is disabled (explicitly disabled)
** NCURSES support is enabled
** GETTEXT support is disabled (explicitly disabled)
** LIBCPUID support is enabled
** LIBPCI support is enabled
** LIBSTATGRAB support is enabled
** DMIDECODE support is enabled
** BANDWIDTH support is enabled
-- Configuring done
-- Generating done
.....
[19/20] : && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x-daemon.dir/daemon.c.o -o output/bin/cpu-x-daemon -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -pthread -lcpuid output/lib/libdmidecode.a && :
FAILED: output/bin/cpu-x-daemon
: && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x-daemon.dir/daemon.c.o -o output/bin/cpu-x-daemon -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -pthread -lcpuid output/lib/libdmidecode.a && :
ld: error: undefined symbol: libintl_gettext
>>> referenced by daemon.c
>>> src/CMakeFiles/cpu-x-daemon.dir/daemon.c.o:(request_handler)
cc: error: linker command failed with exit code 1 (use -v to see invocation)
[20/20] : && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x.dir/main.c.o src/CMakeFiles/cpu-x.dir/util.c.o src/CMakeFiles/cpu-x.dir/core.c.o -o output/bin/cpu-x -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -lm -pthread -lexecinfo output/lib/libtui_ncurses.a -l:libncursesw.so -lcpuid -lpci -lstatgrab -ldevstat output/lib/libdmidecode.a output/lib/libbandwidth.a && :
FAILED: output/bin/cpu-x
: && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x.dir/main.c.o src/CMakeFiles/cpu-x.dir/util.c.o src/CMakeFiles/cpu-x.dir/core.c.o -o output/bin/cpu-x -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -lm -pthread -lexecinfo output/lib/libtui_ncurses.a -l:libncursesw.so -lcpuid -lpci -lstatgrab -ldevstat output/lib/libdmidecode.a output/lib/libbandwidth.a && :
ld: error: undefined symbol: libintl_gettext
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(labels_free)
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
>>> referenced 237 more times
ld: error: undefined symbol: libintl_bindtextdomain
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
ld: error: undefined symbol: libintl_bind_textdomain_codeset
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
ld: error: undefined symbol: libintl_textdomain
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
cc: error: linker command failed with exit code 1 (use -v to see invocation)
ninja: build stopped: subcommand failed.
```
OS: FreeBSD 13-CURRENT r368276 (amd64)
Software: CPU-X 4.0.1, CMake 3.18.5, ninja 1.10.1
If there's any other information needed just ask | 1.0 | Disabling gettext support causes compilation to fail on FreeBSD - Disabling gettext support (ncurses variant) via ports tree causes the following:
```
===> Performing out-of-source build
/bin/mkdir -p /usr/ports/sysutils/cpu-x/work/.build
-- The C compiler identification is Clang 11.0.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Performing Test HAS_NO_PIE
-- Performing Test HAS_NO_PIE - Success
-- Found PkgConfig: pkgconf (found version "1.7.3")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Found Threads: TRUE
-- Looking for backtrace
-- Looking for backtrace - not found
-- Found Backtrace: /usr/lib/libexecinfo.so
-- Checking for module 'libcpuid>=0.4.0'
-- Found libcpuid, version 0.5.0
-- Looking for msr_serialize_raw_data
-- Looking for msr_serialize_raw_data - not found
-- Performing Test HAVE_CPU_ID_L1I_INFO
-- Performing Test HAVE_CPU_ID_L1I_INFO - Failed
-- Checking for module 'libpci'
-- Found libpci, version 3.7.0
-- Checking for module 'libstatgrab'
-- Found libstatgrab, version 0.92
-- Using built-in dmidecode, version 3.2.20200417
-- The ASM_NASM compiler identification is NASM
-- Found assembler: /usr/local/bin/nasm
-- Using built-in bandwidth, version 1.5.1
** cpu-x 4.0.1 configuration **
** GTK support is disabled (explicitly disabled)
** NCURSES support is enabled
** GETTEXT support is disabled (explicitly disabled)
** LIBCPUID support is enabled
** LIBPCI support is enabled
** LIBSTATGRAB support is enabled
** DMIDECODE support is enabled
** BANDWIDTH support is enabled
-- Configuring done
-- Generating done
.....
[19/20] : && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x-daemon.dir/daemon.c.o -o output/bin/cpu-x-daemon -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -pthread -lcpuid output/lib/libdmidecode.a && :
FAILED: output/bin/cpu-x-daemon
: && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x-daemon.dir/daemon.c.o -o output/bin/cpu-x-daemon -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -pthread -lcpuid output/lib/libdmidecode.a && :
ld: error: undefined symbol: libintl_gettext
>>> referenced by daemon.c
>>> src/CMakeFiles/cpu-x-daemon.dir/daemon.c.o:(request_handler)
cc: error: linker command failed with exit code 1 (use -v to see invocation)
[20/20] : && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x.dir/main.c.o src/CMakeFiles/cpu-x.dir/util.c.o src/CMakeFiles/cpu-x.dir/core.c.o -o output/bin/cpu-x -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -lm -pthread -lexecinfo output/lib/libtui_ncurses.a -l:libncursesw.so -lcpuid -lpci -lstatgrab -ldevstat output/lib/libdmidecode.a output/lib/libbandwidth.a && :
FAILED: output/bin/cpu-x
: && /usr/bin/cc -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -Wno-deprecated-declarations -O2 -pipe -march=ivybridge -fstack-protector-strong -fno-strict-aliasing -fstack-protector-strong -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now src/CMakeFiles/cpu-x.dir/main.c.o src/CMakeFiles/cpu-x.dir/util.c.o src/CMakeFiles/cpu-x.dir/core.c.o -o output/bin/cpu-x -L/usr/local/lib -Wl,-rpath,/usr/local/lib: -lm -pthread -lexecinfo output/lib/libtui_ncurses.a -l:libncursesw.so -lcpuid -lpci -lstatgrab -ldevstat output/lib/libdmidecode.a output/lib/libbandwidth.a && :
ld: error: undefined symbol: libintl_gettext
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(labels_free)
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
>>> referenced 237 more times
ld: error: undefined symbol: libintl_bindtextdomain
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
ld: error: undefined symbol: libintl_bind_textdomain_codeset
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
ld: error: undefined symbol: libintl_textdomain
>>> referenced by main.c
>>> src/CMakeFiles/cpu-x.dir/main.c.o:(main)
cc: error: linker command failed with exit code 1 (use -v to see invocation)
ninja: build stopped: subcommand failed.
```
OS: FreeBSD 13-CURRENT r368276 (amd64)
Software: CPU-X 4.0.1, CMake 3.18.5, ninja 1.10.1
(text_combine, tail of a row truncated at the start of this chunk): If there's any other information needed just ask
label: build
text:
  disabling gettext support causes compilation to fail on freebsd
  disabling gettext support ncurses variant via ports tree causes the following
  performing out of source build
  bin mkdir p usr ports sysutils cpu x work build
  the c compiler identification is clang
  detecting c compiler abi info
  detecting c compiler abi info done
  check for working c compiler usr bin cc skipped
  detecting c compile features
  detecting c compile features done
  performing test has no pie
  performing test has no pie success
  found pkgconfig pkgconf found version
  looking for pthread h
  looking for pthread h found
  found threads true
  looking for backtrace
  looking for backtrace not found
  found backtrace usr lib libexecinfo so
  checking for module libcpuid
  found libcpuid version
  looking for msr serialize raw data
  looking for msr serialize raw data not found
  performing test have cpu id info
  performing test have cpu id info failed
  checking for module libpci
  found libpci version
  checking for module libstatgrab
  found libstatgrab version
  using built in dmidecode version
  the asm nasm compiler identification is nasm
  found assembler usr local bin nasm
  using built in bandwidth version
  cpu x configuration
  gtk support is disabled explicitly disabled
  ncurses support is enabled
  gettext support is disabled explicitly disabled
  libcpuid support is enabled
  libpci support is enabled
  libstatgrab support is enabled
  dmidecode support is enabled
  bandwidth support is enabled
  configuring done
  generating done
  usr bin cc pipe march ivybridge fstack protector strong fno strict aliasing wno deprecated declarations pipe march ivybridge fstack protector strong fno strict aliasing fstack protector strong wl z noexecstack wl z relro wl z now src cmakefiles cpu x daemon dir daemon c o o output bin cpu x daemon l usr local lib wl rpath usr local lib pthread lcpuid output lib libdmidecode a
  failed output bin cpu x daemon
  usr bin cc pipe march ivybridge fstack protector strong fno strict aliasing wno deprecated declarations pipe march ivybridge fstack protector strong fno strict aliasing fstack protector strong wl z noexecstack wl z relro wl z now src cmakefiles cpu x daemon dir daemon c o o output bin cpu x daemon l usr local lib wl rpath usr local lib pthread lcpuid output lib libdmidecode a
  ld error undefined symbol libintl gettext
  referenced by daemon c src cmakefiles cpu x daemon dir daemon c o request handler
  cc error linker command failed with exit code use v to see invocation
  usr bin cc pipe march ivybridge fstack protector strong fno strict aliasing wno deprecated declarations pipe march ivybridge fstack protector strong fno strict aliasing fstack protector strong wl z noexecstack wl z relro wl z now src cmakefiles cpu x dir main c o src cmakefiles cpu x dir util c o src cmakefiles cpu x dir core c o o output bin cpu x l usr local lib wl rpath usr local lib lm pthread lexecinfo output lib libtui ncurses a l libncursesw so lcpuid lpci lstatgrab ldevstat output lib libdmidecode a output lib libbandwidth a
  failed output bin cpu x
  usr bin cc pipe march ivybridge fstack protector strong fno strict aliasing wno deprecated declarations pipe march ivybridge fstack protector strong fno strict aliasing fstack protector strong wl z noexecstack wl z relro wl z now src cmakefiles cpu x dir main c o src cmakefiles cpu x dir util c o src cmakefiles cpu x dir core c o o output bin cpu x l usr local lib wl rpath usr local lib lm pthread lexecinfo output lib libtui ncurses a l libncursesw so lcpuid lpci lstatgrab ldevstat output lib libdmidecode a output lib libbandwidth a
  ld error undefined symbol libintl gettext
  referenced by main c src cmakefiles cpu x dir main c o labels free
  referenced by main c src cmakefiles cpu x dir main c o main
  referenced by main c src cmakefiles cpu x dir main c o main
  referenced more times
  ld error undefined symbol libintl bindtextdomain
  referenced by main c src cmakefiles cpu x dir main c o main
  ld error undefined symbol libintl bind textdomain codeset
  referenced by main c src cmakefiles cpu x dir main c o main
  ld error undefined symbol libintl textdomain
  referenced by main c src cmakefiles cpu x dir main c o main
  cc error linker command failed with exit code use v to see invocation
  ninja build stopped subcommand failed
  os freebsd current
  software cpu x cmake ninja
  if there s any other information needed just ask
binary_label: 1
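Each row in this dump is pipe-separated in the column order given in the schema at the top. A minimal sketch of a parser for one single-line row, assuming no field value itself contains the `" | "` separator (the column names are taken from the schema; the function name is hypothetical):

```python
# Column order taken from the schema header of this dump.
COLUMNS = [
    "Unnamed: 0", "id", "type", "created_at", "repo", "repo_url",
    "action", "title", "labels", "body", "index", "text_combine",
    "label", "text", "binary_label",
]

def parse_row(line: str) -> dict:
    """Split one ' | '-delimited row into a dict keyed by column name.

    Assumes the row is on a single line and no field value itself
    contains the ' | ' separator.
    """
    parts = line.split(" | ")
    if len(parts) != len(COLUMNS):
        raise ValueError(f"expected {len(COLUMNS)} fields, got {len(parts)}")
    return dict(zip(COLUMNS, parts))
```

All values come back as strings; converting `id` or `binary_label` to numbers is left to the caller, since the schema shows mixed dtypes.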
Unnamed: 0: 104,078
id: 13,033,409,201
type: IssuesEvent
created_at: 2020-07-28 06:51:39
repo: oSoc20/urban-brussels
repo_url: https://api.github.com/repos/oSoc20/urban-brussels
action: closed
title: Designing the container for "fun facts"
labels: design
body:
  - [x] Meet with backend team and define fun facts (define ranking in the backend)
  - [ ] Implement fun facts
index: 1.0
text_combine:
  Designing the container for "fun facts" - - [x] Meet with backend team and define fun facts (define ranking in the backend)
  - [ ] Implement fun facts
label: non_build
text: designing the container for fun facts meet with backend team and define fun facts define ranking in the backend implement fun facts
binary_label: 0
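Comparing the rows, the `text` column appears to be a normalized form of `text_combine`: lowercased, punctuation stripped, markdown task-list markers dropped, and whitespace collapsed. The actual preprocessing pipeline is not documented in this dump; the following is one plausible reconstruction that reproduces the sample rows shown above (the function name is an assumption):

```python
import re

def normalize_issue_text(text_combine: str) -> str:
    """One plausible reconstruction of the `text` column:
    drop markdown checkbox markers, lowercase, keep only
    alphanumeric tokens, and join them with single spaces.
    """
    # Remove task-list markers like "- [x]" / "- [ ]" entirely;
    # the sample rows show no leftover "x" tokens from checkboxes.
    text = re.sub(r"- \[[ xX]\]", " ", text_combine)
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return " ".join(tokens)
```

Note this also explains artifacts in the first row, such as `there s` (the apostrophe in "there's" becomes a token boundary) and stripped version numbers in the build log.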