| Column | Dtype | Stats |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 → 832k |
| id | float64 | 2.49B → 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 → 19 |
| repo | stringlengths | 5 → 112 |
| repo_url | stringlengths | 34 → 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 → 844 |
| labels | stringlengths | 4 → 721 |
| body | stringlengths | 1 → 261k |
| index | stringclasses | 12 values |
| text_combine | stringlengths | 96 → 261k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 → 248k |
| binary_label | int64 | 0 → 1 |
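The per-column summary above (dtype, plus min/max string length or value range) can be recomputed with pandas. A minimal sketch on toy stand-in rows, not the real data; the `summarize` helper is illustrative, not part of the dataset tooling:

```python
import pandas as pd

# Toy stand-in rows mimicking a few of the columns listed above.
df = pd.DataFrame({
    "id": [2.49e9, 3.21e10],
    "type": ["IssuesEvent", "IssuesEvent"],
    "created_at": ["2022-08-02 17:46:51", "2022-08-02 17:46:52"],
    "repo": ["elastic/beats", "o/r"],
})

def summarize(frame):
    """For each column: dtype plus min/max string length (string
    columns) or min/max value (numeric columns)."""
    rows = {}
    for col in frame.columns:
        s = frame[col]
        if s.dtype == object:
            lengths = s.str.len()
            rows[col] = ("string", int(lengths.min()), int(lengths.max()))
        else:
            rows[col] = (str(s.dtype), s.min(), s.max())
    return rows

print(summarize(df))
```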
Example row:

- Unnamed: 0: 80,485
- id: 23,220,559,311
- type: IssuesEvent
- created_at: 2022-08-02 17:46:51
- repo: elastic/beats
- repo_url: https://api.github.com/repos/elastic/beats
- action: closed
- title: Build 2 for 8.4 with status FAILURE
- labels: automation ci-reported Team:Elastic-Agent-Data-Plane build-failures
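The schema also carries derived fields (`text_combine`, `text`) whose construction is not documented in this excerpt. A hedged sketch of one plausible derivation from a row's human-readable fields; the concatenation order and separator are assumptions:

```python
# Illustrative only: one plausible way a combined-text field such as
# `text_combine` could be assembled. The dataset's actual recipe is
# not shown in this excerpt.
row = {
    "title": "Build 2 for 8.4 with status FAILURE",
    "labels": "automation ci-reported Team:Elastic-Agent-Data-Plane build-failures",
    "body": "## :broken_heart: Tests Failed ...",
}

def combine(r):
    # Join title, labels, and body with single spaces.
    return " ".join([r["title"], r["labels"], r["body"]])

print(combine(row)[:40])
```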
## :broken_heart: Tests Failed <!-- BUILD BADGES--> > _the below badges are clickable and redirect to their specific view in the CI or DOCS_ [![Pipeline View](https://img.shields.io/badge/pipeline-pipeline%20-green)](https://beats-ci.elastic.co/blue/organizations/jenkins/Beats%2Fbeats%2F8.4/detail/8.4/2//pipeline) [![Test View](https://img.shields.io/badge/test-test-green)](https://beats-ci.elastic.co/blue/organizations/jenkins/Beats%2Fbeats%2F8.4/detail/8.4/2//tests) [![Changes](https://img.shields.io/badge/changes-changes-green)](https://beats-ci.elastic.co/blue/organizations/jenkins/Beats%2Fbeats%2F8.4/detail/8.4/2//changes) [![Artifacts](https://img.shields.io/badge/artifacts-artifacts-yellow)](https://beats-ci.elastic.co/blue/organizations/jenkins/Beats%2Fbeats%2F8.4/detail/8.4/2//artifacts) [![preview](https://img.shields.io/badge/docs-preview-yellowgreen)](http://beats_null.docs-preview.app.elstc.co/diff) [![preview](https://img.shields.io/badge/elastic-observability-blue)](https://ci-stats.elastic.co/app/apm/services/beats-ci/transactions/view?rangeFrom=2022-08-01T17:07:49.381Z&rangeTo=2022-08-01T17:27:49.381Z&transactionName=Beats/beats/8.4&transactionType=job&latencyAggregationType=avg&traceId=7dcc7b25cefd5d269b2d907551a386f7&transactionId=c0e9f9824b486ded) <!-- BUILD SUMMARY--> <details><summary>Expand to view the summary</summary> <p> #### Build stats * Start Time: 2022-08-01T17:17:49.381+0000 * Duration: 101 min 53 sec #### Test stats :test_tube: | Test | Results | | ------------ | :-----------------------------: | | Failed | 10 | | Passed | 24295 | | Skipped | 2254 | | Total | 26559 | </p> </details> <!-- TEST RESULTS IF ANY--> ### Test errors [![10](https://img.shields.io/badge/10%20-red)](https://beats-ci.elastic.co/blue/organizations/jenkins/Beats%2Fbeats%2F8.4/detail/8.4/2//tests) <details><summary>Expand to view the tests failures</summary><p> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_dashboards – 
x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy 
start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_export_config – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass 
cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / 
x-pack/metricbeat-pythonIntegTest / test_export_ilm_policy – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, 
service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_export_index_pattern – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass 
super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana 
../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_export_index_pattern_migration – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( 
ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_export_template – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: 
Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_index_management – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with 
disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_start_stop – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() 
tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " 
"docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_health – x-pack.metricbeat.module.enterprisesearch.test_enterprisesearch.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "DeprecationWarning: The "warn" method is deprecated, use "warning" instead" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: > response.raise_for_status() ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Response [500]> def raise_for_status(self): """Raises :class:`HTTPError`, if one occurred.""" http_error_msg = "" if isinstance(self.reason, bytes): # We attempt to decode utf-8 first because some servers # choose to localize their reason strings. If the string # isn"t utf-8, we fall back to iso-8859-1 for all other # encodings. 
(See PR #3538) try: reason = self.reason.decode("utf-8") except UnicodeDecodeError: reason = self.reason.decode("iso-8859-1") else: reason = self.reason if 400 <= self.status_code < 500: http_error_msg = u"%s Client Error: %s for url: %s" % (self.status_code, reason, self.url) elif 500 <= self.status_code < 600: http_error_msg = u"%s Server Error: %s for url: %s" % (self.status_code, reason, self.url) if http_error_msg: > raise HTTPError(http_error_msg, response=self) E requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start ../../build/ve/docker/lib/python3.9/site-packages/requests/models.py:943: HTTPError During handling of the above exception, another exception occurred: self = <Service: elasticsearch> container = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> use_network_aliases = True def start_container(self, container, use_network_aliases=True): self.connect_container_to_networks(container, use_network_aliases) try: > container.start() ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:643: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> options = {} def start(self, **options): > return self.client.start(self.id, **options) ../../build/ve/docker/lib/python3.9/site-packages/compose/container.py:228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> resource_id = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} @functools.wraps(f) def wrapped(self, resource_id=None, *args, **kwargs): if resource_id is None and kwargs.get(resource_name): resource_id = kwargs.pop(resource_name) if isinstance(resource_id, dict): resource_id = resource_id.get("Id", 
resource_id.get("ID")) if not resource_id: raise errors.NullResource( "Resource ID was not provided" ) > return f(self, resource_id, *args, **kwargs) ../../build/ve/docker/lib/python3.9/site-packages/docker/utils/decorators.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> container = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} url = "http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start" res = <Response [500]> @utils.check_resource("container") def start(self, container, *args, **kwargs): """ Start a container. Similar to the ``docker start`` command, but doesn"t support attach options. **Deprecation warning:** Passing configuration options in ``start`` is no longer supported. Users are expected to provide host config options in the ``host_config`` parameter of :py:meth:`~ContainerApiMixin.create_container`. Args: container (str): The container to start Raises: :py:class:`docker.errors.APIError` If the server returns an error. :py:class:`docker.errors.DeprecatedMethod` If any argument besides ``container`` are provided. Example: >>> container = client.api.create_container( ... image="busybox:latest", ... command="/bin/sleep 30") >>> client.api.start(container=container.get("Id")) """ if args or kwargs: raise errors.DeprecatedMethod( "Providing configuration in the start() method is no longer " "supported. Use the host_config param in create_container " "instead." 
) url = self._url("/containers/{0}/start", container) res = self._post(url) > self._raise_for_status(res) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/container.py:1109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: response.raise_for_status() except requests.exceptions.HTTPError as e: > raise create_api_error_from_http_exception(e) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:270: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ e = HTTPError("500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start") def create_api_error_from_http_exception(e): """ Create a suitable APIError from requests.exceptions.HTTPError. 
""" response = e.response try: explanation = response.json()["message"] except ValueError: explanation = (response.content or "").strip() cls = APIError if response.status_code == 404: if explanation and ("No such image" in str(explanation) or "not found: does not exist or no pull access" in str(explanation) or "repository does not exist" in str(explanation)): cls = ImageNotFound else: cls = NotFound > raise cls(e, response=response, explanation=explanation) E docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start: Internal Server Error ("driver failed programming external connectivity on endpoint enterprisesearch_7918a246f6f8_elasticsearch_1 (e4a03d626dd87fddbb5b812d3b7af90cf980c924da9bb7db603afb26fd3fb306): Bind for 0.0.0.0:9200 failed: port is already allocated") ../../build/ve/docker/lib/python3.9/site-packages/docker/errors.py:31: APIError During handling of the above exception, another exception occurred: self = <class "test_enterprisesearch.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() ../../libbeat/tests/system/beat/compose.py:66: in compose_up project.up( 
../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:697: in up results, errors = parallel.parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:679: in do return service.execute_convergence_plan( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:559: in execute_convergence_plan return self._execute_convergence_create( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:473: in _execute_convergence_create containers, errors = parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:478: in <lambda> lambda service_name: create_and_start(self, service_name.number), ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:461: in create_and_start self.start_container(container) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:647: in start_container log.warn("Host is already in use by another container") _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Logger compose.service (WARNING)> msg = "Host is already in use by another container", args = (), kwargs = {} def warn(self, msg, *args, **kwargs): > warnings.warn("The 'warn' method is deprecated, " "use 'warning' instead", DeprecationWarning, 2) E DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead /usr/lib/python3.9/logging/__init__.py:1457: DeprecationWarning ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_stats – 
x-pack.metricbeat.module.enterprisesearch.test_enterprisesearch.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: > response.raise_for_status() ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Response [500]> def raise_for_status(self): """Raises :class:`HTTPError`, if one occurred.""" http_error_msg = "" if isinstance(self.reason, bytes): # We attempt to decode utf-8 first because some servers # choose to localize their reason strings. If the string # isn't utf-8, we fall back to iso-8859-1 for all other # encodings. 
(See PR #3538) try: reason = self.reason.decode("utf-8") except UnicodeDecodeError: reason = self.reason.decode("iso-8859-1") else: reason = self.reason if 400 <= self.status_code < 500: http_error_msg = u"%s Client Error: %s for url: %s" % (self.status_code, reason, self.url) elif 500 <= self.status_code < 600: http_error_msg = u"%s Server Error: %s for url: %s" % (self.status_code, reason, self.url) if http_error_msg: > raise HTTPError(http_error_msg, response=self) E requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start ../../build/ve/docker/lib/python3.9/site-packages/requests/models.py:943: HTTPError During handling of the above exception, another exception occurred: self = <Service: elasticsearch> container = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> use_network_aliases = True def start_container(self, container, use_network_aliases=True): self.connect_container_to_networks(container, use_network_aliases) try: > container.start() ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:643: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> options = {} def start(self, **options): > return self.client.start(self.id, **options) ../../build/ve/docker/lib/python3.9/site-packages/compose/container.py:228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> resource_id = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} @functools.wraps(f) def wrapped(self, resource_id=None, *args, **kwargs): if resource_id is None and kwargs.get(resource_name): resource_id = kwargs.pop(resource_name) if isinstance(resource_id, dict): resource_id = resource_id.get("Id", 
resource_id.get("ID")) if not resource_id: raise errors.NullResource( "Resource ID was not provided" ) > return f(self, resource_id, *args, **kwargs) ../../build/ve/docker/lib/python3.9/site-packages/docker/utils/decorators.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> container = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} url = "http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start" res = <Response [500]> @utils.check_resource("container") def start(self, container, *args, **kwargs): """ Start a container. Similar to the ``docker start`` command, but doesn't support attach options. **Deprecation warning:** Passing configuration options in ``start`` is no longer supported. Users are expected to provide host config options in the ``host_config`` parameter of :py:meth:`~ContainerApiMixin.create_container`. Args: container (str): The container to start Raises: :py:class:`docker.errors.APIError` If the server returns an error. :py:class:`docker.errors.DeprecatedMethod` If any argument besides ``container`` are provided. Example: >>> container = client.api.create_container( ... image="busybox:latest", ... command="/bin/sleep 30") >>> client.api.start(container=container.get("Id")) """ if args or kwargs: raise errors.DeprecatedMethod( "Providing configuration in the start() method is no longer " "supported. Use the host_config param in create_container " "instead." 
) url = self._url("/containers/{0}/start", container) res = self._post(url) > self._raise_for_status(res) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/container.py:1109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: response.raise_for_status() except requests.exceptions.HTTPError as e: > raise create_api_error_from_http_exception(e) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:270: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ e = HTTPError("500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start") def create_api_error_from_http_exception(e): """ Create a suitable APIError from requests.exceptions.HTTPError. 
""" response = e.response try: explanation = response.json()["message"] except ValueError: explanation = (response.content or "").strip() cls = APIError if response.status_code == 404: if explanation and ("No such image" in str(explanation) or "not found: does not exist or no pull access" in str(explanation) or "repository does not exist" in str(explanation)): cls = ImageNotFound else: cls = NotFound > raise cls(e, response=response, explanation=explanation) E docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start: Internal Server Error ("driver failed programming external connectivity on endpoint enterprisesearch_7918a246f6f8_elasticsearch_1 (e4a03d626dd87fddbb5b812d3b7af90cf980c924da9bb7db603afb26fd3fb306): Bind for 0.0.0.0:9200 failed: port is already allocated") ../../build/ve/docker/lib/python3.9/site-packages/docker/errors.py:31: APIError During handling of the above exception, another exception occurred: self = <class "test_enterprisesearch.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() ../../libbeat/tests/system/beat/compose.py:66: in compose_up project.up( 
../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:697: in up results, errors = parallel.parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:679: in do return service.execute_convergence_plan( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:559: in execute_convergence_plan return self._execute_convergence_create( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:473: in _execute_convergence_create containers, errors = parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:478: in <lambda> lambda service_name: create_and_start(self, service_name.number), ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:461: in create_and_start self.start_container(container) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:647: in start_container log.warn("Host is already in use by another container") _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Logger compose.service (WARNING)> msg = "Host is already in use by another container", args = (), kwargs = {} def warn(self, msg, *args, **kwargs): > warnings.warn("The 'warn' method is deprecated, " "use 'warning' instead", DeprecationWarning, 2) E DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead /usr/lib/python3.9/logging/__init__.py:1457: DeprecationWarning ``` </p></details> </ul> </p></details> <!-- STEPS ERRORS IF ANY --> ### Steps errors 
[![7](https://img.shields.io/badge/7%20-red)](https://beats-ci.elastic.co/blue/organizations/jenkins/Beats%2Fbeats%2F8.4/detail/8.4/2//pipeline) <details><summary>Expand to view the steps failures</summary> <p> ##### `Docker login` <ul> <li>Took 0 min 6 sec . View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/2929/log/?start=0">here</a></li> <li>Description: <code> set +x if command -v host 2>&1 > /dev/null; then host docker.io 2>&1 > /dev/null fi if command -v dig 2>&1 > /dev/null; then dig docker.io 2>&1 > /dev/null fi docker login -u "${DOCKER_USER}" -p "${DOCKER_PASSWORD}" "docker.io" 2>/dev/null </code></li> </ul> ##### `Docker login` <ul> <li>Took 0 min 6 sec . View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/2922/log/?start=0">here</a></li> <li>Description: <code> set +x if command -v host 2>&1 > /dev/null; then host docker.io 2>&1 > /dev/null fi if command -v dig 2>&1 > /dev/null; then dig docker.io 2>&1 > /dev/null fi docker login -u "${DOCKER_USER}" -p "${DOCKER_PASSWORD}" "docker.io" 2>/dev/null </code></li> </ul> ##### `Docker login` <ul> <li>Took 0 min 16 sec . View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/2596/log/?start=0">here</a></li> <li>Description: <code> set +x if command -v host 2>&1 > /dev/null; then host docker.io 2>&1 > /dev/null fi if command -v dig 2>&1 > /dev/null; then dig docker.io 2>&1 > /dev/null fi docker login -u "${DOCKER_USER}" -p "${DOCKER_PASSWORD}" "docker.io" 2>/dev/null </code></li> </ul> ##### `x-pack/metricbeat-pythonIntegTest - mage pythonIntegTest` <ul> <li>Took 27 min 41 sec . 
View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/16494/log/?start=0">here</a></li> <li>Description: <code>mage pythonIntegTest</code></li> </ul> ##### `x-pack/metricbeat-pythonIntegTest - mage pythonIntegTest` <ul> <li>Took 20 min 45 sec . View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/24000/log/?start=0">here</a></li> <li>Description: <code>mage pythonIntegTest</code></li> </ul> ##### `x-pack/metricbeat-pythonIntegTest - mage pythonIntegTest` <ul> <li>Took 20 min 42 sec . View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/24304/log/?start=0">here</a></li> <li>Description: <code>mage pythonIntegTest</code></li> </ul> ##### `Error signal` <ul> <li>Took 0 min 0 sec . View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/24317/log/?start=0">here</a></li> <li>Description: <code>Error "hudson.AbortException: script returned exit code 1"</code></li> </ul> </p> </details>
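Note on the `DeprecationWarning` failures above: the warning is a secondary symptom, not the root cause. When docker-compose hits the port conflict (`Bind for 0.0.0.0:9200 failed: port is already allocated`), it calls the deprecated `log.warn(...)` alias, and because the test run escalates `DeprecationWarning` to an error, that warning is what the test reports instead of the underlying `APIError`. A minimal, self-contained sketch of the mechanism (using a stand-in for `Logger.warn`, which on Python <= 3.12 emits this exact warning):

```python
import warnings

def legacy_warn(msg):
    # Stand-in for logging.Logger.warn, which emits a DeprecationWarning
    # before delegating to Logger.warning (see logging/__init__.py in the
    # traceback above).
    warnings.warn("The 'warn' method is deprecated, use 'warning' instead",
                  DeprecationWarning, stacklevel=2)

with warnings.catch_warnings():
    # With warnings escalated to errors (as in this test run), the
    # deprecated alias raises instead of logging, masking the real failure.
    warnings.simplefilter("error", DeprecationWarning)
    try:
        legacy_warn("Host is already in use by another container")
        masked = False
    except DeprecationWarning:
        masked = True

print(masked)  # True: the warning, not the port-conflict APIError, surfaces
```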
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: 
Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_index_management – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with 
disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " "docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_start_stop – x-pack.metricbeat.tests.system.test_xpack_base.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <class "test_xpack_base.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() 
tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = <class "test_xpack_base.Test"> @classmethod def compose_up(cls): """ Ensure *only* the services defined under `COMPOSE_SERVICES` are running and healthy """ if not INTEGRATION_TESTS or not cls.COMPOSE_SERVICES: return if os.environ.get("NO_COMPOSE"): return def print_logs(container): print("---- " + container.name_without_project) print(container.logs()) print("----") def is_healthy(container): return container.inspect()["State"]["Health"]["Status"] == "healthy" project = cls.compose_project() with disabled_logger("compose.service"): project.pull( ignore_pull_failures=True, service_names=cls.COMPOSE_SERVICES) project.up( strategy=ConvergenceStrategy.always, service_names=cls.COMPOSE_SERVICES, timeout=30) # Wait for them to be healthy start = time.time() while True: containers = project.containers( service_names=cls.COMPOSE_SERVICES, stopped=True) healthy = True for container in containers: if not container.is_running: print_logs(container) raise Exception( "Container %s unexpectedly finished on startup" % container.name_without_project) if not is_healthy(container): healthy = False break if healthy: break if cls.COMPOSE_ADVERTISED_HOST: for service in cls.COMPOSE_SERVICES: cls._setup_advertised_host(project, service) time.sleep(1) timeout = time.time() - start > cls.COMPOSE_TIMEOUT if timeout: for container in containers: if not is_healthy(container): print_logs(container) > raise Exception( "Timeout while waiting for healthy " 
"docker-compose services: %s" % ",".join(cls.COMPOSE_SERVICES)) E Exception: Timeout while waiting for healthy docker-compose services: elasticsearch,kibana ../../libbeat/tests/system/beat/compose.py:102: Exception ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_health – x-pack.metricbeat.module.enterprisesearch.test_enterprisesearch.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "DeprecationWarning: The "warn" method is deprecated, use "warning" instead" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: > response.raise_for_status() ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Response [500]> def raise_for_status(self): """Raises :class:`HTTPError`, if one occurred.""" http_error_msg = "" if isinstance(self.reason, bytes): # We attempt to decode utf-8 first because some servers # choose to localize their reason strings. If the string # isn"t utf-8, we fall back to iso-8859-1 for all other # encodings. 
(See PR #3538) try: reason = self.reason.decode("utf-8") except UnicodeDecodeError: reason = self.reason.decode("iso-8859-1") else: reason = self.reason if 400 <= self.status_code < 500: http_error_msg = u"%s Client Error: %s for url: %s" % (self.status_code, reason, self.url) elif 500 <= self.status_code < 600: http_error_msg = u"%s Server Error: %s for url: %s" % (self.status_code, reason, self.url) if http_error_msg: > raise HTTPError(http_error_msg, response=self) E requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start ../../build/ve/docker/lib/python3.9/site-packages/requests/models.py:943: HTTPError During handling of the above exception, another exception occurred: self = <Service: elasticsearch> container = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> use_network_aliases = True def start_container(self, container, use_network_aliases=True): self.connect_container_to_networks(container, use_network_aliases) try: > container.start() ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:643: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> options = {} def start(self, **options): > return self.client.start(self.id, **options) ../../build/ve/docker/lib/python3.9/site-packages/compose/container.py:228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> resource_id = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} @functools.wraps(f) def wrapped(self, resource_id=None, *args, **kwargs): if resource_id is None and kwargs.get(resource_name): resource_id = kwargs.pop(resource_name) if isinstance(resource_id, dict): resource_id = resource_id.get("Id", 
resource_id.get("ID")) if not resource_id: raise errors.NullResource( "Resource ID was not provided" ) > return f(self, resource_id, *args, **kwargs) ../../build/ve/docker/lib/python3.9/site-packages/docker/utils/decorators.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> container = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} url = "http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start" res = <Response [500]> @utils.check_resource("container") def start(self, container, *args, **kwargs): """ Start a container. Similar to the ``docker start`` command, but doesn"t support attach options. **Deprecation warning:** Passing configuration options in ``start`` is no longer supported. Users are expected to provide host config options in the ``host_config`` parameter of :py:meth:`~ContainerApiMixin.create_container`. Args: container (str): The container to start Raises: :py:class:`docker.errors.APIError` If the server returns an error. :py:class:`docker.errors.DeprecatedMethod` If any argument besides ``container`` are provided. Example: >>> container = client.api.create_container( ... image="busybox:latest", ... command="/bin/sleep 30") >>> client.api.start(container=container.get("Id")) """ if args or kwargs: raise errors.DeprecatedMethod( "Providing configuration in the start() method is no longer " "supported. Use the host_config param in create_container " "instead." 
) url = self._url("/containers/{0}/start", container) res = self._post(url) > self._raise_for_status(res) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/container.py:1109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: response.raise_for_status() except requests.exceptions.HTTPError as e: > raise create_api_error_from_http_exception(e) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:270: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ e = HTTPError("500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start") def create_api_error_from_http_exception(e): """ Create a suitable APIError from requests.exceptions.HTTPError. 
""" response = e.response try: explanation = response.json()["message"] except ValueError: explanation = (response.content or "").strip() cls = APIError if response.status_code == 404: if explanation and ("No such image" in str(explanation) or "not found: does not exist or no pull access" in str(explanation) or "repository does not exist" in str(explanation)): cls = ImageNotFound else: cls = NotFound > raise cls(e, response=response, explanation=explanation) E docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start: Internal Server Error ("driver failed programming external connectivity on endpoint enterprisesearch_7918a246f6f8_elasticsearch_1 (e4a03d626dd87fddbb5b812d3b7af90cf980c924da9bb7db603afb26fd3fb306): Bind for 0.0.0.0:9200 failed: port is already allocated") ../../build/ve/docker/lib/python3.9/site-packages/docker/errors.py:31: APIError During handling of the above exception, another exception occurred: self = <class "test_enterprisesearch.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() ../../libbeat/tests/system/beat/compose.py:66: in compose_up project.up( 
../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:697: in up results, errors = parallel.parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:679: in do return service.execute_convergence_plan( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:559: in execute_convergence_plan return self._execute_convergence_create( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:473: in _execute_convergence_create containers, errors = parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:478: in <lambda> lambda service_name: create_and_start(self, service_name.number), ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:461: in create_and_start self.start_container(container) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:647: in start_container log.warn("Host is already in use by another container") _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Logger compose.service (WARNING)> msg = "Host is already in use by another container", args = (), kwargs = {} def warn(self, msg, *args, **kwargs): > warnings.warn("The "warn" method is deprecated, " "use "warning" instead", DeprecationWarning, 2) E DeprecationWarning: The "warn" method is deprecated, use "warning" instead /usr/lib/python3.9/logging/__init__.py:1457: DeprecationWarning ``` </p></details> </ul> ##### `Build&Test / x-pack/metricbeat-pythonIntegTest / test_stats – 
x-pack.metricbeat.module.enterprisesearch.test_enterprisesearch.Test` <ul> <details><summary>Expand to view the error details</summary><p> ``` failed on setup with "DeprecationWarning: The "warn" method is deprecated, use "warning" instead" ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: > response.raise_for_status() ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Response [500]> def raise_for_status(self): """Raises :class:`HTTPError`, if one occurred.""" http_error_msg = "" if isinstance(self.reason, bytes): # We attempt to decode utf-8 first because some servers # choose to localize their reason strings. If the string # isn"t utf-8, we fall back to iso-8859-1 for all other # encodings. 
(See PR #3538) try: reason = self.reason.decode("utf-8") except UnicodeDecodeError: reason = self.reason.decode("iso-8859-1") else: reason = self.reason if 400 <= self.status_code < 500: http_error_msg = u"%s Client Error: %s for url: %s" % (self.status_code, reason, self.url) elif 500 <= self.status_code < 600: http_error_msg = u"%s Server Error: %s for url: %s" % (self.status_code, reason, self.url) if http_error_msg: > raise HTTPError(http_error_msg, response=self) E requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start ../../build/ve/docker/lib/python3.9/site-packages/requests/models.py:943: HTTPError During handling of the above exception, another exception occurred: self = <Service: elasticsearch> container = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> use_network_aliases = True def start_container(self, container, use_network_aliases=True): self.connect_container_to_networks(container, use_network_aliases) try: > container.start() ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:643: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Container: enterprisesearch_7918a246f6f8_elasticsearch_1 (f47681)> options = {} def start(self, **options): > return self.client.start(self.id, **options) ../../build/ve/docker/lib/python3.9/site-packages/compose/container.py:228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> resource_id = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} @functools.wraps(f) def wrapped(self, resource_id=None, *args, **kwargs): if resource_id is None and kwargs.get(resource_name): resource_id = kwargs.pop(resource_name) if isinstance(resource_id, dict): resource_id = resource_id.get("Id", 
resource_id.get("ID")) if not resource_id: raise errors.NullResource( "Resource ID was not provided" ) > return f(self, resource_id, *args, **kwargs) ../../build/ve/docker/lib/python3.9/site-packages/docker/utils/decorators.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> container = "f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383" args = (), kwargs = {} url = "http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start" res = <Response [500]> @utils.check_resource("container") def start(self, container, *args, **kwargs): """ Start a container. Similar to the ``docker start`` command, but doesn"t support attach options. **Deprecation warning:** Passing configuration options in ``start`` is no longer supported. Users are expected to provide host config options in the ``host_config`` parameter of :py:meth:`~ContainerApiMixin.create_container`. Args: container (str): The container to start Raises: :py:class:`docker.errors.APIError` If the server returns an error. :py:class:`docker.errors.DeprecatedMethod` If any argument besides ``container`` are provided. Example: >>> container = client.api.create_container( ... image="busybox:latest", ... command="/bin/sleep 30") >>> client.api.start(container=container.get("Id")) """ if args or kwargs: raise errors.DeprecatedMethod( "Providing configuration in the start() method is no longer " "supported. Use the host_config param in create_container " "instead." 
) url = self._url("/containers/{0}/start", container) res = self._post(url) > self._raise_for_status(res) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/container.py:1109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <docker.api.client.APIClient object at 0x7fabc5a156d0> response = <Response [500]> def _raise_for_status(self, response): """Raises stored :class:`APIError`, if one occurred.""" try: response.raise_for_status() except requests.exceptions.HTTPError as e: > raise create_api_error_from_http_exception(e) ../../build/ve/docker/lib/python3.9/site-packages/docker/api/client.py:270: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ e = HTTPError("500 Server Error: Internal Server Error for url: http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start") def create_api_error_from_http_exception(e): """ Create a suitable APIError from requests.exceptions.HTTPError. 
""" response = e.response try: explanation = response.json()["message"] except ValueError: explanation = (response.content or "").strip() cls = APIError if response.status_code == 404: if explanation and ("No such image" in str(explanation) or "not found: does not exist or no pull access" in str(explanation) or "repository does not exist" in str(explanation)): cls = ImageNotFound else: cls = NotFound > raise cls(e, response=response, explanation=explanation) E docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.41/containers/f47681f86fdd7a970ac482b48468772ecfc1221dde9e101a370906644e0ee383/start: Internal Server Error ("driver failed programming external connectivity on endpoint enterprisesearch_7918a246f6f8_elasticsearch_1 (e4a03d626dd87fddbb5b812d3b7af90cf980c924da9bb7db603afb26fd3fb306): Bind for 0.0.0.0:9200 failed: port is already allocated") ../../build/ve/docker/lib/python3.9/site-packages/docker/errors.py:31: APIError During handling of the above exception, another exception occurred: self = <class "test_enterprisesearch.Test"> @classmethod def setUpClass(self): self.beat_name = "metricbeat" self.beat_path = os.path.abspath( os.path.join(os.path.dirname(__file__), "../../")) self.template_paths = [ os.path.abspath(os.path.join(self.beat_path, "../../metricbeat")), os.path.abspath(os.path.join(self.beat_path, "../../libbeat")), ] > super(XPackTest, self).setUpClass() tests/system/xpack_metricbeat.py:19: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../metricbeat/tests/system/metricbeat.py:42: in setUpClass super().setUpClass() ../../libbeat/tests/system/beat/beat.py:204: in setUpClass cls.compose_up_with_retries() ../../libbeat/tests/system/beat/beat.py:222: in compose_up_with_retries raise ex ../../libbeat/tests/system/beat/beat.py:218: in compose_up_with_retries cls.compose_up() ../../libbeat/tests/system/beat/compose.py:66: in compose_up project.up( 
../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:697: in up results, errors = parallel.parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/project.py:679: in do return service.execute_convergence_plan( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:559: in execute_convergence_plan return self._execute_convergence_create( ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:473: in _execute_convergence_create containers, errors = parallel_execute( ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:108: in parallel_execute raise error_to_reraise ../../build/ve/docker/lib/python3.9/site-packages/compose/parallel.py:206: in producer result = func(obj) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:478: in <lambda> lambda service_name: create_and_start(self, service_name.number), ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:461: in create_and_start self.start_container(container) ../../build/ve/docker/lib/python3.9/site-packages/compose/service.py:647: in start_container log.warn("Host is already in use by another container") _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <Logger compose.service (WARNING)> msg = "Host is already in use by another container", args = (), kwargs = {} def warn(self, msg, *args, **kwargs): > warnings.warn("The "warn" method is deprecated, " "use "warning" instead", DeprecationWarning, 2) E DeprecationWarning: The "warn" method is deprecated, use "warning" instead /usr/lib/python3.9/logging/__init__.py:1457: DeprecationWarning ``` </p></details> </ul> </p></details> <!-- STEPS ERRORS IF ANY --> ### Steps errors 
[![7](https://img.shields.io/badge/7%20-red)](https://beats-ci.elastic.co/blue/organizations/jenkins/Beats%2Fbeats%2F8.4/detail/8.4/2//pipeline) <details><summary>Expand to view the steps failures</summary> <p> ##### `Docker login` <ul> <li>Took 0 min 6 sec. View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/2929/log/?start=0">here</a></li> <li>Description: <code> set +x if command -v host 2>&1 > /dev/null; then host docker.io 2>&1 > /dev/null fi if command -v dig 2>&1 > /dev/null; then dig docker.io 2>&1 > /dev/null fi docker login -u "${DOCKER_USER}" -p "${DOCKER_PASSWORD}" "docker.io" 2>/dev/null </code></li> </ul> ##### `Docker login` <ul> <li>Took 0 min 6 sec. View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/2922/log/?start=0">here</a></li> <li>Description: <code> set +x if command -v host 2>&1 > /dev/null; then host docker.io 2>&1 > /dev/null fi if command -v dig 2>&1 > /dev/null; then dig docker.io 2>&1 > /dev/null fi docker login -u "${DOCKER_USER}" -p "${DOCKER_PASSWORD}" "docker.io" 2>/dev/null </code></li> </ul> ##### `Docker login` <ul> <li>Took 0 min 16 sec. View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/2596/log/?start=0">here</a></li> <li>Description: <code> set +x if command -v host 2>&1 > /dev/null; then host docker.io 2>&1 > /dev/null fi if command -v dig 2>&1 > /dev/null; then dig docker.io 2>&1 > /dev/null fi docker login -u "${DOCKER_USER}" -p "${DOCKER_PASSWORD}" "docker.io" 2>/dev/null </code></li> </ul> ##### `x-pack/metricbeat-pythonIntegTest - mage pythonIntegTest` <ul> <li>Took 27 min 41 sec. 
View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/16494/log/?start=0">here</a></li> <li>Description: <code>mage pythonIntegTest</code></li> </ul> ##### `x-pack/metricbeat-pythonIntegTest - mage pythonIntegTest` <ul> <li>Took 20 min 45 sec. View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/24000/log/?start=0">here</a></li> <li>Description: <code>mage pythonIntegTest</code></li> </ul> ##### `x-pack/metricbeat-pythonIntegTest - mage pythonIntegTest` <ul> <li>Took 20 min 42 sec. View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/24304/log/?start=0">here</a></li> <li>Description: <code>mage pythonIntegTest</code></li> </ul> ##### `Error signal` <ul> <li>Took 0 min 0 sec. View more details <a href="https://beats-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/Beats/pipelines/beats/pipelines/8.4/runs/2/steps/24317/log/?start=0">here</a></li> <li>Description: <code>Error "hudson.AbortException: script returned exit code 1"</code></li> </ul> </p> </details>
non_priority
##### `Build&Test / x-pack/metricbeat-pythonIntegTest` – failing tests

* `x-pack/metricbeat tests.system.test_xpack_base.Test` – `test_dashboards`, `test_export_config`, `test_export_ilm_policy`, `test_export_index_pattern`, `test_export_index_pattern_migration`, `test_export_template`, `test_index_management`, `test_start_stop`: each failed on setup with `Exception: Timeout while waiting for healthy docker compose services: elasticsearch,kibana`, raised from `compose_up_with_retries` in `libbeat/tests/system/beat/beat.py` via `libbeat/tests/system/beat/compose.py` while waiting for the compose containers to report healthy.
* `x-pack/metricbeat module.enterprisesearch.test_enterprisesearch.Test` – `test_health`, `test_stats`: each failed on setup with `DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead`, raised while compose logged "Host is already in use by another container". The underlying failure was `docker.errors.APIError: 500 Server Error ... driver failed programming external connectivity on endpoint` for the `enterprisesearch` Elasticsearch container: "Bind for … failed: port is already allocated" when starting the container.
0
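The traceback in the record above boils down to docker-compose failing because the Elasticsearch port was already allocated on the CI host ("bind for failed port is already allocated"). A minimal, hypothetical pre-flight check for that condition can be sketched with plain Python sockets; the helper name and loopback host are assumptions, not part of the actual CI code:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port.

    connect_ex() returns 0 when the TCP connection succeeds, i.e. the
    port is taken by another process (such as a leftover container).
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0
```

Running a check like this before `docker compose up` would surface the port conflict earlier, with a clearer message than the nested compose traceback above.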
6,393
14,498,679,951
IssuesEvent
2020-12-11 15:48:09
ratchetphp/Ratchet
https://api.github.com/repos/ratchetphp/Ratchet
opened
Roadmap
architecture docs enhancement feature
## v0.5 Updates to the next release of Ratchet will be made against the [v0.5 branch](https://github.com/ratchetphp/Ratchet/tree/v0.5). This version will add some functionality, including a transition period, while keeping backwards compatibility. Key features for this version include: - WebSocket deflate support. Off by default. A new optional parameter to be added to `WsServer` to enable compression. - `ConnectionInterface` will implement [PSR-11's `ContainerInterface`](https://www.php-fig.org/psr/psr-11/). Properties from Components will be accessible via `$conn->get('HTTP.request')` as well as the current magic methodical way of `$conn->HTTP->request`. - Update dependencies to work with all React 1.0 libraries. We will support a range of what's supported now (0.x versions) up to 1.0. A couple of their APIs have changed in 1.0 so this may be a BC break for some people if they're also using React in their projects, hence maintaining support for the old version as well - Add TLS support to the App Facade (#848) - Consider adopting [PSR-12](https://www.php-fig.org/psr/psr-12/) in the form of a pre-commit hook or GitHub action to auto-format so the code base is consistent without having to think about it ## v0.6/v1.0 This version will not include any new features but have backwards compatibility breaks from old code. - Remove the magic accessors from `ConnectionInterface`. All properties set by Components are to be accessed via `ContainerInterface` methods. This will be a syntactic BC break but not an architectural one. - New version of PHP requirement (discussions to be had around which version this should be) - Transition return type declarations on all methods from Docblocks to language - Session and WAMP components will be moved to their own repositories - Drop support for pre 1.0 version of React dependencies - Determine optimal target version of Symfony libraries
1.0
Roadmap - ## v0.5 Updates to the next release of Ratchet will be made against the [v0.5 branch](https://github.com/ratchetphp/Ratchet/tree/v0.5). This version will add some functionality, including a transition period, while keeping backwards compatibility. Key features for this version include: - WebSocket deflate support. Off by default. A new optional parameter to be added to `WsServer` to enable compression. - `ConnectionInterface` will implement [PSR-11's `ContainerInterface`](https://www.php-fig.org/psr/psr-11/). Properties from Components will be accessible via `$conn->get('HTTP.request')` as well as the current magic methodical way of `$conn->HTTP->request`. - Update dependencies to work with all React 1.0 libraries. We will support a range of what's supported now (0.x versions) up to 1.0. A couple of their APIs have changed in 1.0 so this may be a BC break for some people if they're also using React in their projects, hence maintaining support for the old version as well - Add TLS support to the App Facade (#848) - Consider adopting [PSR-12](https://www.php-fig.org/psr/psr-12/) in the form of a pre-commit hook or GitHub action to auto-format so the code base is consistent without having to think about it ## v0.6/v1.0 This version will not include any new features but have backwards compatibility breaks from old code. - Remove the magic accessors from `ConnectionInterface`. All properties set by Components are to be access via `ContainerInterface` methods. This will be a syntactic BC break but not an architectural one. - New version of PHP requirement (discussions to be had around which version this should be) - Transition return type declarations on all methods from Docblocks to language - Session and WAMP components will be moved to their own repositories - Drop support for pre 1.0 version of React dependencies - Determine optimal target version of Symfony libraries
non_priority
roadmap updates to the next release of ratchet will be made against the this version will add some functionality including a transition period while keeping backwards compatibility key features for this version include websocket deflate support off by default a new optional parameter to be added to wsserver to enable compression connectioninterface will implement properties from components will be accessible via conn get http request as well as the current magic methodical way of conn http request update dependencies to work with all react libraries we will support a range of what s supported now x versions up to a couple of their apis have changed in so this may be a bc break for some people if they re also using react in their projects hence maintaining support for the old version as well add tls support to the app facade consider adopting in the form of a pre commit hook or github action to auto format so the code base is consistent without having to think about it this version will not include any new features but have backwards compatibility breaks from old code remove the magic accessors from connectioninterface all properties set by components are to be access via containerinterface methods this will be a syntactic bc break but not an architectural one new version of php requirement discussions to be had around which version this should be transition return type declarations on all methods from docblocks to language session and wamp components will be moved to their own repositories drop support for pre version of react dependencies determine optimal target version of symfony libraries
0
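The PSR-11 `ContainerInterface` accessor described in the roadmap record above (`$conn->get('HTTP.request')`) can be made concrete with a short sketch. This is an illustrative Python analogue, not Ratchet's PHP implementation; the class name and entry keys are assumptions:

```python
class Container:
    """Minimal PSR-11-style container: has()/get() over a plain dict."""

    def __init__(self, entries: dict):
        self._entries = dict(entries)

    def has(self, key: str) -> bool:
        return key in self._entries

    def get(self, key: str):
        # PSR-11 requires a NotFoundException for unknown ids;
        # KeyError plays that role in this sketch.
        if key not in self._entries:
            raise KeyError(f"no entry named {key!r}")
        return self._entries[key]
```

The design point in the roadmap is that replacing magic property access with explicit `get()` calls is a syntactic break only: the same values remain reachable, just through a method with defined not-found behavior.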
153,445
19,706,434,918
IssuesEvent
2022-01-12 22:40:28
KaterinaOrg/maven-modular
https://api.github.com/repos/KaterinaOrg/maven-modular
opened
CVE-2020-36187 (High) detected in jackson-databind-2.9.6.jar
security vulnerability
## CVE-2020-36187 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /module2/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - jackson-module-kotlin-2.9.6.jar (Root Library) - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/KaterinaOrg/maven-modular/commit/5316d1e17d60b08f67a1c0f5526eeffbf1f3103a">5316d1e17d60b08f67a1c0f5526eeffbf1f3103a</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp.datasources.SharedPoolDataSource. 
<p>Publish Date: 2021-01-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36187>CVE-2020-36187</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2997">https://github.com/FasterXML/jackson-databind/issues/2997</a></p> <p>Release Date: 2021-01-06</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","packageFilePaths":["/module2/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.fasterxml.jackson.module:jackson-module-kotlin:2.9.6;com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36187","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to 
org.apache.tomcat.dbcp.dbcp.datasources.SharedPoolDataSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36187","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-36187 (High) detected in jackson-databind-2.9.6.jar - ## CVE-2020-36187 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /module2/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - jackson-module-kotlin-2.9.6.jar (Root Library) - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/KaterinaOrg/maven-modular/commit/5316d1e17d60b08f67a1c0f5526eeffbf1f3103a">5316d1e17d60b08f67a1c0f5526eeffbf1f3103a</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp.datasources.SharedPoolDataSource. 
<p>Publish Date: 2021-01-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36187>CVE-2020-36187</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2997">https://github.com/FasterXML/jackson-databind/issues/2997</a></p> <p>Release Date: 2021-01-06</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","packageFilePaths":["/module2/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.fasterxml.jackson.module:jackson-module-kotlin:2.9.6;com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36187","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to 
org.apache.tomcat.dbcp.dbcp.datasources.SharedPoolDataSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36187","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_priority
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy jackson module kotlin jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache tomcat dbcp dbcp datasources sharedpooldatasource publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com fasterxml jackson module jackson module kotlin com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache tomcat dbcp dbcp datasources sharedpooldatasource vulnerabilityurl
0
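The CVE record above reduces to one check: is the pinned jackson-databind version older than the fixed release (2.9.10.8 here)? A hedged sketch of that dot-separated version comparison follows — this is illustrative only, not WhiteSource's actual scanner logic:

```python
def parse_version(version: str) -> tuple:
    """Split a dot-separated version like '2.9.10.8' into an int tuple."""
    return tuple(int(part) for part in version.split("."))

def is_vulnerable(installed: str, fixed: str = "2.9.10.8") -> bool:
    # Tuple comparison handles versions of different lengths:
    # (2, 9, 6) < (2, 9, 10, 8) is True.
    return parse_version(installed) < parse_version(fixed)
```

This naive scheme assumes purely numeric segments; real Maven version ordering (qualifiers like `-SNAPSHOT`) needs more care.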
35,223
7,660,753,335
IssuesEvent
2018-05-11 11:50:35
jnavila/graffle2svg
https://api.github.com/repos/jnavila/graffle2svg
closed
Shapes not managed
defect
Don't know how to display Shape Bezier Don't know how to display Shape FlattenedRectangle Don't know how to display Shape Cloud Don't know how to display Class "PolygonGraphic" Don't know how to display Shape Diamond Don't know how to display Shape Trapazoid Don't know how to display Shape NoteShape
1.0
Shapes not managed - Don't know how to display Shape Bezier Don't know how to display Shape FlattenedRectangle Don't know how to display Shape Cloud Don't know how to display Class "PolygonGraphic" Don't know how to display Shape Diamond Don't know how to display Shape Trapazoid Don't know how to display Shape NoteShape
non_priority
shapes not managed don t know how to display shape bezier don t know how to display shape flattenedrectangle don t know how to display shape cloud don t know how to display class polygongraphic don t know how to display shape diamond don t know how to display shape trapazoid don t know how to display shape noteshape
0
188,965
15,174,163,022
IssuesEvent
2021-02-13 17:15:09
jeffdc/gallformers
https://api.github.com/repos/jeffdc/gallformers
closed
Add Note & Links to Gall-Host Mapping Page That Both Host and Gall must Exist Before Mapping
documentation enhancement good first issue
I think we should add a note on the Add Gallformers page that points users to the Map Galls and Hosts page for this; I just tried to map a new gall to a genus and had to remember that I need to add one host first, then go to the other tool--not something that would be clear to another user. _Originally posted by @Megachile in https://github.com/jeffdc/gallformers/issues/50#issuecomment-775565511_
1.0
Add Note & Links to Gall-Host Mapping Page That Both Host and Gall must Exist Before Mapping - I think we should add a note on the Add Gallformers page that points users to the Map Galls and Hosts page for this; I just tried to map a new gall to a genus and had to remember that I need to add one host first, then go to the other tool--not something that would be clear to another user. _Originally posted by @Megachile in https://github.com/jeffdc/gallformers/issues/50#issuecomment-775565511_
non_priority
add note links to gall host mapping page that both host and gall must exisit before mapping i think we should add a note on the add gallformers page that points users to the map galls and hosts page for this i just tried to map a new gall to a genus and had to remember that i need to add one host first then go to the other tool not something that would be clear to another user originally posted by megachile in
0
175,575
14,532,585,363
IssuesEvent
2020-12-14 22:42:04
geozeke/ubuntu
https://api.github.com/repos/geozeke/ubuntu
closed
Graphics Missing Stroke
documentation
* The graphic in setup guide Part 2; Step 20 is missing a stroke. * The graphic in setup guide Part 3; Step 13 is missing a stroke.
1.0
Graphics Missing Stroke - * The graphic in setup guide Part 2; Step 20 is missing a stroke. * The graphic in setup guide Part 3; Step 13 is missing a stroke.
non_priority
graphics missing stroke the graphic in setup guide part step is missing a stroke the graphic in setup guide part step is missing a stroke
0
45,081
12,535,364,887
IssuesEvent
2020-06-04 21:13:35
SasView/sasview
https://api.github.com/repos/SasView/sasview
closed
5.0.1 constraints between FitPages stop working
defect major
In my local build of ESS_GUI, have been trying my usual constraints between core, shell & drop microemulsion contrast series. M2:radius=M4.radius M3:radius=M4.radius+M4.thickness having added and removed a few constraints, including an M2:sld_solvent=M2.sld_core, these plainly stopped working. Of course I cannot easily reproduce this (but now see #1455 for example which fails), but a project file is attached, renamed from json to txt, with non functioning constraints. [constraints_not_work_M2M3M4 .txt](https://github.com/SasView/sasview/files/4160238/constraints_not_work_M2M3M4.txt) but there is now a problem with the save or load of the project, as there is no FitPage3 on load, and the shell contrast appears twice but the drop contrast not at all, so the M3: constraint in the project file does not appear. See #1455 In the original session the shell contrast data was I think being used on both FitPage1 (with ellipsoid) and FitPage2 (with core_shell_sphere), so that might be part of the story. I will try a simpler example.
1.0
5.0.1 constraints between FitPages stop working - In my local build of ESS_GUI, have been trying my usual constraints between core, shell & drop microemulsion contrast series. M2:radius=M4.radius M3:radius=M4.radius+M4.thickness having added and removed a few constraints, including an M2:sld_solvent=M2.sld_core, these plainly stopped working. Of course I cannot easily reproduce this (but now see #1455 for example which fails), but a project file is attached, renamed from json to txt, with non functioning constraints. [constraints_not_work_M2M3M4 .txt](https://github.com/SasView/sasview/files/4160238/constraints_not_work_M2M3M4.txt) but there is now a problem with the save or load of the project, as there is no FitPage3 on load, and the shell contrast appears twice but the drop contrast not at all, so the M3: constraint in the project file does not appear. See #1455 In the original session the shell contrast data was I think being used on both FitPage1 (with ellipsoid) and FitPage2 (with core_shell_sphere), so that might be part of the story. I will try a simpler example.
non_priority
constraints between fitpages stop working in my local build of ess gui have been trying my usual constraints between core shell drop microemulsion contrast series radius radius radius radius thickness having added and removed a few constraints including an sld solvent sld core these plainly stopped working of course i cannot easily reproduce this but now see for example which fails but a project file is attached renamed from json to txt with non functioning constraints but there is now a problem with the save or load of the project as there is no on load and the shell contrast appears twice but the drop contrast not at all so the constraint in the project file does not appear see in the original session the shell contrast data was i think being used on both with ellipsoid and with core shell sphere so that might be part of the story i will try a simpler example
0
447,541
31,713,867,263
IssuesEvent
2023-09-09 16:08:08
vercel/next.js
https://api.github.com/repos/vercel/next.js
closed
Docs: twitter metadata
template: documentation
### What is the improvement or update you wish to see? doc says below the code. but I think `name property `should be `property property` according to [Twitter Card markup reference](https://developer.twitter.com/en/docs/twitter-for-websites/cards/overview/markup). input ``` export const metadata = { twitter: { card: 'summary_large_image', title: 'Next.js', description: 'The React Framework for the Web', siteId: '1467726470533754880', creator: '@nextjs', creatorId: '1467726470533754880', images: ['https://nextjs.org/og.png'], }, } ``` output ``` <meta name="twitter:card" content="summary_large_image" /> <meta name="twitter:site:id" content="1467726470533754880" /> <meta name="twitter:creator" content="@nextjs" /> <meta name="twitter:creator:id" content="1467726470533754880" /> <meta name="twitter:title" content="Next.js" /> <meta name="twitter:description" content="The React Framework for the Web" /> <meta name="twitter:image" content="https://nextjs.org/og.png" /> ``` ### Is there any context that might help us understand? [Twitter Card markup reference](https://developer.twitter.com/en/docs/twitter-for-websites/cards/overview/markup). ### Does the docs page already exist? Please link to it. https://nextjs.org/docs/app/api-reference/functions/generate-metadata#twitter
1.0
Docs: twitter metadate - ### What is the improvement or update you wish to see? doc says below the code. but I think `name property `should be `property property` according to [Twitter Card markup reference](https://developer.twitter.com/en/docs/twitter-for-websites/cards/overview/markup). input ``` export const metadata = { twitter: { card: 'summary_large_image', title: 'Next.js', description: 'The React Framework for the Web', siteId: '1467726470533754880', creator: '@nextjs', creatorId: '1467726470533754880', images: ['https://nextjs.org/og.png'], }, } ``` output ``` <meta name="twitter:card" content="summary_large_image" /> <meta name="twitter:site:id" content="1467726470533754880" /> <meta name="twitter:creator" content="@nextjs" /> <meta name="twitter:creator:id" content="1467726470533754880" /> <meta name="twitter:title" content="Next.js" /> <meta name="twitter:description" content="The React Framework for the Web" /> <meta name="twitter:image" content="https://nextjs.org/og.png" /> ``` ### Is there any context that might help us understand? [Twitter Card markup reference](https://developer.twitter.com/en/docs/twitter-for-websites/cards/overview/markup). ### Does the docs page already exist? Please link to it. https://nextjs.org/docs/app/api-reference/functions/generate-metadata#twitter
non_priority
docs twitter metadate what is the improvement or update you wish to see doc says below the code but i think name property should be property property according to input export const metadata twitter card summary large image title next js description the react framework for the web siteid creator nextjs creatorid images output is there any context that might help us understand does the docs page already exist please link to it
0
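The Next.js record above hinges on whether the generated `<meta>` tags should use the `name` or the `property` attribute. A small hypothetical renderer makes the difference concrete — this is illustrative only, not Next.js's metadata code:

```python
def render_twitter_meta(meta: dict, attr: str = "name") -> list:
    """Render flat twitter metadata keys as <meta> tag strings.

    attr selects the attribute: 'name' (what Next.js emits) or
    'property' (what the issue suggests).
    """
    return [
        f'<meta {attr}="twitter:{key}" content="{value}" />'
        for key, value in meta.items()
    ]
```

For what it's worth, Twitter's card markup reference documents `name`-based tags, which may be why this issue was closed without a docs change.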
41,514
5,343,198,126
IssuesEvent
2017-02-17 10:35:51
cyclestreets/cyclescape
https://api.github.com/repos/cyclestreets/cyclescape
opened
Sidebar metadata (with unfollow) needs to be persistent
design
Because of the auto-scrolling, the unfollow link often disappears up the screen. This can't be solved until a fuller redesign.
1.0
Sidebar metadata (with unfollow) needs to be persistent - Because of the auto-scrolling, the unfollow link often disappears up the screen. This can't be solved until a fuller redesign.
non_priority
sidebar metadata with unfollow needs to be persistent because of the auto scrolling the unfollow link often disappears up the screen this can t be solved until a fuller redesign
0
15,171
5,073,736,845
IssuesEvent
2016-12-27 10:19:10
drbenvincent/delay-discounting-analysis
https://api.github.com/repos/drbenvincent/delay-discounting-analysis
closed
Decompose the Data class into smaller classes
code clean up tests
At the moment, the Data class is doing too much. - [x] Create a DataFile class, which holds data and plot methods - [x] Get Data class to use (maybe generate) DataFile objects. - [x] Add a DataImporter class which takes all responsibility for importing and validating data
1.0
Decompose the Data class into smaller classes - At the moment, the Data class is doing too much. - [x] Create a DataFile class, which holds data and plot methods - [x] Get Data class to use (maybe generate) DataFile objects. - [x] Add a DataImporter class which takes all responsibility for importing and validating data
non_priority
decompose the data class into smaller classes at the moment the data class is doing too much create a datafile class which holds data and plot methods get data class to use maybe generate datafile objects add a dataimporter class which takes all responsibility for importing and validating data
0
300,988
26,008,593,404
IssuesEvent
2022-12-20 22:08:20
hashgraph/hedera-mirror-node
https://api.github.com/repos/hashgraph/hedera-mirror-node
opened
Acceptance tests create operator account fallback causes later test case failures
enhancement test
### Problem We introduced the config property `createOperatorAccount` to create a normal account (not exempt from fees) as the operator account to run the acceptance tests, in order to work around the issue that the default operator account may be exempt from fees, which in turn causes the check of the number of crypto transfers in a transaction to fail. However, the logic also falls back to the default operator account in case creating such a normal account fails, which would again cause later test case failures for the reason mentioned above. We should - research why account create fails and make it more robust - figure out a better approach instead of fallback ### Solution as in the problem description ### Alternatives _No response_
1.0
Acceptance tests create operator account fallback cause later test case failure - ### Problem We introduced the config property `createOperatorAccount` to create a normal account (not exempt from fees) as the operator account to run the acceptance tests, in order to work around the issue that the default operator account may be exempt fees which in turn causing the check of the number of crypto transfers in a transaction to fail. However, the logic also falls back to the default operator account in case creating such normal account fails, this would again cause later test case failure for the reason mentioned above. We should - research why account create fails and make it more robust - figure out a better approach instead of fallback ### Solution as in the problem description ### Alternatives _No response_
non_priority
acceptance tests create operator account fallback cause later test case failure problem we introduced the config property createoperatoraccount to create a normal account not exempt from fees as the operator account to run the acceptance tests in order to work around the issue that the default operator account may be exempt fees which in turn causing the check of the number of crypto transfers in a transaction to fail however the logic also falls back to the default operator account in case creating such normal account fails this would again cause later test case failure for the reason mentioned above we should research why account create fails and make it more robust figure out a better approach instead of fallback solution as in the problem description alternatives no response
0
352,090
25,047,488,691
IssuesEvent
2022-11-05 12:53:39
AY2223S1-CS2103-W14-1/tp
https://api.github.com/repos/AY2223S1-CS2103-W14-1/tp
closed
[PE-D][Tester C] `add -p` command format in User Guide is missing `h/PROPERTY_TYPE`
type.Documentation PE-D.must-fix
As shown in the image below, the format of the command is missing `h/PROPERTY_TYPE`, although the argument is compulsory. ![image.png](https://raw.githubusercontent.com/peppapighs/ped/main/files/d1724685-fa0c-44b4-943b-4dc150feb72c.png) <!--session: 1666941683639-2c47a0e6-9266-459f-b30e-47ba3c034af2--><!--Version: Web v3.4.4--> ------------- Labels: `severity.VeryLow` `type.DocumentationBug` original: peppapighs/ped#1
1.0
[PE-D][Tester C] `add -p` command format in User Guide is missing `h/PROPERTY_TYPE` - As shown in the image belows, the format of the command is missing `h/PROPERTY_TYPE`, although the argument is compulsory. ![image.png](https://raw.githubusercontent.com/peppapighs/ped/main/files/d1724685-fa0c-44b4-943b-4dc150feb72c.png) <!--session: 1666941683639-2c47a0e6-9266-459f-b30e-47ba3c034af2--><!--Version: Web v3.4.4--> ------------- Labels: `severity.VeryLow` `type.DocumentationBug` original: peppapighs/ped#1
non_priority
add p command format in user guide is missing h property type as shown in the image belows the format of the command is missing h property type although the argument is compulsory labels severity verylow type documentationbug original peppapighs ped
0
158,170
20,009,256,491
IssuesEvent
2022-02-01 02:55:28
Tim-sandbox/EasyBuggyLocal
https://api.github.com/repos/Tim-sandbox/EasyBuggyLocal
opened
CVE-2022-23437 (High) detected in xercesImpl-2.8.0.jar
security vulnerability
## CVE-2022-23437 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xercesImpl-2.8.0.jar</b></p></summary> <p>Xerces2 is the next generation of high performance, fully compliant XML parsers in the Apache Xerces family. This new version of Xerces introduces the Xerces Native Interface (XNI), a complete framework for building parser components and configurations that is extremely modular and easy to program.</p> <p>Library home page: <a href="http://xerces.apache.org/xerces2-j">http://xerces.apache.org/xerces2-j</a></p> <p>Path to dependency file: /pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/xerces/xercesImpl/2.8.0/xercesImpl-2.8.0.jar,/target/easybuggy-1-SNAPSHOT/WEB-INF/lib/xercesImpl-2.8.0.jar,/.extract/webapps/ROOT/WEB-INF/lib/xercesImpl-2.8.0.jar</p> <p> Dependency Hierarchy: - :x: **xercesImpl-2.8.0.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> There's a vulnerability within the Apache Xerces Java (XercesJ) XML parser when handling specially crafted XML document payloads. This causes, the XercesJ XML parser to wait in an infinite loop, which may sometimes consume system resources for prolonged duration. This vulnerability is present within XercesJ version 2.12.1 and the previous versions. 
<p>Publish Date: 2022-01-24 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23437>CVE-2022-23437</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-h65f-jvqw-m9fj">https://github.com/advisories/GHSA-h65f-jvqw-m9fj</a></p> <p>Release Date: 2022-01-24</p> <p>Fix Resolution: xerces:xercesImpl:2.12.2</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"xerces","packageName":"xercesImpl","packageVersion":"2.8.0","packageFilePaths":["/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"xerces:xercesImpl:2.8.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"xerces:xercesImpl:2.12.2","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-23437","vulnerabilityDetails":"There\u0027s a vulnerability within the Apache Xerces Java (XercesJ) XML parser when handling specially crafted XML document payloads. This causes, the XercesJ XML parser to wait in an infinite loop, which may sometimes consume system resources for prolonged duration. 
This vulnerability is present within XercesJ version 2.12.1 and the previous versions.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23437","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2022-23437 (High) detected in xercesImpl-2.8.0.jar - ## CVE-2022-23437 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xercesImpl-2.8.0.jar</b></p></summary> <p>Xerces2 is the next generation of high performance, fully compliant XML parsers in the Apache Xerces family. This new version of Xerces introduces the Xerces Native Interface (XNI), a complete framework for building parser components and configurations that is extremely modular and easy to program.</p> <p>Library home page: <a href="http://xerces.apache.org/xerces2-j">http://xerces.apache.org/xerces2-j</a></p> <p>Path to dependency file: /pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/xerces/xercesImpl/2.8.0/xercesImpl-2.8.0.jar,/target/easybuggy-1-SNAPSHOT/WEB-INF/lib/xercesImpl-2.8.0.jar,/.extract/webapps/ROOT/WEB-INF/lib/xercesImpl-2.8.0.jar</p> <p> Dependency Hierarchy: - :x: **xercesImpl-2.8.0.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> There's a vulnerability within the Apache Xerces Java (XercesJ) XML parser when handling specially crafted XML document payloads. This causes, the XercesJ XML parser to wait in an infinite loop, which may sometimes consume system resources for prolonged duration. This vulnerability is present within XercesJ version 2.12.1 and the previous versions. 
<p>Publish Date: 2022-01-24 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23437>CVE-2022-23437</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-h65f-jvqw-m9fj">https://github.com/advisories/GHSA-h65f-jvqw-m9fj</a></p> <p>Release Date: 2022-01-24</p> <p>Fix Resolution: xerces:xercesImpl:2.12.2</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"xerces","packageName":"xercesImpl","packageVersion":"2.8.0","packageFilePaths":["/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"xerces:xercesImpl:2.8.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"xerces:xercesImpl:2.12.2","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-23437","vulnerabilityDetails":"There\u0027s a vulnerability within the Apache Xerces Java (XercesJ) XML parser when handling specially crafted XML document payloads. This causes, the XercesJ XML parser to wait in an infinite loop, which may sometimes consume system resources for prolonged duration. 
This vulnerability is present within XercesJ version 2.12.1 and the previous versions.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23437","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_priority
cve high detected in xercesimpl jar cve high severity vulnerability vulnerable library xercesimpl jar is the next generation of high performance fully compliant xml parsers in the apache xerces family this new version of xerces introduces the xerces native interface xni a complete framework for building parser components and configurations that is extremely modular and easy to program library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository xerces xercesimpl xercesimpl jar target easybuggy snapshot web inf lib xercesimpl jar extract webapps root web inf lib xercesimpl jar dependency hierarchy x xercesimpl jar vulnerable library found in base branch master vulnerability details there s a vulnerability within the apache xerces java xercesj xml parser when handling specially crafted xml document payloads this causes the xercesj xml parser to wait in an infinite loop which may sometimes consume system resources for prolonged duration this vulnerability is present within xercesj version and the previous versions publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution xerces xercesimpl rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree xerces xercesimpl isminimumfixversionavailable true minimumfixversion xerces xercesimpl isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails there a vulnerability within the apache xerces java xercesj xml parser when handling specially crafted xml document payloads this causes the 
xercesj xml parser to wait in an infinite loop which may sometimes consume system resources for prolonged duration this vulnerability is present within xercesj version and the previous versions vulnerabilityurl
0
31,700
6,587,219,361
IssuesEvent
2017-09-13 20:13:38
CenturyLinkCloud/mdw
https://api.github.com/repos/CenturyLinkCloud/mdw
opened
Javadocs are not produced correctly by the MDW build
defect
The javadocs on our site are terribly out-of-date: http://centurylinkcloud.github.io/mdw/docs/javadoc/ We should include in our build procedure a step to update these each time we publish a formal build. Also, the mdw-javadoc jar on maven-central only includes docs for mdw-hub: http://repo1.maven.org/maven2/com/centurylink/mdw/mdw/6.0.06/mdw-6.0.06-javadoc.jar
1.0
Javadocs are not produced correctly by the MDW build - The javadocs on our site are terribly out-of-date: http://centurylinkcloud.github.io/mdw/docs/javadoc/ We should include in our build procedure a step to update these each time we publish a formal build. Also, the mdw-javadoc jar on maven-central only includes docs for mdw-hub: http://repo1.maven.org/maven2/com/centurylink/mdw/mdw/6.0.06/mdw-6.0.06-javadoc.jar
non_priority
javadocs are not produced correctly by the mdw build the javadocs on our site are terribly out of date we should include in our build procedure a step to update these each time we publish a formal build also the mdw javadoc jar on maven central only includes docs for mdw hub
0
58,421
14,274,431,889
IssuesEvent
2020-11-22 03:55:13
Ghost-chu/QuickShop-Reremake
https://api.github.com/repos/Ghost-chu/QuickShop-Reremake
closed
CVE-2020-9546 (High) detected in jackson-databind-2.3.4.jar - autoclosed
Bug security vulnerability
## CVE-2020-9546 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.3.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Path to dependency file: QuickShop-Reremake/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.3.4/jackson-databind-2.3.4.jar</p> <p> Dependency Hierarchy: - jenkins-client-0.3.8.jar (Root Library) - :x: **jackson-databind-2.3.4.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Ghost-chu/QuickShop-Reremake/commit/8ee7d2b71191adf05b366e0787aec78ffbdad102">8ee7d2b71191adf05b366e0787aec78ffbdad102</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.hadoop.shaded.com.zaxxer.hikari.HikariConfig (aka shaded hikari-config). <p>Publish Date: 2020-03-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9546>CVE-2020-9546</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9546">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9546</a></p> <p>Release Date: 2020-03-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.10.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-9546 (High) detected in jackson-databind-2.3.4.jar - autoclosed - ## CVE-2020-9546 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.3.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Path to dependency file: QuickShop-Reremake/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.3.4/jackson-databind-2.3.4.jar</p> <p> Dependency Hierarchy: - jenkins-client-0.3.8.jar (Root Library) - :x: **jackson-databind-2.3.4.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Ghost-chu/QuickShop-Reremake/commit/8ee7d2b71191adf05b366e0787aec78ffbdad102">8ee7d2b71191adf05b366e0787aec78ffbdad102</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.hadoop.shaded.com.zaxxer.hikari.HikariConfig (aka shaded hikari-config). 
<p>Publish Date: 2020-03-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9546>CVE-2020-9546</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9546">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9546</a></p> <p>Release Date: 2020-03-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.10.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api path to dependency file quickshop reremake pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy jenkins client jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache hadoop shaded com zaxxer hikari hikariconfig aka shaded hikari config publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with whitesource
0
122,609
17,760,803,948
IssuesEvent
2021-08-29 16:59:00
MidnightBSD/src
https://api.github.com/repos/MidnightBSD/src
opened
CVE-2018-17942 (High) detected in non-gnucvs-1.12.13
security vulnerability
## CVE-2018-17942 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>non-gnucvs-1.12.13</b></p></summary> <p> <p>Gnu Distributions</p> <p>Library home page: <a href=https://ftp.gnu.org/gnu/non-gnu?wsslib=non-gnu>https://ftp.gnu.org/gnu/non-gnu?wsslib=non-gnu</a></p> <p>Found in HEAD commit: <a href="https://github.com/MidnightBSD/src/commit/816463d989cc5839c1cca2efb5bf2503408507fb">816463d989cc5839c1cca2efb5bf2503408507fb</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The convert_to_decimal function in vasnprintf.c in Gnulib before 2018-09-23 has a heap-based buffer overflow because memory is not allocated for a trailing '\0' character during %f processing. <p>Publish Date: 2018-10-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-17942>CVE-2018-17942</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/coreutils/gnulib/commit/278b4175c9d7dd47c1a3071554aac02add3b3c35">https://github.com/coreutils/gnulib/commit/278b4175c9d7dd47c1a3071554aac02add3b3c35</a></p> <p>Release Date: 2018-10-03</p> <p>Fix Resolution: 2018-09-23</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-17942 (High) detected in non-gnucvs-1.12.13 - ## CVE-2018-17942 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>non-gnucvs-1.12.13</b></p></summary> <p> <p>Gnu Distributions</p> <p>Library home page: <a href=https://ftp.gnu.org/gnu/non-gnu?wsslib=non-gnu>https://ftp.gnu.org/gnu/non-gnu?wsslib=non-gnu</a></p> <p>Found in HEAD commit: <a href="https://github.com/MidnightBSD/src/commit/816463d989cc5839c1cca2efb5bf2503408507fb">816463d989cc5839c1cca2efb5bf2503408507fb</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The convert_to_decimal function in vasnprintf.c in Gnulib before 2018-09-23 has a heap-based buffer overflow because memory is not allocated for a trailing '\0' character during %f processing. <p>Publish Date: 2018-10-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-17942>CVE-2018-17942</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/coreutils/gnulib/commit/278b4175c9d7dd47c1a3071554aac02add3b3c35">https://github.com/coreutils/gnulib/commit/278b4175c9d7dd47c1a3071554aac02add3b3c35</a></p> <p>Release Date: 2018-10-03</p> <p>Fix Resolution: 2018-09-23</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in non gnucvs cve high severity vulnerability vulnerable library non gnucvs gnu distributions library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details the convert to decimal function in vasnprintf c in gnulib before has a heap based buffer overflow because memory is not allocated for a trailing character during f processing publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
42,922
11,101,141,658
IssuesEvent
2019-12-16 20:47:05
microsoft/MixedRealityToolkit-Unity
https://api.github.com/repos/microsoft/MixedRealityToolkit-Unity
closed
Build window defaults to 2017 if both 2017 and 2019 are installed
Bug Build / Tools Build Window
## Describe the bug If both VS versions are installed, the build window checks for and uses 2017 first. This can cause a build issue if Unity builds the solution expecting to use VS 2019. As a workaround, the VS version can be explicitly set in Unity's build Window, but it should be configurable from the MRTK's as well. ## To reproduce 1. Have both VS 2017 and 2019 installed. 2. Have Unity's build window set to VS 2019 or latest installed 3. Try to build the appx with the MRTK build window 4. Observe exception regarding missing build tools ## Expected behavior A built appx!
2.0
Build window defaults to 2017 if both 2017 and 2019 are installed - ## Describe the bug If both VS versions are installed, the build window checks for and uses 2017 first. This can cause a build issue if Unity builds the solution expecting to use VS 2019. As a workaround, the VS version can be explicitly set in Unity's build Window, but it should be configurable from the MRTK's as well. ## To reproduce 1. Have both VS 2017 and 2019 installed. 2. Have Unity's build window set to VS 2019 or latest installed 3. Try to build the appx with the MRTK build window 4. Observe exception regarding missing build tools ## Expected behavior A built appx!
non_priority
build window defaults to if both and are installed describe the bug if both vs versions are installed the build window checks for and uses first this can cause a build issue if unity builds the solution expecting to use vs as a workaround the vs version can be explicitly set in unity s build window but it should be configurable from the mrtk s as well to reproduce have both vs and installed have unity s build window set to vs or latest installed try to build the appx with the mrtk build window observe exception regarding missing build tools expected behavior a built appx
0
4,893
2,886,848,816
IssuesEvent
2015-06-12 11:18:46
kodi-pvr/pvr.vbox
https://api.github.com/repos/kodi-pvr/pvr.vbox
closed
Update architecture README subsection
documentation
At least the part about the timeshift buffer is partly wrong, the implementation has already been moved to its own namespace. Probably some other things need updating as well.
1.0
Update architecture README subsection - At least the part about the timeshift buffer is partly wrong, the implementation has already been moved to its own namespace. Probably some other things need updating as well.
non_priority
update architecture readme subsection at least the part about the timeshift buffer is partly wrong the implementation has already been moved to its own namespace probably some other things need updating as well
0
130,669
12,452,109,550
IssuesEvent
2020-05-27 11:42:31
FRRouting/frr
https://api.github.com/repos/FRRouting/frr
closed
how can I use protobuf message.
documentation triage
The default FPM message format is netlink. I configured the FPM message format to protobuf, but it has no effect. I configured zebra like this: ```systemctl status frr ● frr.service - FRRouting Loaded: loaded (/usr/lib/systemd/system/frr.service; disabled; vendor preset: disabled) Active: active (running) since Thu 2020-05-21 17:47:58 CST; 9min ago Docs: https://frrouting.readthedocs.io/en/latest/setup.html Process: 17531 ExecStop=/usr/lib/frr/frrinit.sh stop (code=exited, status=0/SUCCESS) Process: 17552 ExecStart=/usr/lib/frr/frrinit.sh start (code=exited, status=0/SUCCESS) CGroup: /system.slice/frr.service ├─17558 /usr/lib/frr/watchfrr -d zebra bgpd staticd ├─17573 /usr/lib/frr/zebra -d -A 127.0.0.1 -s 90000000 -M fpm protobuf ├─17577 /usr/lib/frr/bgpd -d -A 127.0.0.1 └─17583 /usr/lib/frr/staticd -d -A 127.0.0.1 ``` ``` cat /etc/frr/daemons zebra_options=" -A 127.0.0.1 -s 90000000 -M fpm protobuf" ``` ``` rpm -qa|grep frr frr-7.2-01.el7.centos.x86_64 ``` Is it correct to configure the daemons file like this?
1.0
how can I use protobuf message. - The default FPM message format is netlink. I configured the FPM message format to protobuf, but it has no effect. I configured zebra like this: ```systemctl status frr ● frr.service - FRRouting Loaded: loaded (/usr/lib/systemd/system/frr.service; disabled; vendor preset: disabled) Active: active (running) since Thu 2020-05-21 17:47:58 CST; 9min ago Docs: https://frrouting.readthedocs.io/en/latest/setup.html Process: 17531 ExecStop=/usr/lib/frr/frrinit.sh stop (code=exited, status=0/SUCCESS) Process: 17552 ExecStart=/usr/lib/frr/frrinit.sh start (code=exited, status=0/SUCCESS) CGroup: /system.slice/frr.service ├─17558 /usr/lib/frr/watchfrr -d zebra bgpd staticd ├─17573 /usr/lib/frr/zebra -d -A 127.0.0.1 -s 90000000 -M fpm protobuf ├─17577 /usr/lib/frr/bgpd -d -A 127.0.0.1 └─17583 /usr/lib/frr/staticd -d -A 127.0.0.1 ``` ``` cat /etc/frr/daemons zebra_options=" -A 127.0.0.1 -s 90000000 -M fpm protobuf" ``` ``` rpm -qa|grep frr frr-7.2-01.el7.centos.x86_64 ``` Is it correct to configure the daemons file like this?
non_priority
how can i use protobuf message the default fpm message format is netlink i configured the fpm message format is protobuf but there is no use i configred zebra like this systemctl status frr ● frr service frrouting loaded loaded usr lib systemd system frr service disabled vendor preset disabled active active running since thu cst ago docs process execstop usr lib frr frrinit sh stop code exited status success process execstart usr lib frr frrinit sh start code exited status success cgroup system slice frr service ├─ usr lib frr watchfrr d zebra bgpd staticd ├─ usr lib frr zebra d a s m fpm protobuf ├─ usr lib frr bgpd d a └─ usr lib frr staticd d a cat etc frr daemons zebra options a s m fpm protobuf rpm qa grep frr frr centos daemons file configure like this is right
0
133,932
29,670,585,651
IssuesEvent
2023-06-11 11:03:03
dtcxzyw/llvm-ci
https://api.github.com/repos/dtcxzyw/llvm-ci
closed
Regressions Report [rv64gcv-O3-thinlto] April 25th 2023, 6:06:24 am
regression codegen transform reasonable
## Metadata + Workflow URL: https://github.com/dtcxzyw/llvm-ci/actions/runs/4793517267 ## Change Logs from 1245a1ed07bab52fd4a5501f50651d65f43b9971 to 65eedcebdc03052959508911417bac548009652a [65eedcebdc03052959508911417bac548009652a](https://github.com/llvm/llvm-project/commit/65eedcebdc03052959508911417bac548009652a) [mlir] detensorize: don&#x27;t accidentally convert function entry blocks [8117f58adc97b4e1bb8720d2113a7e092260131b](https://github.com/llvm/llvm-project/commit/8117f58adc97b4e1bb8720d2113a7e092260131b) [gn build] Port 8a3950510f81 [8a3950510f819308f7ead16c339484147c69c84a](https://github.com/llvm/llvm-project/commit/8a3950510f819308f7ead16c339484147c69c84a) [RISCV] Support scalar/fix-length vector NTLH intrinsic with different domain [e66c2db7996ed0ce8cd27548a623ce62246be33b](https://github.com/llvm/llvm-project/commit/e66c2db7996ed0ce8cd27548a623ce62246be33b) -Wframe-larger-than=: improve error with an invalid argument [f40d186d4a3a448bfb4233c52658a70e71ae04f1](https://github.com/llvm/llvm-project/commit/f40d186d4a3a448bfb4233c52658a70e71ae04f1) ValueTracking: Add ordered negative handling for fmul to computeKnownFPClass [7aeec64215cdbb2420756808a902a9e6807ecb30](https://github.com/llvm/llvm-project/commit/7aeec64215cdbb2420756808a902a9e6807ecb30) ValueTracking: Handle fptrunc_round in computeKnownFPClass [301f4d884f6a73ff3e7354dfef1de42dcb9e33c4](https://github.com/llvm/llvm-project/commit/301f4d884f6a73ff3e7354dfef1de42dcb9e33c4) [bazel][mlir] Build Debug/BreakpointManagers only from a single target [faa2d69e462146543e168cc6c36a28a7e238ecce](https://github.com/llvm/llvm-project/commit/faa2d69e462146543e168cc6c36a28a7e238ecce) [RISCV] Ensure extract_vector_elt has a single use in combineBinOpToReduce. [463412e930b248dab06e0f51d92a8cf0e71072fc](https://github.com/llvm/llvm-project/commit/463412e930b248dab06e0f51d92a8cf0e71072fc) [RISCV] Add test case showing duplicated reduction due to missing one use check. 
NFC [09bd5ae49ea84c734cec35ec8555b16edb13c7b4](https://github.com/llvm/llvm-project/commit/09bd5ae49ea84c734cec35ec8555b16edb13c7b4) [mlir][tosa] Fix `tosa.reshape` folder for quantized constants [a1e89710d9fb8d4eb90083d476bd5d77215a960e](https://github.com/llvm/llvm-project/commit/a1e89710d9fb8d4eb90083d476bd5d77215a960e) [RISCV] Strengthen INSERT_SUBVECTOR check in combineBinOpToReduce. [c95533a7be2858893ec32b8abaa37a2d912ebe63](https://github.com/llvm/llvm-project/commit/c95533a7be2858893ec32b8abaa37a2d912ebe63) [gn build] Port d45fae601067 [c49f850d55221e84c675f03c68fec2801674a4d3](https://github.com/llvm/llvm-project/commit/c49f850d55221e84c675f03c68fec2801674a4d3) Migrate `IIT_Info` into `Intrinsics.td` [ddaf085e7bcb903d5ae1cafc4667b8c3d302897e](https://github.com/llvm/llvm-project/commit/ddaf085e7bcb903d5ae1cafc4667b8c3d302897e) Fully generate `MachineValueType.h` [45b820d5a11a673124d78efd5907f0da8ee3bf41](https://github.com/llvm/llvm-project/commit/45b820d5a11a673124d78efd5907f0da8ee3bf41) ValueTypes.td: Reorganize ValueType [3c853c845ad6ff1591f60a909fa3c7d293c27b49](https://github.com/llvm/llvm-project/commit/3c853c845ad6ff1591f60a909fa3c7d293c27b49) ValueTypes.td: Introduce VTAny as `isOverloaded = true` [28cc956054bd4e618513eefbe3db50b6df49b00f](https://github.com/llvm/llvm-project/commit/28cc956054bd4e618513eefbe3db50b6df49b00f) SupportTests/MachineValueType.h: Catch up llvmorg-17-init-8340-gb68b94f6f40b [d45fae601067f03e8b4a5a59507ad3aaf7613ac4](https://github.com/llvm/llvm-project/commit/d45fae601067f03e8b4a5a59507ad3aaf7613ac4) Move CodeGen/LowLevelType =&gt; CodeGen/LowLevelTypeUtils [b63b2c2350ad1af9131fd95bb94c642d912bd4cf](https://github.com/llvm/llvm-project/commit/b63b2c2350ad1af9131fd95bb94c642d912bd4cf) Reland &quot;[-Wunsafe-buffer-usage] Bug fix: Handles the assertion violations for code within macros&quot; [84ec1f7725d4f4575474b59467e598d7c5528a4e](https://github.com/llvm/llvm-project/commit/84ec1f7725d4f4575474b59467e598d7c5528a4e) Revert 
"[-Wunsafe-buffer-usage] Bug fix: Handles the assertion violations for code within macros" [4dbaaf4b95fcfc4ca88ff59efd69e32801383e64](https://github.com/llvm/llvm-project/commit/4dbaaf4b95fcfc4ca88ff59efd69e32801383e64) [libc][Obvious] Add NO_RUN_POSTBUILD to sqrtf exhaustive test. [1d097ad73b3fe38ff301f471356d15f093a2dbef](https://github.com/llvm/llvm-project/commit/1d097ad73b3fe38ff301f471356d15f093a2dbef) [clang] Fix -Wimplicit-fallthrough in UnsafeBufferUsage.cpp [NFC] [7b37f732783e9b8270cbb63ff95618594dbe011b](https://github.com/llvm/llvm-project/commit/7b37f732783e9b8270cbb63ff95618594dbe011b) [libc][Obvious] Enable hermetic tests only under full build. [a971bc38cee892fb490b806eec74f2188fb70416](https://github.com/llvm/llvm-project/commit/a971bc38cee892fb490b806eec74f2188fb70416) Move DBG_VALUE's that depend on loads to after a load if the load is moved due to the pre register allocation ld/st optimization pass [9bd0db80784e30d40a4a65f1b47109c833f05b54](https://github.com/llvm/llvm-project/commit/9bd0db80784e30d40a4a65f1b47109c833f05b54) [-Wunsafe-buffer-usage] Bug fix: Handles the assertion violations for code within macros [1e8960c7a588c117bf703221ce656aeb034219c5](https://github.com/llvm/llvm-project/commit/1e8960c7a588c117bf703221ce656aeb034219c5) [libc] Add rule named `add_libc_hermetic_test` which adds a hermetic test. [e831f73ac034f3a43aa9dcccf7ceb7b269089ef6](https://github.com/llvm/llvm-project/commit/e831f73ac034f3a43aa9dcccf7ceb7b269089ef6) [libc] Run all unit tests, irrespective of whether they belong to a test suite. [62439d54fecf9c08ce5dc799d1d44562da884e88](https://github.com/llvm/llvm-project/commit/62439d54fecf9c08ce5dc799d1d44562da884e88) [NVPTX] Unforce minimum alignment of 4 for byval arguments of device-side functions. 
[99cfaf0d5ed68d5d4e292fc87a10b1bb26201787](https://github.com/llvm/llvm-project/commit/99cfaf0d5ed68d5d4e292fc87a10b1bb26201787) Fix MLIR build when shared library mode is enabled [5f2b0892d5b732a822728f96a57144f6542c155e](https://github.com/llvm/llvm-project/commit/5f2b0892d5b732a822728f96a57144f6542c155e) [bazel][mlir] BreakpointManager fixes for 7f069f5ef4fe [23df67bdeee241dc5c32d7e7627cae99dd54fc55](https://github.com/llvm/llvm-project/commit/23df67bdeee241dc5c32d7e7627cae99dd54fc55) [libc][Obvious] Add include.stdlib as missing dep for CPP.string. [38ecb9767c1485abe0eb210ceeb827a884bc55c9](https://github.com/llvm/llvm-project/commit/38ecb9767c1485abe0eb210ceeb827a884bc55c9) [NFC][clang] Fix Coverity bugs with AUTO_CAUSES_COPY [6dc83c156e6d7290163daeae7ac5f9b0b6fe6292](https://github.com/llvm/llvm-project/commit/6dc83c156e6d7290163daeae7ac5f9b0b6fe6292) [MLIR] Fix build, obivous typos in cca510640 [e464b549a9fd2c06898b2149896c31147e2301ad](https://github.com/llvm/llvm-project/commit/e464b549a9fd2c06898b2149896c31147e2301ad) [MLIR][doc] Minor fixes the Action documentation [9c8db444bc859294401f211a7c8f4704a4edeb88](https://github.com/llvm/llvm-project/commit/9c8db444bc859294401f211a7c8f4704a4edeb88) Remove deprecated `preloadDialectInContext` flag for MlirOptMain that has been deprecated for 2 years [ffd6f6b91a3863bed19f1c1e2ae5efea80db566e](https://github.com/llvm/llvm-project/commit/ffd6f6b91a3863bed19f1c1e2ae5efea80db566e) Remove deprecated entry point for MlirOptMain [cca510640bf0aa3ef356a8ad51652de16b5a557a](https://github.com/llvm/llvm-project/commit/cca510640bf0aa3ef356a8ad51652de16b5a557a) Refactor the mlir-opt command line options related to debugging in a helper [b0528a53eab403194b713995e4b22d4cff12818b](https://github.com/llvm/llvm-project/commit/b0528a53eab403194b713995e4b22d4cff12818b) Add user doc on the website for the Action framework 
[1020150e7a6f6d6f833c232125c5ab817c03c76b](https://github.com/llvm/llvm-project/commit/1020150e7a6f6d6f833c232125c5ab817c03c76b) Add a GDB/LLDB interface for interactive debugging of MLIR Actions [1869a9c225c7ed411a15592d21b277716b65a374](https://github.com/llvm/llvm-project/commit/1869a9c225c7ed411a15592d21b277716b65a374) [LV] Use the known trip count when costing non-tail folded VFs [2bca3f2a92a506997914f335396e124c0a5f87dd](https://github.com/llvm/llvm-project/commit/2bca3f2a92a506997914f335396e124c0a5f87dd) Revert "[OpenMP] Fix GCC build issues and restore "Additional APIs used by the" [075202d126dfba6fd1503926808a2fcafab1714a](https://github.com/llvm/llvm-project/commit/075202d126dfba6fd1503926808a2fcafab1714a) [X86 isel] Fix permute mask calculation in lowerShuffleAsUNPCKAndPermute [7090c102731192d5abafb7e0b2b49adb4912efae](https://github.com/llvm/llvm-project/commit/7090c102731192d5abafb7e0b2b49adb4912efae) [libc] Adjust the `cpp:function` type to support lambdas [50445dff43037014a23eb38b1f50bb698e64ffcf](https://github.com/llvm/llvm-project/commit/50445dff43037014a23eb38b1f50bb698e64ffcf) [libc] Add more utility functions for the GPU [5084ba395e487adee67ba38cc5c68ff7e052e37c](https://github.com/llvm/llvm-project/commit/5084ba395e487adee67ba38cc5c68ff7e052e37c) [clang] Remove workaround for old LLVM_ENABLE_PROJECTS=libcxx build [21bff9ca42e4735a52aa1e981b1ccd0d3b274b34](https://github.com/llvm/llvm-project/commit/21bff9ca42e4735a52aa1e981b1ccd0d3b274b34) [NFC][flang] Fixed typo in AVOID_NATIVE_UINT128_T macro. Differential Revision: https://reviews.llvm.org/D149097 [74f00516e5ce79a367acfd1ed1c74fa15aff69c7](https://github.com/llvm/llvm-project/commit/74f00516e5ce79a367acfd1ed1c74fa15aff69c7) [DFSAN] Add support for strsep. 
## Regressions (Size) |Name|Baseline MD5|Current MD5|Baseline Size|Current Size|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Applications/ClamAV/clamscan|906c792fa0ebed3c156bfba5978324c9|69d651dbf73b4716ee3df16c589ae8f5|481126|481414|1.001| ## Regressions (Time) |Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Benchmarks/lzbench/lzbench|17fc212f529e74366c106e5c86ea75e8|c78e252c48ddccabc86e7b71ae54327a|481.517290868|481.517867029|1.000| ## Differences (Size) |Name|Baseline MD5|Current MD5|Baseline Size|Current Size|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Applications/ClamAV/clamscan|906c792fa0ebed3c156bfba5978324c9|69d651dbf73b4716ee3df16c589ae8f5|481126|481414|1.001| |MultiSource/Benchmarks/lzbench/lzbench|17fc212f529e74366c106e5c86ea75e8|c78e252c48ddccabc86e7b71ae54327a|2030104|2029000|0.999| |MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|b3d124cf8eb41efe2f0a530a8c096f56|bf4bb59c0a5f31b393a6ce72af6c8b75|70018|69882|0.998| |GeoMeans|N/A|N/A|408942.523|408684.998|0.999| ## Differences (Time) |Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Benchmarks/lzbench/lzbench|17fc212f529e74366c106e5c86ea75e8|c78e252c48ddccabc86e7b71ae54327a|481.517290868|481.517867029|1.000| |MultiSource/Applications/ClamAV/clamscan|906c792fa0ebed3c156bfba5978324c9|69d651dbf73b4716ee3df16c589ae8f5|0.475688167|0.475688111|1.000| |MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|b3d124cf8eb41efe2f0a530a8c096f56|bf4bb59c0a5f31b393a6ce72af6c8b75|0.014767564|0.014767336|1.000| |GeoMeans|N/A|N/A|1.501|1.501|1.000|
1.0
Regressions Report [rv64gcv-O3-thinlto] April 25th 2023, 6:06:24 am - ## Metadata + Workflow URL: https://github.com/dtcxzyw/llvm-ci/actions/runs/4793517267 ## Change Logs from 1245a1ed07bab52fd4a5501f50651d65f43b9971 to 65eedcebdc03052959508911417bac548009652a [65eedcebdc03052959508911417bac548009652a](https://github.com/llvm/llvm-project/commit/65eedcebdc03052959508911417bac548009652a) [mlir] detensorize: don't accidentally convert function entry blocks [8117f58adc97b4e1bb8720d2113a7e092260131b](https://github.com/llvm/llvm-project/commit/8117f58adc97b4e1bb8720d2113a7e092260131b) [gn build] Port 8a3950510f81 [8a3950510f819308f7ead16c339484147c69c84a](https://github.com/llvm/llvm-project/commit/8a3950510f819308f7ead16c339484147c69c84a) [RISCV] Support scalar/fix-length vector NTLH intrinsic with different domain [e66c2db7996ed0ce8cd27548a623ce62246be33b](https://github.com/llvm/llvm-project/commit/e66c2db7996ed0ce8cd27548a623ce62246be33b) -Wframe-larger-than=: improve error with an invalid argument [f40d186d4a3a448bfb4233c52658a70e71ae04f1](https://github.com/llvm/llvm-project/commit/f40d186d4a3a448bfb4233c52658a70e71ae04f1) ValueTracking: Add ordered negative handling for fmul to computeKnownFPClass [7aeec64215cdbb2420756808a902a9e6807ecb30](https://github.com/llvm/llvm-project/commit/7aeec64215cdbb2420756808a902a9e6807ecb30) ValueTracking: Handle fptrunc_round in computeKnownFPClass [301f4d884f6a73ff3e7354dfef1de42dcb9e33c4](https://github.com/llvm/llvm-project/commit/301f4d884f6a73ff3e7354dfef1de42dcb9e33c4) [bazel][mlir] Build Debug/BreakpointManagers only from a single target [faa2d69e462146543e168cc6c36a28a7e238ecce](https://github.com/llvm/llvm-project/commit/faa2d69e462146543e168cc6c36a28a7e238ecce) [RISCV] Ensure extract_vector_elt has a single use in combineBinOpToReduce. 
[463412e930b248dab06e0f51d92a8cf0e71072fc](https://github.com/llvm/llvm-project/commit/463412e930b248dab06e0f51d92a8cf0e71072fc) [RISCV] Add test case showing duplicated reduction due to missing one use check. NFC [09bd5ae49ea84c734cec35ec8555b16edb13c7b4](https://github.com/llvm/llvm-project/commit/09bd5ae49ea84c734cec35ec8555b16edb13c7b4) [mlir][tosa] Fix `tosa.reshape` folder for quantized constants [a1e89710d9fb8d4eb90083d476bd5d77215a960e](https://github.com/llvm/llvm-project/commit/a1e89710d9fb8d4eb90083d476bd5d77215a960e) [RISCV] Strengthen INSERT_SUBVECTOR check in combineBinOpToReduce. [c95533a7be2858893ec32b8abaa37a2d912ebe63](https://github.com/llvm/llvm-project/commit/c95533a7be2858893ec32b8abaa37a2d912ebe63) [gn build] Port d45fae601067 [c49f850d55221e84c675f03c68fec2801674a4d3](https://github.com/llvm/llvm-project/commit/c49f850d55221e84c675f03c68fec2801674a4d3) Migrate `IIT_Info` into `Intrinsics.td` [ddaf085e7bcb903d5ae1cafc4667b8c3d302897e](https://github.com/llvm/llvm-project/commit/ddaf085e7bcb903d5ae1cafc4667b8c3d302897e) Fully generate `MachineValueType.h` [45b820d5a11a673124d78efd5907f0da8ee3bf41](https://github.com/llvm/llvm-project/commit/45b820d5a11a673124d78efd5907f0da8ee3bf41) ValueTypes.td: Reorganize ValueType [3c853c845ad6ff1591f60a909fa3c7d293c27b49](https://github.com/llvm/llvm-project/commit/3c853c845ad6ff1591f60a909fa3c7d293c27b49) ValueTypes.td: Introduce VTAny as `isOverloaded = true` [28cc956054bd4e618513eefbe3db50b6df49b00f](https://github.com/llvm/llvm-project/commit/28cc956054bd4e618513eefbe3db50b6df49b00f) SupportTests/MachineValueType.h: Catch up llvmorg-17-init-8340-gb68b94f6f40b [d45fae601067f03e8b4a5a59507ad3aaf7613ac4](https://github.com/llvm/llvm-project/commit/d45fae601067f03e8b4a5a59507ad3aaf7613ac4) Move CodeGen/LowLevelType => CodeGen/LowLevelTypeUtils [b63b2c2350ad1af9131fd95bb94c642d912bd4cf](https://github.com/llvm/llvm-project/commit/b63b2c2350ad1af9131fd95bb94c642d912bd4cf) Reland 
"[-Wunsafe-buffer-usage] Bug fix: Handles the assertion violations for code within macros" [84ec1f7725d4f4575474b59467e598d7c5528a4e](https://github.com/llvm/llvm-project/commit/84ec1f7725d4f4575474b59467e598d7c5528a4e) Revert "[-Wunsafe-buffer-usage] Bug fix: Handles the assertion violations for code within macros" [4dbaaf4b95fcfc4ca88ff59efd69e32801383e64](https://github.com/llvm/llvm-project/commit/4dbaaf4b95fcfc4ca88ff59efd69e32801383e64) [libc][Obvious] Add NO_RUN_POSTBUILD to sqrtf exhaustive test. [1d097ad73b3fe38ff301f471356d15f093a2dbef](https://github.com/llvm/llvm-project/commit/1d097ad73b3fe38ff301f471356d15f093a2dbef) [clang] Fix -Wimplicit-fallthrough in UnsafeBufferUsage.cpp [NFC] [7b37f732783e9b8270cbb63ff95618594dbe011b](https://github.com/llvm/llvm-project/commit/7b37f732783e9b8270cbb63ff95618594dbe011b) [libc][Obvious] Enable hermetic tests only under full build. [a971bc38cee892fb490b806eec74f2188fb70416](https://github.com/llvm/llvm-project/commit/a971bc38cee892fb490b806eec74f2188fb70416) Move DBG_VALUE's that depend on loads to after a load if the load is moved due to the pre register allocation ld/st optimization pass [9bd0db80784e30d40a4a65f1b47109c833f05b54](https://github.com/llvm/llvm-project/commit/9bd0db80784e30d40a4a65f1b47109c833f05b54) [-Wunsafe-buffer-usage] Bug fix: Handles the assertion violations for code within macros [1e8960c7a588c117bf703221ce656aeb034219c5](https://github.com/llvm/llvm-project/commit/1e8960c7a588c117bf703221ce656aeb034219c5) [libc] Add rule named `add_libc_hermetic_test` which adds a hermetic test. [e831f73ac034f3a43aa9dcccf7ceb7b269089ef6](https://github.com/llvm/llvm-project/commit/e831f73ac034f3a43aa9dcccf7ceb7b269089ef6) [libc] Run all unit tests, irrespective of whether they belong to a test suite. 
[62439d54fecf9c08ce5dc799d1d44562da884e88](https://github.com/llvm/llvm-project/commit/62439d54fecf9c08ce5dc799d1d44562da884e88) [NVPTX] Unforce minimum alignment of 4 for byval arguments of device-side functions. [99cfaf0d5ed68d5d4e292fc87a10b1bb26201787](https://github.com/llvm/llvm-project/commit/99cfaf0d5ed68d5d4e292fc87a10b1bb26201787) Fix MLIR build when shared library mode is enabled [5f2b0892d5b732a822728f96a57144f6542c155e](https://github.com/llvm/llvm-project/commit/5f2b0892d5b732a822728f96a57144f6542c155e) [bazel][mlir] BreakpointManager fixes for 7f069f5ef4fe [23df67bdeee241dc5c32d7e7627cae99dd54fc55](https://github.com/llvm/llvm-project/commit/23df67bdeee241dc5c32d7e7627cae99dd54fc55) [libc][Obvious] Add include.stdlib as missing dep for CPP.string. [38ecb9767c1485abe0eb210ceeb827a884bc55c9](https://github.com/llvm/llvm-project/commit/38ecb9767c1485abe0eb210ceeb827a884bc55c9) [NFC][clang] Fix Coverity bugs with AUTO_CAUSES_COPY [6dc83c156e6d7290163daeae7ac5f9b0b6fe6292](https://github.com/llvm/llvm-project/commit/6dc83c156e6d7290163daeae7ac5f9b0b6fe6292) [MLIR] Fix build, obivous typos in cca510640 [e464b549a9fd2c06898b2149896c31147e2301ad](https://github.com/llvm/llvm-project/commit/e464b549a9fd2c06898b2149896c31147e2301ad) [MLIR][doc] Minor fixes the Action documentation [9c8db444bc859294401f211a7c8f4704a4edeb88](https://github.com/llvm/llvm-project/commit/9c8db444bc859294401f211a7c8f4704a4edeb88) Remove deprecated `preloadDialectInContext` flag for MlirOptMain that has been deprecated for 2 years [ffd6f6b91a3863bed19f1c1e2ae5efea80db566e](https://github.com/llvm/llvm-project/commit/ffd6f6b91a3863bed19f1c1e2ae5efea80db566e) Remove deprecated entry point for MlirOptMain [cca510640bf0aa3ef356a8ad51652de16b5a557a](https://github.com/llvm/llvm-project/commit/cca510640bf0aa3ef356a8ad51652de16b5a557a) Refactor the mlir-opt command line options related to debugging in a helper 
[b0528a53eab403194b713995e4b22d4cff12818b](https://github.com/llvm/llvm-project/commit/b0528a53eab403194b713995e4b22d4cff12818b) Add user doc on the website for the Action framework [1020150e7a6f6d6f833c232125c5ab817c03c76b](https://github.com/llvm/llvm-project/commit/1020150e7a6f6d6f833c232125c5ab817c03c76b) Add a GDB/LLDB interface for interactive debugging of MLIR Actions [1869a9c225c7ed411a15592d21b277716b65a374](https://github.com/llvm/llvm-project/commit/1869a9c225c7ed411a15592d21b277716b65a374) [LV] Use the known trip count when costing non-tail folded VFs [2bca3f2a92a506997914f335396e124c0a5f87dd](https://github.com/llvm/llvm-project/commit/2bca3f2a92a506997914f335396e124c0a5f87dd) Revert "[OpenMP] Fix GCC build issues and restore "Additional APIs used by the" [075202d126dfba6fd1503926808a2fcafab1714a](https://github.com/llvm/llvm-project/commit/075202d126dfba6fd1503926808a2fcafab1714a) [X86 isel] Fix permute mask calculation in lowerShuffleAsUNPCKAndPermute [7090c102731192d5abafb7e0b2b49adb4912efae](https://github.com/llvm/llvm-project/commit/7090c102731192d5abafb7e0b2b49adb4912efae) [libc] Adjust the `cpp:function` type to support lambdas [50445dff43037014a23eb38b1f50bb698e64ffcf](https://github.com/llvm/llvm-project/commit/50445dff43037014a23eb38b1f50bb698e64ffcf) [libc] Add more utility functions for the GPU [5084ba395e487adee67ba38cc5c68ff7e052e37c](https://github.com/llvm/llvm-project/commit/5084ba395e487adee67ba38cc5c68ff7e052e37c) [clang] Remove workaround for old LLVM_ENABLE_PROJECTS=libcxx build [21bff9ca42e4735a52aa1e981b1ccd0d3b274b34](https://github.com/llvm/llvm-project/commit/21bff9ca42e4735a52aa1e981b1ccd0d3b274b34) [NFC][flang] Fixed typo in AVOID_NATIVE_UINT128_T macro. Differential Revision: https://reviews.llvm.org/D149097 [74f00516e5ce79a367acfd1ed1c74fa15aff69c7](https://github.com/llvm/llvm-project/commit/74f00516e5ce79a367acfd1ed1c74fa15aff69c7) [DFSAN] Add support for strsep. 
## Regressions (Size) |Name|Baseline MD5|Current MD5|Baseline Size|Current Size|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Applications/ClamAV/clamscan|906c792fa0ebed3c156bfba5978324c9|69d651dbf73b4716ee3df16c589ae8f5|481126|481414|1.001| ## Regressions (Time) |Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Benchmarks/lzbench/lzbench|17fc212f529e74366c106e5c86ea75e8|c78e252c48ddccabc86e7b71ae54327a|481.517290868|481.517867029|1.000| ## Differences (Size) |Name|Baseline MD5|Current MD5|Baseline Size|Current Size|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Applications/ClamAV/clamscan|906c792fa0ebed3c156bfba5978324c9|69d651dbf73b4716ee3df16c589ae8f5|481126|481414|1.001| |MultiSource/Benchmarks/lzbench/lzbench|17fc212f529e74366c106e5c86ea75e8|c78e252c48ddccabc86e7b71ae54327a|2030104|2029000|0.999| |MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|b3d124cf8eb41efe2f0a530a8c096f56|bf4bb59c0a5f31b393a6ce72af6c8b75|70018|69882|0.998| |GeoMeans|N/A|N/A|408942.523|408684.998|0.999| ## Differences (Time) |Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio| |:--|:--:|:--:|--:|--:|--:| |MultiSource/Benchmarks/lzbench/lzbench|17fc212f529e74366c106e5c86ea75e8|c78e252c48ddccabc86e7b71ae54327a|481.517290868|481.517867029|1.000| |MultiSource/Applications/ClamAV/clamscan|906c792fa0ebed3c156bfba5978324c9|69d651dbf73b4716ee3df16c589ae8f5|0.475688167|0.475688111|1.000| |MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|b3d124cf8eb41efe2f0a530a8c096f56|bf4bb59c0a5f31b393a6ce72af6c8b75|0.014767564|0.014767336|1.000| |GeoMeans|N/A|N/A|1.501|1.501|1.000|
non_priority
regressions report april am metadata workflow url change logs from to detensorize don t accidentally convert function entry blocks port support scalar fix length vector ntlh intrinsic with different domain wframe larger than improve error with an invalid argument valuetracking add ordered negative handling for fmul to computeknownfpclass valuetracking handle fptrunc round in computeknownfpclass build debug breakpointmanagers only from a single target ensure extract vector elt has a single use in combinebinoptoreduce add test case showing duplicated reduction due to missing one use check nfc fix tosa reshape folder for quantized constants strengthen insert subvector check in combinebinoptoreduce port migrate iit info into intrinsics td fully generate machinevaluetype h valuetypes td reorganize valuetype valuetypes td introduce vtany as isoverloaded true supporttests machinevaluetype h catch up llvmorg init move codegen lowleveltype gt codegen lowleveltypeutils reland quot bug fix handles the assertion violations for code within macros quot revert quot bug fix handles the assertion violations for code within macros quot add no run postbuild to sqrtf exhaustive test fix wimplicit fallthrough in unsafebufferusage cpp enable hermetic tests only under full build move dbg value s that depend on loads to after a load if the load is moved due to the pre register allocation ld st optimization pass bug fix handles the assertion violations for code within macros add rule named add libc hermetic test which adds a hermetic test run all unit tests irrespective of whether they belong to a test suite unforce minimum alignment of for byval arguments of device side functions fix mlir build when shared library mode is enabled breakpointmanager fixes for add include stdlib as missing dep for cpp string fix coverity bugs with auto causes copy fix build obivous typos in minor fixes the action documentation remove deprecated preloaddialectincontext flag for mliroptmain that has been 
deprecated for years remove deprecated entry point for mliroptmain refactor the mlir opt command line options related to debugging in a helper add user doc on the website for the action framework add a gdb lldb interface for interactive debugging of mlir actions use the known trip count when costing non tail folded vfs revert quot fix gcc build issues and restore quot additional apis used by the quot fix permute mask calculation in lowershuffleasunpckandpermute adjust the cpp function type to support lambdas add more utility functions for the gpu remove workaround for old llvm enable projects libcxx build fixed typo in avoid native t macro differential revision add support for strsep regressions size name baseline current baseline size current size ratio multisource applications clamav clamscan regressions time name baseline current baseline time current time ratio multisource benchmarks lzbench lzbench differences size name baseline current baseline size current size ratio multisource applications clamav clamscan multisource benchmarks lzbench lzbench multisource benchmarks mediabench jpeg jpeg cjpeg geomeans n a n a differences time name baseline current baseline time current time ratio multisource benchmarks lzbench lzbench multisource applications clamav clamscan multisource benchmarks mediabench jpeg jpeg cjpeg geomeans n a n a
0
130,523
10,617,606,178
IssuesEvent
2019-10-12 20:20:05
Vachok/ftpplus
https://api.github.com/repos/Vachok/ftpplus
closed
testRun [D335]
Lowest TestQuality bug mint resolution_Fixed
Execute WeeklyInternetStatsTest::testRun**testRun** *WeeklyInternetStatsTest* *java.lang.UnsupportedOperationException: System tray unavailable WeeklyInternetStatsTest.java: ru.vachok.networker.info.stats.WeeklyInternetStatsTest.testRun(WeeklyInternetStatsTest.java:83) expected [null] but found [java.util.concurrent.ExecutionException: java.lang.UnsupportedOperationException: System tray unavailable]* *java.lang.AssertionError*
1.0
testRun [D335] - Execute WeeklyInternetStatsTest::testRun**testRun** *WeeklyInternetStatsTest* *java.lang.UnsupportedOperationException: System tray unavailable WeeklyInternetStatsTest.java: ru.vachok.networker.info.stats.WeeklyInternetStatsTest.testRun(WeeklyInternetStatsTest.java:83) expected [null] but found [java.util.concurrent.ExecutionException: java.lang.UnsupportedOperationException: System tray unavailable]* *java.lang.AssertionError*
non_priority
testrun execute weeklyinternetstatstest testrun testrun weeklyinternetstatstest java lang unsupportedoperationexception system tray unavailable weeklyinternetstatstest java ru vachok networker info stats weeklyinternetstatstest testrun weeklyinternetstatstest java expected but found java lang assertionerror
0
3,160
2,741,657,941
IssuesEvent
2015-04-21 12:40:02
gbv/paia
https://api.github.com/repos/gbv/paia
closed
Make HTTP Accept header explicit
documentation
This is not mentioned explicitly: Accept: application/json or Accept: application/json, ...
1.0
Make HTTP Accept header explicit - This is not mentioned explicitly: Accept: application/json or Accept: application/json, ...
non_priority
make http accept header explicit this is not mentioned explicitly accept application json or accept application json
0
41,300
5,325,918,813
IssuesEvent
2017-02-15 01:36:10
elegantthemes/Divi-Beta
https://api.github.com/repos/elegantthemes/Divi-Beta
closed
BlueHost :: Shortcode Trimming :: Issues with adding padding
BUG DESIGN SIGNOFF QUALITY ASSURED READY FOR REVIEW
### Problem: I don't see the results as they should appear on the VB. The front end works okay, and when you reload the VB page you also see the padding applied correctly, it's just not when you add it live: ![94](https://cloud.githubusercontent.com/assets/16414613/22790376/e5bfa67c-eedd-11e6-92bf-971f6581ddd5.gif) ### Attached PR * https://github.com/elegantthemes/submodule-builder/issues/1715
1.0
BlueHost :: Shortcode Trimming :: Issues with adding padding - ### Problem: I don't see the results as they should appear on the VB. The front end works okay, and when you reload the VB page you also see the padding applied correctly, it's just not when you add it live: ![94](https://cloud.githubusercontent.com/assets/16414613/22790376/e5bfa67c-eedd-11e6-92bf-971f6581ddd5.gif) ### Attached PR * https://github.com/elegantthemes/submodule-builder/issues/1715
non_priority
bluehost shortcode trimming issues with adding padding problem i don t see the results as they should appear on the vb the front end works okay and when you reload the vb page you also see the padding applied correctly it s just not when you add it live attached pr
0
20,093
11,385,832,795
IssuesEvent
2020-01-29 11:58:49
cityofaustin/atd-data-tech
https://api.github.com/repos/cityofaustin/atd-data-tech
closed
Ability to Track OC and SN #'s
Need: 1-Must Have Project: Warehouse Inventory Service: Apps Type: Feature Workgroup: AMD
As a warehouse admin, I need to track the OC# that was created when I entered the transactions in the AIMS system. See below screenshot. An Inventory Request will have a status of `Needs AIMS Entry` when: - all items have been issued, canceled, or returned - there are issued items with no OC# The OC# is an editable field on the transactions table. It will be hidden until the transaction is in an `Issued` state. If empty, it will display in pink with an alert icon. Once populated, that styling is removed. Once all issued items have an OC#, the inventory request status will change to `Completed`. ![Screen Shot 2020-01-10 at 11.26.08 AM.png](https://images.zenhubusercontent.com/5caf676becad11531cc417cb/44dcccca-9f7f-4121-81c7-3cd1ff98cd2a)
1.0
Ability to Track OC and SN #'s - As a warehouse admin, I need to track the OC# that was created when I entered the transactions in the AIMS system. See below screenshot. An Inventory Request will have a status of `Needs AIMS Entry` when: - all items have been issued, canceled, or returned - there are issued items with no OC# The OC# is an editable field on the transactions table. It will be hidden until the transaction is in an `Issued` state. If empty, it will display in pink with an alert icon. Once populated, that styling is removed. Once all issued items have an OC#, the inventory request status will change to `Completed`. ![Screen Shot 2020-01-10 at 11.26.08 AM.png](https://images.zenhubusercontent.com/5caf676becad11531cc417cb/44dcccca-9f7f-4121-81c7-3cd1ff98cd2a)
non_priority
ability to track oc and sn s as a warehouse admin i need to track the oc that was created when i entered the transactions in the aims system see below screenshot an inventory request will have a status of needs aims entry when all items have been issued canceled or returned there are issued items with no oc the oc is an editable field on the transactions table it will be hidden until the transaction is in an issued state if empty it will display in pink with an alert icon once populated that styling is removed once all issued items have an oc the inventory request status will change to completed
0
65,724
19,671,762,066
IssuesEvent
2022-01-11 08:11:30
SeleniumHQ/selenium
https://api.github.com/repos/SeleniumHQ/selenium
opened
[🐛 Bug]: Impossible to use CDP via WS or HTTP with Selenium 4.x
I-defect needs-triaging
### What happened? For the last few days me and other people have been trying to use CDP commands via Selenium, however nothing seems to function as expected. Selenium always returns "se:cdp": "ws://" instead of HTTP - that's problem 1, and problem 2 is that even trying to communicate via ws:// Selenium never returns anything, it just gets stuck. If you do send "bad" or "missing" parameters Selenium will return an error message, but if you do send a "correct" command, nothing will happen. I would like to focus only on problem 2 which is the ws:// cdp usage, since I spent several days to try and use HTTP without success. Please note that I have ensured and triple checked and tested several variations of Chrome + ChromeDriver. ### How can we reproduce the issue? ```shell Try to communicate via ws://, send the following test: {"id":123213123,"sessionId":"975cefdf3e55bc59341ec6a3f9564216","method":"Network.setUserAgentOverride","params":{"userAgent":"banana","acceptLanguage":"minion world","platform":"apple"}} - nothing will happen. ``` ### Relevant log output ```shell Nothing, unfortunately. No error, no output, absolutely nothing. ``` ### Operating System Windows 10 ### Selenium version Java version "1.8.0_311" ### What are the browser(s) and version(s) where you see this issue? Chrome 96.0.4664.110 ### What are the browser driver(s) and version(s) where you see this issue? ChromeDriver 96.0.4664.110 ### Are you using Selenium Grid? 4.1.1
1.0
[🐛 Bug]: Impossible to use CDP via WS or HTTP with Selenium 4.x - ### What happened? For the last few days me and other people have been trying to use CDP commands via Selenium, however nothing seems to function as expected. Selenium always returns "se:cdp": "ws://" instead of HTTP - that's problem 1, and problem 2 is that even trying to communicate via ws:// Selenium never returns anything, it just gets stuck. If you do send "bad" or "missing" parameters Selenium will return an error message, but if you do send a "correct" command, nothing will happen. I would like to focus only on problem 2 which is the ws:// cdp usage, since I spent several days to try and use HTTP without success. Please note that I have ensured and triple checked and tested several variations of Chrome + ChromeDriver. ### How can we reproduce the issue? ```shell Try to communicate via ws://, send the following test: {"id":123213123,"sessionId":"975cefdf3e55bc59341ec6a3f9564216","method":"Network.setUserAgentOverride","params":{"userAgent":"banana","acceptLanguage":"minion world","platform":"apple"}} - nothing will happen. ``` ### Relevant log output ```shell Nothing, unfortunately. No error, no output, absolutely nothing. ``` ### Operating System Windows 10 ### Selenium version Java version "1.8.0_311" ### What are the browser(s) and version(s) where you see this issue? Chrome 96.0.4664.110 ### What are the browser driver(s) and version(s) where you see this issue? ChromeDriver 96.0.4664.110 ### Are you using Selenium Grid? 4.1.1
non_priority
impossible to use cdp via ws or http with selenium x what happened for the last few days me and other people have been trying to use cdp commands via selenium however nothing seems to function as expected selenium always returns se cdp ws instead of http that s problem and problem is that even trying to communicate via ws selenium never returns anything it just gets stuck if you do send bad or missing parameters selenium will return an error message but if you do send a correct command nothing will happen i would like to focus only on problem which is the ws cdp usage since i spent several days to try and use http without success please note that i have ensured and triple checked and tested several variations of chrome chromedriver how can we reproduce the issue shell try to communicate via ws send the following test id sessionid method network setuseragentoverride params useragent banana acceptlanguage minion world platform apple nothing will happen relevant log output shell nothing unfortunately no error no output absolutely nothing operating system windows selenium version java version what are the browser s and version s where you see this issue chrome what are the browser driver s and version s where you see this issue chromedriver are you using selenium grid
0
151,552
12,043,068,329
IssuesEvent
2020-04-14 11:45:58
ehimetakahashilab/research_papers
https://api.github.com/repos/ehimetakahashilab/research_papers
closed
Machine Learning-based Prediction of Test Power
machine learning test power prediction test selection
## 0. Paper [Machine Learning-based Prediction of Test Power](https://ehimetakahashilab.slack.com/files/U2CTQRAMS/F010FNX6DV4/ets2019_machine_learning-based_prediction_of_test_power.pdf) ## 1. What is it? Proposes a machine-learning method that predicts the test power of test patterns without running simulation. ## 2. What makes it better than prior work? By applying a variety of machine-learning algorithms, it shows high prediction accuracy while also greatly reducing execution time compared with simulation-based test power analysis. ## 3. What is the key to the technique? These days the number of test patterns is large and the execution time of power analysis for each test is too long, so obtaining complete power-analysis results for every test is infeasible; analysis is performed only for a small pre-selected subset of tests. Determining which tests may present worst-case scenarios for test power is therefore an important problem. The paper proposes a machine-learning-based test power prediction method for test selection. Two different approaches are applied in the prediction: 1. Predicting test behavior to identify tests with high power consumption 1. Associating switching activity and power information with the chip layout to identify local hotspots ## 4. How was it validated? Results of neural-network, least-squares, nearest-neighbor, and ridge-regression approaches are shown for IWLS benchmark circuits. ## 5. Any open discussion? ## 6. What should be read next?
2.0
Machine Learning-based Prediction of Test Power - ## 0. Paper [Machine Learning-based Prediction of Test Power](https://ehimetakahashilab.slack.com/files/U2CTQRAMS/F010FNX6DV4/ets2019_machine_learning-based_prediction_of_test_power.pdf) ## 1. What is it? Proposes a machine-learning method that predicts the test power of test patterns without running simulation. ## 2. What makes it better than prior work? By applying a variety of machine-learning algorithms, it shows high prediction accuracy while also greatly reducing execution time compared with simulation-based test power analysis. ## 3. What is the key to the technique? These days the number of test patterns is large and the execution time of power analysis for each test is too long, so obtaining complete power-analysis results for every test is infeasible; analysis is performed only for a small pre-selected subset of tests. Determining which tests may present worst-case scenarios for test power is therefore an important problem. The paper proposes a machine-learning-based test power prediction method for test selection. Two different approaches are applied in the prediction: 1. Predicting test behavior to identify tests with high power consumption 1. Associating switching activity and power information with the chip layout to identify local hotspots ## 4. How was it validated? Results of neural-network, least-squares, nearest-neighbor, and ridge-regression approaches are shown for IWLS benchmark circuits. ## 5. Any open discussion? ## 6. What should be read next?
non_priority
machine learning based prediction of test power paper what is it proposes a machine learning method that predicts the test power of test patterns without running simulation what makes it better than prior work by applying a variety of machine learning algorithms it shows high prediction accuracy while also greatly reducing execution time compared with simulation based test power analysis what is the key to the technique these days the number of test patterns is large and the execution time of power analysis for each test is too long so obtaining complete power analysis results for every test is infeasible analysis is performed only for a small pre selected subset of tests determining which tests may present worst case scenarios for test power is therefore an important problem the paper proposes a machine learning based test power prediction method for test selection two different approaches are applied in the prediction predicting test behavior to identify tests with high power consumption associating switching activity and power information with the chip layout to identify local hotspots how was it validated results of neural network least squares nearest neighbor and ridge regression approaches are shown for iwls benchmark circuits any open discussion what should be read next
0
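Among the approaches the summary above lists (neural network, least squares, nearest neighbor, ridge regression), ridge regression has a one-line closed form in the single-feature, no-intercept case: w = Σxy / (Σx² + λ). The sketch below is purely illustrative — the toggle-count feature and power values are made up, not taken from the paper:

```python
def ridge_fit_1d(xs, ys, lam):
    """Closed-form ridge regression for one feature, no intercept:
    w = sum(x*y) / (sum(x^2) + lambda)."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

# Hypothetical training data: per-test switching-activity counts -> measured power.
toggles = [10.0, 20.0, 30.0, 40.0]
power = [5.0, 10.0, 15.0, 20.0]  # exactly power = 0.5 * toggles

w = ridge_fit_1d(toggles, power, lam=0.0)  # lambda = 0 reduces to least squares
predicted = w * 50.0                       # predict power for an unseen test
```

Raising λ shrinks the fitted weight toward zero, which is the point of the regularization when the power measurements for the few simulated tests are noisy.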
5,678
3,975,542,030
IssuesEvent
2016-05-05 06:12:39
kolliSuman/issues
https://api.github.com/repos/kolliSuman/issues
closed
QA_Common Errors in English_Prerequisites_p1
Category: Usability Developed By: VLEAD Release Number: Production Severity: S2 Status: Open
Defect Description : In the "Common Errors in English" experiment, the minimum requirement to run the experiment is not displayed on the page; instead, a page or scrolling section should appear providing information on the minimum requirements to run this experiment, such as bandwidth, device resolution, hardware configuration and software required. Actual Result : In the "Common Errors in English" experiment, the minimum requirement to run the experiment is not displayed on the page. Environment : OS: Windows 7, Ubuntu-16.04, Centos-6 Browsers: Firefox-42.0, Chrome-47.0, Chromium-45.0 Bandwidth : 100Mbps Hardware Configuration: 8GB RAM, Processor: i5 Test Step Link: https://github.com/Virtual-Labs/virtual-english-iitg/blob/master/test-cases/integration_test-cases/Common%20Errors%20in%20English/Common%20Errors%20in%20English_30_Prerequisites_p1.org
True
QA_Common Errors in English_Prerequisites_p1 - Defect Description : In the "Common Errors in English" experiment, the minimum requirement to run the experiment is not displayed on the page; instead, a page or scrolling section should appear providing information on the minimum requirements to run this experiment, such as bandwidth, device resolution, hardware configuration and software required. Actual Result : In the "Common Errors in English" experiment, the minimum requirement to run the experiment is not displayed on the page. Environment : OS: Windows 7, Ubuntu-16.04, Centos-6 Browsers: Firefox-42.0, Chrome-47.0, Chromium-45.0 Bandwidth : 100Mbps Hardware Configuration: 8GB RAM, Processor: i5 Test Step Link: https://github.com/Virtual-Labs/virtual-english-iitg/blob/master/test-cases/integration_test-cases/Common%20Errors%20in%20English/Common%20Errors%20in%20English_30_Prerequisites_p1.org
non_priority
qa common errors in english prerequisites defect description in the common errors in english experiment the minimum requirement to run the experiment is not displayed in the page instead a page or scrolling should appear providing information on minimum requirement to run this experiment information like bandwidth device resolution hardware configuration and software required actual result in the common errors in english experiment the minimum requirement to run the experiment is not displayed in the page environment os windows ubuntu centos browsers firefox chrome chromium bandwidth hardware configuration processor test step link
0
11,266
29,492,777,602
IssuesEvent
2023-06-02 14:36:15
tremor-rs/tremor-www
https://api.github.com/repos/tremor-rs/tremor-www
opened
Add a generic error handling guide
bug documentation information-architecture
Experienced production tremor users who have grown up with tremor, and with how it can manage, compensate for, drop or process runtime errors, have had the benefit of face time with tremor maintainers. We need to document these development and production practices as a guide, as this can be confusing for new adopters and users of tremor, and the documentation we have produced in this regard has been weak to date. The guide should cover ( post a comment to add to this list, dear readers ): * Configuring rust logging levels on the command line * Configuring and redirecting errors via the error port and connectors * Other common runtime error handling strategies The guide should not cover QoS, FT, HA as this is a separate topic and requires an application or system flow model and set of needs to be useful.
1.0
Add a generic error handling guide - Experienced production tremor users who have grown up with tremor, and with how it can manage, compensate for, drop or process runtime errors, have had the benefit of face time with tremor maintainers. We need to document these development and production practices as a guide, as this can be confusing for new adopters and users of tremor, and the documentation we have produced in this regard has been weak to date. The guide should cover ( post a comment to add to this list, dear readers ): * Configuring rust logging levels on the command line * Configuring and redirecting errors via the error port and connectors * Other common runtime error handling strategies The guide should not cover QoS, FT, HA as this is a separate topic and requires an application or system flow model and set of needs to be useful.
non_priority
add a generic error handling guide experienced production tremor users who have grown up with tremor and how it can manage compensate drop or process runtime errors have had the benefit of face time with tremor maintainers we need to document the development and production practices as a guide as it can be confusing for new adopters and users of tremor and the documentation we have produced in this regard is weak to this date the guide should cover post a comment to add to this list dear readers configuring rust logging levels on the command line configuring and redirecting errors via the error port and connectors other common runtime error handling strategies the guide should not cover qos ft ha as this is a separate topic and requires an application or system flow model and set of needs to be useful
0
227,885
17,402,033,087
IssuesEvent
2021-08-02 21:13:12
lightbend/akkaserverless-javascript-sdk
https://api.github.com/repos/lightbend/akkaserverless-javascript-sdk
opened
Add Replicated Entity page back to JavaScript SDK doc pages
Documentation doc-team
This task is to bring back the old documentation on replicated entities, but the content still needs to be updated to reflect the new implementation.
1.0
Add Replicated Entity page back to JavaScript SDK doc pages - This task is to bring back the old documentation on replicated entities, but the content still needs to be updated to reflect the new implementation.
non_priority
add replicated entity page back to javascript sdk doc pages this task is to bring back the old documentation on replicated entities but the content still needs to be updated to reflect the new implementation
0
40,093
12,746,539,857
IssuesEvent
2020-06-26 16:07:23
tech256/jobs
https://api.github.com/repos/tech256/jobs
closed
Cyber Threat Intel Analyst
Active Clearance Required Cyber Security Hiring stale
Cyber Threat Intelligence Analyst Vicksburg, MS Are you a whiz at Cyber Security? Do you enjoy supporting our military? INSUVI, Inc. is looking for great talent to join our team! What We Can Offer YOU! Medical Dental Vision Long and Short-Term Disability Life Insurance 401(k) Paid Time Off (PTO) Paid Holidays And More! COMPANY OVERVIEW: INSUVI, Inc. is a certified Economically Disadvantaged Woman-Owned Small Business (EDWOSB) headquartered in Huntsville, Alabama. We provide Information Technology, JavaScript Training, Systems Engineering, and Training services. POSITION OVERVIEW: Job Responsibilities Performs as the Senior Technical Subject Matter Expert (SME) in area of cyber threat intelligence Implements a full network infrastructure and selects network components including routers, switches, gateways, and firewalls Configures and maintains network designs, devices, and infrastructure and optimizes network performance Incorporates threat intelligence into countermeasures to detect and prevent intrusions and malware infestation and attacks Identifies threat actor tactics, techniques, and procedures Based on indicators, develops custom signatures and blocks Interfaces with Army Corps of Engineers Information Technology Computer Incident Response Team (ACE-IT CIRT) for incident response, recovery, and prevention. Coordinates with ACE-IT Security Operations Center (SOC) and Network Operations Center (NOC) personnel to maximize cyber threat prevention measures, enhances audit and logging standards, Implements the core Security Intelligence Center (SIC) concepts (SOC vs. 
SIC, Cyber Kill Chain, APT) Enforces and monitors effective cyber security policies and configurations and security event management within the logging and SIEM infrastructure Navigates the command line using specific expressions to manipulate data Handles and organizes disparate data about detections, attacks, and attackers Employs discovery techniques and vetting of new intelligence Builds better actionable intelligence from data QUALIFICATIONS: Education & Experience Bachelor's degree from an accredited university/college in Computer Science, Computer Engineering or related field and 4-8 years of prior relevant experience or master's degree with 2 - 6 years of prior relevant experience Relevant Experience required: Computer network defense technologies and Cyber Kill Chain Threat actor TTP and indicator identification using large data sources. Custom signature development Packet analysis Knowledge & Skills Has a strong grasp of the enterprise network and key networking concepts related to the Security Intelligence process Understands and works with various categories of electronic evidence including media, email, and networks Has a strong understanding of the tools & techniques necessary to efficiently identify trends and extract indicators from large data sources Recognizes key forensics and incident response concepts critical to the Security Intelligence process Knows the importance of being in control of the adversary's intrusion steps Understands how to employ the Cyber Kill Chain Knows how to identify and create mitigations for the Cyber Kill Chain grid Comprehends structured digital evidence collection and evaluation Understands the concept of Advanced Persistent Threat (APT) Is able to distinguish APT from traditional cyber threats Knows examples of specific intrusion techniques used by APT adversaries Recognizes what you'll need to know to prevent or identify APT intrusions Understands concepts of packet analysis Other Requirements Clearance: Must 
possess an Active U.S. Secret (or higher) Security INSUVI, Inc., provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, gender identity, sexual orientation, pregnancy, status as a parent, national origin, status as a parent, disability (physical or mental), family medical history or genetic information, political affiliation, military service, or other non-merit based factors. For more information, or to apply now, you must go to the website below. Please DO NOT email your resume to us as we only accept applications through our website. https://www.applicantpro.com/j/1311433-219641
True
Cyber Threat Intel Analyst - Cyber Threat Intelligence Analyst Vicksburg, MS Are you a whiz at Cyber Security? Do you enjoy supporting our military? INSUVI, Inc. is looking for great talent to join our team! What We Can Offer YOU! Medical Dental Vision Long and Short-Term Disability Life Insurance 401(k) Paid Time Off (PTO) Paid Holidays And More! COMPANY OVERVIEW: INSUVI, Inc. is a certified Economically Disadvantaged Woman-Owned Small Business (EDWOSB) headquartered in Huntsville, Alabama. We provide Information Technology, JavaScript Training, Systems Engineering, and Training services. POSITION OVERVIEW: Job Responsibilities Performs as the Senior Technical Subject Matter Expert (SME) in area of cyber threat intelligence Implements a full network infrastructure and selects network components including routers, switches, gateways, and firewalls Configures and maintains network designs, devices, and infrastructure and optimizes network performance Incorporates threat intelligence into countermeasures to detect and prevent intrusions and malware infestation and attacks Identifies threat actor tactics, techniques, and procedures Based on indicators, develops custom signatures and blocks Interfaces with Army Corps of Engineers Information Technology Computer Incident Response Team (ACE-IT CIRT) for incident response, recovery, and prevention. Coordinates with ACE-IT Security Operations Center (SOC) and Network Operations Center (NOC) personnel to maximize cyber threat prevention measures, enhances audit and logging standards, Implements the core Security Intelligence Center (SIC) concepts (SOC vs. 
SIC, Cyber Kill Chain, APT) Enforces and monitors effective cyber security policies and configurations and security event management within the logging and SIEM infrastructure Navigates the command line using specific expressions to manipulate data Handles and organizes disparate data about detections, attacks, and attackers Employs discovery techniques and vetting of new intelligence Builds better actionable intelligence from data QUALIFICATIONS: Education & Experience Bachelor's degree from an accredited university/college in Computer Science, Computer Engineering or related field and 4-8 years of prior relevant experience or master's degree with 2 - 6 years of prior relevant experience Relevant Experience required: Computer network defense technologies and Cyber Kill Chain Threat actor TTP and indicator identification using large data sources. Custom signature development Packet analysis Knowledge & Skills Has a strong grasp of the enterprise network and key networking concepts related to the Security Intelligence process Understands and works with various categories of electronic evidence including media, email, and networks Has a strong understanding of the tools & techniques necessary to efficiently identify trends and extract indicators from large data sources Recognizes key forensics and incident response concepts critical to the Security Intelligence process Knows the importance of being in control of the adversary's intrusion steps Understands how to employ the Cyber Kill Chain Knows how to identify and create mitigations for the Cyber Kill Chain grid Comprehends structured digital evidence collection and evaluation Understands the concept of Advanced Persistent Threat (APT) Is able to distinguish APT from traditional cyber threats Knows examples of specific intrusion techniques used by APT adversaries Recognizes what you'll need to know to prevent or identify APT intrusions Understands concepts of packet analysis Other Requirements Clearance: Must 
possess an Active U.S. Secret (or higher) Security INSUVI, Inc., provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, gender identity, sexual orientation, pregnancy, status as a parent, national origin, status as a parent, disability (physical or mental), family medical history or genetic information, political affiliation, military service, or other non-merit based factors. For more information, or to apply now, you must go to the website below. Please DO NOT email your resume to us as we only accept applications through our website. https://www.applicantpro.com/j/1311433-219641
non_priority
cyber threat intel analyst cyber threat intelligence analyst vicksburg ms are you a whiz at cyber security do you enjoy supporting our military insuvi inc is looking for great talent to join our team what we can offer you medical dental vision long and short term disability life insurance k paid time off pto paid holidays and more company overview insuvi inc is a certified economically disadvantaged woman owned small business edwosb headquartered in huntsville alabama we provide information technology javascript training systems engineering and training services position overview job responsibilities performs as the senior technical subject matter expert sme in area of cyber threat intelligence implements a full network infrastructure and selects network components including routers switches gateways and firewalls configures and maintains network designs devices and infrastructure and optimizes network performance incorporates threat intelligence into countermeasures to detect and prevent intrusions and malware infestation and attacks identifies threat actor tactics techniques and procedures based on indicators develops custom signatures and blocks interfaces with army corps of engineers information technology computer incident response team ace it cirt for incident response recovery and prevention coordinates with ace it security operations center soc and network operations center  noc personnel to maximize cyber threat prevention measures enhances audit and logging standards implements the core security intelligence center sic concepts soc vs sic cyber kill chain apt enforces and monitors effective cyber security policies and configurations and security event management within the logging and siem infrastructure navigates the command line using specific expressions to manipulate data handles and organizes disparate data about detections attacks and attackers employs discovery techniques and vetting of new intelligence builds better actionable intelligence from 
data qualifications education experience bachelor s degree from an accredited university college in computer science computer engineering or related field and years of prior relevant experience or master s degree with years of prior relevant experience relevant experience required computer network defense technologies and cyber kill chain threat actor ttp and indicator identification using large data sources custom signature development packet analysis knowledge skills has a strong grasp of the enterprise network and key networking concepts related to the security intelligence process understands and works with various categories of electronic evidence including media email and networks has a strong understanding of the tools techniques necessary to efficiently identify trends and extract indicators from large data sources recognizes key forensics and incident response concepts critical to the security intelligence process knows the importance of being in control of the adversary s intrusion steps understands how to employ the cyber kill chain knows how to identify and create mitigations for the cyber kill chain grid comprehends structured digital evidence collection and evaluation understands the concept of advanced persistent threat apt is able to distinguish apt from traditional cyber threats knows examples of specific intrusion techniques used by apt adversaries recognizes what you ll need to know to prevent or identify apt intrusions understands concepts of packet analysis other requirements clearance  must possess an active u s secret or higher security insuvi inc provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race color religion age sex gender identity sexual orientation pregnancy status as a parent national origin status as a parent disability physical or mental family medical history or genetic information political affiliation military service 
or other non merit based factors for more information or to apply now you must go to the website below please do not email your resume to us as we only accept applications through our website
0
13,607
3,163,764,985
IssuesEvent
2015-09-20 16:33:57
Quaggles/Icarus
https://api.github.com/repos/Quaggles/Icarus
closed
1: New Gameplay Mechanics
design enhancement programming
Implement new Shield Burst, Charge and Weapons. Refer to "Altered Mechanics" Document for how moves interact with each other (Rock, Paper, Scissors Balancing).
1.0
1: New Gameplay Mechanics - Implement new Shield Burst, Charge and Weapons. Refer to "Altered Mechanics" Document for how moves interact with each other (Rock, Paper, Scissors Balancing).
non_priority
new gameplay mechanics implement new shield burst charge and weapons refer to altered mechanics document for how moves interact with each other rock paper scissors balancing
0
93,864
11,813,615,906
IssuesEvent
2020-03-19 22:55:09
aragon/aragon
https://api.github.com/repos/aragon/aragon
closed
Allow users to use the default (local) network as their wallet provider
design: request enhancement
See discussion in https://github.com/aragon/aragon/pull/469#discussion_r233074933: > Maybe we should not have it by default and have a setting for that, below the provider URL, for users that know what they are doing? ``` Ethereum node: [ wss://mainnet.eth.aragon.network/ws ] [✓] Sign transactions with this provider ``` ---------- This would be useful for people who connect to the app through an unlocked eth node. See also https://github.com/aragon/aragon/pull/592#discussion_r251236549
1.0
Allow users to use the default (local) network as their wallet provider - See discussion in https://github.com/aragon/aragon/pull/469#discussion_r233074933: > Maybe we should not have it by default and have a setting for that, below the provider URL, for users that know what they are doing? ``` Ethereum node: [ wss://mainnet.eth.aragon.network/ws ] [✓] Sign transactions with this provider ``` ---------- This would be useful for people who connect to the app through an unlocked eth node. See also https://github.com/aragon/aragon/pull/592#discussion_r251236549
non_priority
allow users to use the default local network as their wallet provider see discussion in maybe we should not have it by default and have a setting for that below the provider url for users that know what they are doing ethereum node  sign transactions with this provider this would be useful for people who connect to the app through an unlocked eth node see also
0
142,838
11,496,422,504
IssuesEvent
2020-02-12 07:55:26
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
closed
[CI] RegressionIT - testTwoJobsWithSameRandomizeSeedUseSameTrainingSet failing periodically
:ml >test-failure
Seen a few of these failures in the past week and was able to reproduce locally. ``` java.lang.AssertionError: Stats were: {"id":"regression_two_jobs_with_same_randomize_seed_1","state":"analyzing","progress":[{"phase":"reindexing","progress_percent":100},{"phase":"loading_data","progress_percent":100},{"phase":"analyzing","progress_percent":0},{"phase":"writing_results","progress_percent":0}],"node":{"id":"PIl01mkZRrO-meVVpsGh2Q","name":"integTest-0","ephemeral_id":"NLtfGfdBSZmQkIvL7fnGlQ","transport_address":"127.0.0.1:39527","attributes":{"testattr":"test","ml.machine_memory":"101267107840","ml.max_open_jobs":"20","xpack.installed":"true"}},"assignment_explanation":""} Expected: <stopped> but: was <analyzing> ``` https://gradle-enterprise.elastic.co/s/5ddwmmgxq2ru4/tests?search=testTwoJobsWithSameRandomizeSeedUseSameTrainingSet https://gradle-enterprise.elastic.co/s/zlvhmrm5uwf6o/tests?search=testTwoJobsWithSameRandomizeSeedUseSameTrainingSet https://gradle-enterprise.elastic.co/s/qlqseo7hyqs4o/tests?search=testTwoJobsWithSameRandomizeSeedUseSameTrainingSet Looks to be happening in both `master` and `7.x`.
1.0
[CI] RegressionIT - testTwoJobsWithSameRandomizeSeedUseSameTrainingSet failing periodically - Seen a few of these failures in the past week and was able to reproduce locally. ``` java.lang.AssertionError: Stats were: {"id":"regression_two_jobs_with_same_randomize_seed_1","state":"analyzing","progress":[{"phase":"reindexing","progress_percent":100},{"phase":"loading_data","progress_percent":100},{"phase":"analyzing","progress_percent":0},{"phase":"writing_results","progress_percent":0}],"node":{"id":"PIl01mkZRrO-meVVpsGh2Q","name":"integTest-0","ephemeral_id":"NLtfGfdBSZmQkIvL7fnGlQ","transport_address":"127.0.0.1:39527","attributes":{"testattr":"test","ml.machine_memory":"101267107840","ml.max_open_jobs":"20","xpack.installed":"true"}},"assignment_explanation":""} Expected: <stopped> but: was <analyzing> ``` https://gradle-enterprise.elastic.co/s/5ddwmmgxq2ru4/tests?search=testTwoJobsWithSameRandomizeSeedUseSameTrainingSet https://gradle-enterprise.elastic.co/s/zlvhmrm5uwf6o/tests?search=testTwoJobsWithSameRandomizeSeedUseSameTrainingSet https://gradle-enterprise.elastic.co/s/qlqseo7hyqs4o/tests?search=testTwoJobsWithSameRandomizeSeedUseSameTrainingSet Looks to be happening in both `master` and `7.x`.
non_priority
regressionit testtwojobswithsamerandomizeseedusesametrainingset failing periodically seen a few of these failures in the past week and was able to reproduce locally java lang assertionerror stats were id regression two jobs with same randomize seed state analyzing progress node id name integtest ephemeral id transport address attributes testattr test ml machine memory ml max open jobs xpack installed true assignment explanation expected but was looks to be happening in both master and x
0
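The failure above is a test that gave up while the analytics job was still in the `analyzing` state rather than the expected `stopped`. A generic poll-with-deadline helper of the following shape is a common way such waits are structured; this is an illustrative sketch, not Elasticsearch's actual test harness:

```python
import time

def wait_for_state(get_state, target, timeout_s=10.0, poll_s=0.01):
    """Poll get_state() until it returns `target` or the deadline passes.
    Returns the last observed state either way, so a failing assertion
    can report what the job was actually doing (here, "analyzing")."""
    deadline = time.monotonic() + timeout_s
    state = get_state()
    while state != target and time.monotonic() < deadline:
        time.sleep(poll_s)
        state = get_state()
    return state

# Simulated job that needs a few polls before reaching "stopped".
_states = iter(["analyzing", "analyzing", "stopped"])
result = wait_for_state(lambda: next(_states), "stopped")
```

Returning the last observed state instead of raising keeps the assertion message informative, which is exactly what the `Expected: <stopped> but: was <analyzing>` output above relies on.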
45,592
9,790,031,654
IssuesEvent
2019-06-10 11:32:18
deity-io/falcon
https://api.github.com/repos/deity-io/falcon
closed
Rework Extensions
code-cleanup extension node
At this point, the `Extension` class seems to be an overhead when using within Falcon-Server. It is possible to move the functionality of Extension into ExtensionContainer class, keeping the current level of customization for users' extensions.
1.0
Rework Extensions - At this point, the `Extension` class seems to be an overhead when using within Falcon-Server. It is possible to move the functionality of Extension into ExtensionContainer class, keeping the current level of customization for users' extensions.
non_priority
rework extensions at this point the extension class seems to be an overhead when using within falcon server it is possible to move the functionality of extension into extensioncontainer class keeping the current level of customization for users extensions
0
19,677
27,322,438,894
IssuesEvent
2023-02-24 21:14:41
BlitterStudio/amiberry
https://api.github.com/repos/BlitterStudio/amiberry
closed
GFX Glitches
compatibility fixed in dev
**Describe the bug** Some games do present some graphical artifacts either in menu or in-game. This is 100% reproducible. Currently the following packages are impacted: * **PinkPanther_v1.0_0236** (does run fine on FS-uae) * **PuttySquad_v1.3_AGA** (does run fine on FS-uae) * **SurfNinjas_v1.0_AGA_1291** (does **_not_** run fine on FS-uae) * **WWFEuropeanRampageTour_v1.1_1298** (does **_not_** run fine on FS-uae) * **Zool2_v1.0_AGA_0415** (does run fine on FS-uae) **To Reproduce** 1. Simply run the aforementioned games. 2. Surf Ninjas: you have to go to the far left screen (beach) and get the bat. 3. WWF European Rampage Tour: glitches can be seen only in menus 4. Zool 2: as soon as you enter 1st stage (known bug) **Screenshots** ![PinkPanther_v1.0_0236](https://user-images.githubusercontent.com/8100500/204258393-bb656091-02e5-41c1-b9af-a7f4803273e2.png) ![SurfNinjas_v1 0_AGA_1291](https://user-images.githubusercontent.com/8100500/204258406-be4ada55-2f70-42dd-9f2a-f75367a8b6ef.png) ![WWFEuropeanRampageTour_v1 1_1298](https://user-images.githubusercontent.com/8100500/204258413-458d2abc-24d0-4ef2-bb56-b751b4a8f6a3.png) ![Zool2_v1 0_AGA_0415](https://user-images.githubusercontent.com/8100500/204258420-8137ad95-37cc-471b-b25c-3c9911295711.png) **Additional context** I'm running 5.5 (latest main) with Retropie 4.8 on Pi4. Buster / 32-bit / DMX. Everything (`.uae`, `.chd`, `.lha`) is therefore launched with the `--autoload` parameter. As far as memory goes, Zool 2 worked in 3.3 yet I'm pretty sure the others never worked properly.
True
GFX Glitches - **Describe the bug** Some games do present some graphical artifacts either in menu or in-game. This is 100% reproducible. Currently the following packages are impacted: * **PinkPanther_v1.0_0236** (does run fine on FS-uae) * **PuttySquad_v1.3_AGA** (does run fine on FS-uae) * **SurfNinjas_v1.0_AGA_1291** (does **_not_** run fine on FS-uae) * **WWFEuropeanRampageTour_v1.1_1298** (does **_not_** run fine on FS-uae) * **Zool2_v1.0_AGA_0415** (does run fine on FS-uae) **To Reproduce** 1. Simply run the aforementioned games. 2. Surf Ninjas: you have to go to the far left screen (beach) and get the bat. 3. WWF European Rampage Tour: glitches can be seen only in menus 4. Zool 2: as soon as you enter 1st stage (known bug) **Screenshots** ![PinkPanther_v1.0_0236](https://user-images.githubusercontent.com/8100500/204258393-bb656091-02e5-41c1-b9af-a7f4803273e2.png) ![SurfNinjas_v1 0_AGA_1291](https://user-images.githubusercontent.com/8100500/204258406-be4ada55-2f70-42dd-9f2a-f75367a8b6ef.png) ![WWFEuropeanRampageTour_v1 1_1298](https://user-images.githubusercontent.com/8100500/204258413-458d2abc-24d0-4ef2-bb56-b751b4a8f6a3.png) ![Zool2_v1 0_AGA_0415](https://user-images.githubusercontent.com/8100500/204258420-8137ad95-37cc-471b-b25c-3c9911295711.png) **Additional context** I'm running 5.5 (latest main) with Retropie 4.8 on Pi4. Buster / 32-bit / DMX. Everything (`.uae`, `.chd`, `.lha`) is therefore launched with the `--autoload` parameter. As far as memory goes, Zool 2 worked in 3.3 yet I'm pretty sure the others never worked properly.
non_priority
gfx glitches describe the bug some games do present some graphical artifacts either in menu or in game this is reproducible currently the following packages are impacted pinkpanther does run fine on fs uae puttysquad aga does run fine on fs uae surfninjas aga does not run fine on fs uae wwfeuropeanrampagetour does not run fine on fs uae aga does run fine on fs uae to reproduce simply run the aforementioned games surf ninjas you have to go to the far left screen beach and get the bat wwf european rampage tour glitches can be seen only in menus zool as soon as you enter stage known bug screenshots additional context i m running latest main with retropie on buster bit dmx everything uae chd lha is therefore launched with the autoload parameter as far as memory goes zool worked in yet i m pretty sure the others never worked properly
0
5,328
5,631,973,192
IssuesEvent
2017-04-05 15:36:45
Cadasta/cadasta-platform
https://api.github.com/repos/Cadasta/cadasta-platform
closed
Public Users can still view `records/locations` and `records/parties`
bug security
### Steps to reproduce the error Add a user to an organization and assign them as a Public User for a project. Login as that user and go to that project page. Add `/records/locations/` to the URL. ::or:: Add `/records/parties/` to the URL ### Actual behavior They see a stripped down version of the "map" page. They see a list of party names and types. They don't have access to any details. ### Expected behavior I don't think `/records/location/` is much of an issue because it's not showing you any information that you don't already have access to. But the party table should be stripped the same way that the resource table is.
True
Public Users can still view `records/locations` and `records/parties` - ### Steps to reproduce the error Add a user to an organization and assign them as a Public User for a project. Login as that user and go to that project page. Add `/records/locations/` to the URL. ::or:: Add `/records/parties/` to the URL ### Actual behavior They see a stripped down version of the "map" page. They see a list of party names and types. They don't have access to any details. ### Expected behavior I don't think `/records/location/` is much of an issue because it's not showing you any information that you don't already have access to. But the party table should be stripped the same way that the resource table is.
non_priority
public users can still view records locations and records parties steps to reproduce the error add a user to an organization and assign them as a public user for a project login as that user and go to that project page add records locations to the url or add records parties to the url actual behavior they see a stripped down version of the map page they see a list of party names and types they don t have access to any details expected behavior i don t think records location is much of an issue because it s not showing you any information that you don t already have access to but the party table should be stripped the same way that the resource table is
0
7,738
3,600,637,174
IssuesEvent
2016-02-03 07:16:32
hjwylde/werewolf
https://api.github.com/repos/hjwylde/werewolf
closed
Add an `upcoming-deaths` field to the game state
existing: enhancement kind: code state: awaiting release
This will make it easier to control when someone dies and help to fix the message ordering issue.
1.0
Add an `upcoming-deaths` field to the game state - This will make it easier to control when someone dies and help to fix the message ordering issue.
non_priority
add an upcoming deaths field to the game state this will make it easier to control when someone dies and help to fix the message ordering issue
0
90,887
15,856,305,104
IssuesEvent
2021-04-08 02:01:43
Molizo/FTC-Scouting-App-Skystone
https://api.github.com/repos/Molizo/FTC-Scouting-App-Skystone
opened
CVE-2021-23337 (High) detected in lodash-4.17.15.tgz
security vulnerability
## CVE-2021-23337 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.15.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p> <p>Path to dependency file: FTC-Scouting-App-Skystone/SkystoneScouting/package.json</p> <p>Path to vulnerable library: FTC-Scouting-App-Skystone/SkystoneScouting/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - multi-select-0.3.4.tgz (Root Library) - can-5.30.2.tgz - can-observable-array-0.5.0.tgz - cli-7.5.5.tgz - :x: **lodash-4.17.15.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Lodash versions prior to 4.17.21 are vulnerable to Command Injection via the template function. <p>Publish Date: 2021-02-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23337>CVE-2021-23337</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c">https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c</a></p> <p>Release Date: 2021-02-15</p> <p>Fix Resolution: lodash - 4.17.21</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-23337 (High) detected in lodash-4.17.15.tgz - ## CVE-2021-23337 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.15.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p> <p>Path to dependency file: FTC-Scouting-App-Skystone/SkystoneScouting/package.json</p> <p>Path to vulnerable library: FTC-Scouting-App-Skystone/SkystoneScouting/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - multi-select-0.3.4.tgz (Root Library) - can-5.30.2.tgz - can-observable-array-0.5.0.tgz - cli-7.5.5.tgz - :x: **lodash-4.17.15.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Lodash versions prior to 4.17.21 are vulnerable to Command Injection via the template function. <p>Publish Date: 2021-02-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23337>CVE-2021-23337</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c">https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c</a></p> <p>Release Date: 2021-02-15</p> <p>Fix Resolution: lodash - 4.17.21</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in lodash tgz cve high severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file ftc scouting app skystone skystonescouting package json path to vulnerable library ftc scouting app skystone skystonescouting node modules lodash package json dependency hierarchy multi select tgz root library can tgz can observable array tgz cli tgz x lodash tgz vulnerable library found in base branch master vulnerability details lodash versions prior to are vulnerable to command injection via the template function publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash step up your open source security game with whitesource
0
94,410
15,962,371,818
IssuesEvent
2021-04-16 01:10:17
RG4421/azure-iot-platform-dotnet
https://api.github.com/repos/RG4421/azure-iot-platform-dotnet
opened
CVE-2021-23369 (Medium) detected in handlebars-4.1.0.tgz
security vulnerability
## CVE-2021-23369 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz</a></p> <p>Path to dependency file: azure-iot-platform-dotnet/src/webui/azure-iot-ux-fluent-controls/package.json</p> <p>Path to vulnerable library: azure-iot-platform-dotnet/src/webui/azure-iot-ux-fluent-controls/node_modules/nyc/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - nyc-13.3.0.tgz (Root Library) - istanbul-reports-2.1.1.tgz - :x: **handlebars-4.1.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted source. 
<p>Publish Date: 2021-04-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369>CVE-2021-23369</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369</a></p> <p>Release Date: 2021-04-12</p> <p>Fix Resolution: handlebars - 4.7.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"4.1.0","packageFilePaths":["/src/webui/azure-iot-ux-fluent-controls/package.json"],"isTransitiveDependency":true,"dependencyTree":"nyc:13.3.0;istanbul-reports:2.1.1;handlebars:4.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 4.7.7"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-23369","vulnerabilityDetails":"The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted 
source.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-23369 (Medium) detected in handlebars-4.1.0.tgz - ## CVE-2021-23369 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz</a></p> <p>Path to dependency file: azure-iot-platform-dotnet/src/webui/azure-iot-ux-fluent-controls/package.json</p> <p>Path to vulnerable library: azure-iot-platform-dotnet/src/webui/azure-iot-ux-fluent-controls/node_modules/nyc/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - nyc-13.3.0.tgz (Root Library) - istanbul-reports-2.1.1.tgz - :x: **handlebars-4.1.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted source. 
<p>Publish Date: 2021-04-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369>CVE-2021-23369</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369</a></p> <p>Release Date: 2021-04-12</p> <p>Fix Resolution: handlebars - 4.7.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"4.1.0","packageFilePaths":["/src/webui/azure-iot-ux-fluent-controls/package.json"],"isTransitiveDependency":true,"dependencyTree":"nyc:13.3.0;istanbul-reports:2.1.1;handlebars:4.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 4.7.7"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-23369","vulnerabilityDetails":"The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted 
source.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_priority
cve medium detected in handlebars tgz cve medium severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file azure iot platform dotnet src webui azure iot ux fluent controls package json path to vulnerable library azure iot platform dotnet src webui azure iot ux fluent controls node modules nyc node modules handlebars package json dependency hierarchy nyc tgz root library istanbul reports tgz x handlebars tgz vulnerable library found in base branch master vulnerability details the package handlebars before are vulnerable to remote code execution rce when selecting certain compiling options to compile templates coming from an untrusted source publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution handlebars isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree nyc istanbul reports handlebars isminimumfixversionavailable true minimumfixversion handlebars basebranches vulnerabilityidentifier cve vulnerabilitydetails the package handlebars before are vulnerable to remote code execution rce when selecting certain compiling options to compile templates coming from an untrusted source vulnerabilityurl
0
62,681
8,634,368,480
IssuesEvent
2018-11-22 16:35:57
coursier/coursier
https://api.github.com/repos/coursier/coursier
closed
Document cache location rules and overrides
documentation
Per the source code, the rules are quite complicated, but the (deprecated) docs suggest it is a simple matter of `~/.coursier/cache` unless overridden. Eyeballing the source says there are 4 possibilities: 1/ wherever `$COURSIER_CACHE` says 2/ wherever `-Dcoursier.cache=...` says 3/ a platform specific place like `~/.cache/coursier` or `~/Library/Caches/coursier` or a Windows place I don't understand 4/ `~/.coursier/cache` An added complication is when considering 3/ and 4/ the choice depends on whether a directory already exists.
1.0
Document cache location rules and overrides - Per the source code, the rules are quite complicated, but the (deprecated) docs suggest it is a simple matter of `~/.coursier/cache` unless overridden. Eyeballing the source says there are 4 possibilities: 1/ wherever `$COURSIER_CACHE` says 2/ wherever `-Dcoursier.cache=...` says 3/ a platform specific place like `~/.cache/coursier` or `~/Library/Caches/coursier` or a Windows place I don't understand 4/ `~/.coursier/cache` An added complication is when considering 3/ and 4/ the choice depends on whether a directory already exists.
non_priority
document cache location rules and overrides per the source code the rules are quite complicated but the deprecated docs suggest it is a simple matter of coursier cache unless overridden eyeballing the source says there are possibilities wherever coursier cache says wherever dcoursier cache says a platform specific place like cache coursier or library caches coursier or a windows place i don t understand coursier cache an added complication is when considering and the choice depends on whether a directory already exists
0
184,893
14,994,757,718
IssuesEvent
2021-01-29 13:23:45
HubertWelp/SweetPicker3
https://api.github.com/repos/HubertWelp/SweetPicker3
opened
Alle Readme.md sind veraltet
documentation enhancement
Die Readme Dateien der Pakete und des Projektes wurden noch nicht aktualisiert.
1.0
Alle Readme.md sind veraltet - Die Readme Dateien der Pakete und des Projektes wurden noch nicht aktualisiert.
non_priority
alle readme md sind veraltet die readme dateien der pakete und des projektes wurden noch nicht aktualisiert
0
309,960
26,687,631,297
IssuesEvent
2023-01-26 23:51:21
status-im/status-mobile
https://api.github.com/repos/status-im/status-mobile
closed
Accessibility-id is gone from community channel name
bug e2e test blocker
# Bug Report ## Problem Regression from https://github.com/status-im/status-mobile/pull/14799 No accessibility-ids on channels inside the community #### Expected behavior 'chat-name-text' #### Actual behavior ![Appium 2023-01-26 11-44-47](https://user-images.githubusercontent.com/4557972/214816976-edce6410-a415-4f61-b19a-4c1e87249fae.png) ### Reproduction 1) Open Status 2) Create a community > Navigate to community > check accessibility id on channel ### Additional Information - Status version: nightly 26/01/23 - Operating System: Android, iOS [comment]: # (Please, add logs/notes if necessary)
1.0
Accessibility-id is gone from community channel name - # Bug Report ## Problem Regression from https://github.com/status-im/status-mobile/pull/14799 No accessibility-ids on channels inside the community #### Expected behavior 'chat-name-text' #### Actual behavior ![Appium 2023-01-26 11-44-47](https://user-images.githubusercontent.com/4557972/214816976-edce6410-a415-4f61-b19a-4c1e87249fae.png) ### Reproduction 1) Open Status 2) Create a community > Navigate to community > check accessibility id on channel ### Additional Information - Status version: nightly 26/01/23 - Operating System: Android, iOS [comment]: # (Please, add logs/notes if necessary)
non_priority
accessibility id is gone from community channel name bug report problem regression from no accessibility ids on channels inside the community expected behavior chat name text actual behavior reproduction open status create a community navigate to community check accessibility id on channel additional information status version nightly operating system android ios please add logs notes if necessary
0
439,713
30,710,949,026
IssuesEvent
2023-07-27 09:44:16
pacificlion/pacificlion.github.io
https://api.github.com/repos/pacificlion/pacificlion.github.io
closed
complete freecodecamp youtube tutorial on data structures
documentation
[Data Structures Easy to Advanced Course - Full Tutorial from a Google Engineer](https://www.youtube.com/watch?v=RBSGKlAvoiM) - [x] Dynamic Array - [x] LinkedList - [x] Double Linked List - [x] Stack - [x] Queue - [ ] Priority Queue - [ ] Union Find - [ ] Binary Search Tree - [ ] Hashtable
1.0
complete freecodecamp youtube tutorial on data structures - [Data Structures Easy to Advanced Course - Full Tutorial from a Google Engineer](https://www.youtube.com/watch?v=RBSGKlAvoiM) - [x] Dynamic Array - [x] LinkedList - [x] Double Linked List - [x] Stack - [x] Queue - [ ] Priority Queue - [ ] Union Find - [ ] Binary Search Tree - [ ] Hashtable
non_priority
complete freecodecamp youtube tutorial on data structures dynamic array linkedlist double linked list stack queue priority queue union find binary search tree hashtable
0
95,107
11,954,118,410
IssuesEvent
2020-04-03 22:35:10
mozilla/foundation.mozilla.org
https://api.github.com/repos/mozilla/foundation.mozilla.org
opened
IA: gather initial website content
design
[Copy doc is here](https://docs.google.com/document/d/19Y5hNmEU4wix_gCoSvmhMKUk2F1Dht6Nwf1JQAu0lsg/edit#heading=h.jga10kqap7d8). We need to continue to update the doc section to reflect our IA, continue to ping Lotta, Anil, and others to populate the doc.
1.0
IA: gather initial website content - [Copy doc is here](https://docs.google.com/document/d/19Y5hNmEU4wix_gCoSvmhMKUk2F1Dht6Nwf1JQAu0lsg/edit#heading=h.jga10kqap7d8). We need to continue to update the doc section to reflect our IA, continue to ping Lotta, Anil, and others to populate the doc.
non_priority
ia gather initial website content we need to continue to update the doc section to reflect our ia continue to ping lotta anil and others to populate the doc
0
401,591
27,333,138,720
IssuesEvent
2023-02-25 22:03:43
wxWidgets/wxWidgets
https://api.github.com/repos/wxWidgets/wxWidgets
closed
May be an error in a settings.h
documentation
I see 2 files settings.h, one in include/wx/ and an other in interface/wx. I have the feeling that only the first one is used, so it's not really a problem. The second one, has probably an error at line 118 because this line is not ended by a comma, so the enum is probably not correct. In the first one, the corresponding line (85) is ended by a comma. ### Platform and version information wxWidgets version 3.2.2 or 3.3.0
1.0
May be an error in a settings.h - I see 2 files settings.h, one in include/wx/ and an other in interface/wx. I have the feeling that only the first one is used, so it's not really a problem. The second one, has probably an error at line 118 because this line is not ended by a comma, so the enum is probably not correct. In the first one, the corresponding line (85) is ended by a comma. ### Platform and version information wxWidgets version 3.2.2 or 3.3.0
non_priority
may be an error in a settings h i see files settings h one in include wx and an other in interface wx i have the feeling that only the first one is used so it s not really a problem the second one has probably an error at line because this line is not ended by a comma so the enum is probably not correct in the first one the corresponding line is ended by a comma platform and version information wxwidgets version or
0
144,381
11,614,148,682
IssuesEvent
2020-02-26 12:03:47
pingcap/tidb-operator
https://api.github.com/repos/pingcap/tidb-operator
closed
e2e: "[Feature: AdvancedStatefulSet] Scaling tidb cluster with advanced statefulset " is flaky
test/e2e
## Bug Report https://internal.pingcap.net/idc-jenkins/blue/organizations/jenkins/operator_ghpr_e2e_test_kind/detail/operator_ghpr_e2e_test_kind/2273/tests ``` Stacktrace /home/jenkins/agent/workspace/operator_ghpr_e2e_test_kind/go/src/github.com/pingcap/tidb-operator/tests/e2e/tidbcluster/serial.go:139 Jan 10 10:04:16.182: Unexpected error: <*meta.NoKindMatchError | 0xc0010bb400>: { GroupKind: {Group: "pingcap.com", Kind: "TidbCluster"}, SearchedVersions: ["v1alpha1"], } no matches for kind "TidbCluster" in version "pingcap.com/v1alpha1" occurred /home/jenkins/agent/workspace/operator_ghpr_e2e_test_kind/go/src/github.com/pingcap/tidb-operator/tests/e2e/tidbcluster/serial.go:241 ``` k8s version: v1.12.10
1.0
e2e: "[Feature: AdvancedStatefulSet] Scaling tidb cluster with advanced statefulset " is flaky - ## Bug Report https://internal.pingcap.net/idc-jenkins/blue/organizations/jenkins/operator_ghpr_e2e_test_kind/detail/operator_ghpr_e2e_test_kind/2273/tests ``` Stacktrace /home/jenkins/agent/workspace/operator_ghpr_e2e_test_kind/go/src/github.com/pingcap/tidb-operator/tests/e2e/tidbcluster/serial.go:139 Jan 10 10:04:16.182: Unexpected error: <*meta.NoKindMatchError | 0xc0010bb400>: { GroupKind: {Group: "pingcap.com", Kind: "TidbCluster"}, SearchedVersions: ["v1alpha1"], } no matches for kind "TidbCluster" in version "pingcap.com/v1alpha1" occurred /home/jenkins/agent/workspace/operator_ghpr_e2e_test_kind/go/src/github.com/pingcap/tidb-operator/tests/e2e/tidbcluster/serial.go:241 ``` k8s version: v1.12.10
non_priority
scaling tidb cluster with advanced statefulset is flaky bug report stacktrace home jenkins agent workspace operator ghpr test kind go src github com pingcap tidb operator tests tidbcluster serial go jan unexpected error groupkind group pingcap com kind tidbcluster searchedversions no matches for kind tidbcluster in version pingcap com occurred home jenkins agent workspace operator ghpr test kind go src github com pingcap tidb operator tests tidbcluster serial go version
0
304,762
26,331,250,591
IssuesEvent
2023-01-10 10:57:38
serlo/frontend
https://api.github.com/repos/serlo/frontend
closed
visual: scrollbars on mitmachen-menu
in testing
On the landing page with the featured Spenden button the Mitmachen overlay is cropped: <img width="574" alt="image" src="https://user-images.githubusercontent.com/1258870/210352452-77bf173f-68b4-4cfa-91eb-4808ff5963f2.png">
1.0
visual: scrollbars on mitmachen-menu - On the landing page with the featured Spenden button the Mitmachen overlay is cropped: <img width="574" alt="image" src="https://user-images.githubusercontent.com/1258870/210352452-77bf173f-68b4-4cfa-91eb-4808ff5963f2.png">
non_priority
visual scrollbars on mitmachen menu on the landing page with the featured spenden button the mitmachen overlay is cropped img width alt image src
0
129,596
10,579,341,766
IssuesEvent
2019-10-08 02:17:31
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Error on Rollback on a workload
[zube]: To Test kind/bug-qa team/ca
**What kind of request is this (question/bug/enhancement/feature request):** bug **Steps to reproduce (least amount of steps as possible):** 1. Deploy a workload with image: `sangeetha/mytestcontainer`. Scale up the pods to 2. 2. Edit the workload, change image to `ubuntu`. 3. Pods will get recreated 4. Edit the workload, change image to `nginx`. 5. Pods will get recreated 6. Rollback to the first revision id. **Expected Result:** The Rollback should be successful **Actual Result:** The rollback throws an error <img width="1547" alt="Screen Shot 2019-10-07 at 2 41 48 PM" src="https://user-images.githubusercontent.com/26032343/66350991-8a913a00-e911-11e9-891a-63e159d6503c.png"> **Note:** This does NOT happen always. **Other details that may be helpful:** **Environment information** - Rancher version (`rancher/rancher`/`rancher/server` image tag or shown bottom left in the UI): rancher:v2.3.0-rc12 - Installation option (single install/HA): HA <!-- If the reported issue is regarding a created cluster, please provide requested info below --> **Cluster information** - Cluster type (Hosted/Infrastructure Provider/Custom/Imported): custom - Kubernetes version (use `kubectl version`): ``` 1.15 ``` - Docker version (use `docker version`): ``` 18.9.9 ```
1.0
Error on Rollback on a workload - **What kind of request is this (question/bug/enhancement/feature request):** bug **Steps to reproduce (least amount of steps as possible):** 1. Deploy a workload with image: `sangeetha/mytestcontainer`. Scale up the pods to 2. 2. Edit the workload, change image to `ubuntu`. 3. Pods will get recreated 4. Edit the workload, change image to `nginx`. 5. Pods will get recreated 6. Rollback to the first revision id. **Expected Result:** The Rollback should be successful **Actual Result:** The rollback throws an error <img width="1547" alt="Screen Shot 2019-10-07 at 2 41 48 PM" src="https://user-images.githubusercontent.com/26032343/66350991-8a913a00-e911-11e9-891a-63e159d6503c.png"> **Note:** This does NOT happen always. **Other details that may be helpful:** **Environment information** - Rancher version (`rancher/rancher`/`rancher/server` image tag or shown bottom left in the UI): rancher:v2.3.0-rc12 - Installation option (single install/HA): HA <!-- If the reported issue is regarding a created cluster, please provide requested info below --> **Cluster information** - Cluster type (Hosted/Infrastructure Provider/Custom/Imported): custom - Kubernetes version (use `kubectl version`): ``` 1.15 ``` - Docker version (use `docker version`): ``` 18.9.9 ```
non_priority
error on rollback on a workload what kind of request is this question bug enhancement feature request bug steps to reproduce least amount of steps as possible deploy a workload with image sangeetha mytestcontainer scale up the pods to edit the workload change image to ubuntu pods will get recreated edit the workload change image to nginx pods will get recreated rollback to the first revision id expected result the rollback should be successful actual result the rollback throws an error img width alt screen shot at pm src note this does not happen always other details that may be helpful environment information rancher version rancher rancher rancher server image tag or shown bottom left in the ui rancher installation option single install ha ha if the reported issue is regarding a created cluster please provide requested info below cluster information cluster type hosted infrastructure provider custom imported custom kubernetes version use kubectl version docker version use docker version
0
77,564
10,000,921,402
IssuesEvent
2019-07-12 14:28:49
systemd/systemd
https://api.github.com/repos/systemd/systemd
closed
systemctl man page needs full examples for multiple property assignment
RFE 🎁 documentation 📖 has-pr ✨ systemctl
### Submission type - [ ] Bug report - [X] Request for enhancement (RFE) ### systemd version the issue has been seen with v232 ### Used distribution Debian See also https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=807464 File: /usr/share/man/man1/systemctl.1.gz We read: ``` --runtime is passed, in which case the settings only apply until the next reboot. The syntax of the property assignment follows closely the syntax of assignments in unit files. Example: systemctl set-property foobar.service CPUShares=777 Note that this command allows changing multiple properties at the same time, which is preferable over setting them individually. Like unit file configuration settings, assigning the empty list to list parameters will reset the list. ``` OK but you then need to add a further example: ```systemctl set-property foobar.service nurdbar.servive CPUShares=777``` Or ```systemctl set-property foobar.service CPUShares=777 BlaBoo=888``` Or ```systemctl set-property foobar.service nurdbar.servive CPUShares=777 BlaBoo=888``` to clarify. Also give an example of assigning the empty list. Since the syntax "follows closely" but not exactly, hence full examples are needed.
1.0
systemctl man page needs full examples for multiple property assignment - ### Submission type - [ ] Bug report - [X] Request for enhancement (RFE) ### systemd version the issue has been seen with v232 ### Used distribution Debian See also https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=807464 File: /usr/share/man/man1/systemctl.1.gz We read: ``` --runtime is passed, in which case the settings only apply until the next reboot. The syntax of the property assignment follows closely the syntax of assignments in unit files. Example: systemctl set-property foobar.service CPUShares=777 Note that this command allows changing multiple properties at the same time, which is preferable over setting them individually. Like unit file configuration settings, assigning the empty list to list parameters will reset the list. ``` OK but you then need to add a further example: ```systemctl set-property foobar.service nurdbar.servive CPUShares=777``` Or ```systemctl set-property foobar.service CPUShares=777 BlaBoo=888``` Or ```systemctl set-property foobar.service nurdbar.servive CPUShares=777 BlaBoo=888``` to clarify. Also give an example of assigning the empty list. Since the syntax "follows closely" but not exactly, hence full examples are needed.
non_priority
systemctl man page needs full examples for multiple property assignment submission type bug report request for enhancement rfe systemd version the issue has been seen with used distribution debian see also file usr share man systemctl gz we read runtime is passed in which case the settings only apply until the next reboot the syntax of the property assignment follows closely the syntax of assignments in unit files example systemctl set property foobar service cpushares note that this command allows changing multiple properties at the same time which is preferable over setting them individually like unit file configuration settings assigning the empty list to list parameters will reset the list ok but you then need to add a further example systemctl set property foobar service nurdbar servive cpushares or systemctl set property foobar service cpushares blaboo or systemctl set property foobar service nurdbar servive cpushares blaboo to clarify also give an example of assigning the empty list since the syntax follows closely but not exactly hence full examples are needed
0
43,753
5,559,379,713
IssuesEvent
2017-03-24 16:48:29
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
Run DCS/DCJS Tests in CoreFx against uapaot.
area-Serialization os-windows-uwp test enhancement
Running tests against uapaot is not supported. The task is to make our DCS/DCJS test projects build and run against uapaot.
1.0
Run DCS/DCJS Tests in CoreFx against uapaot. - Running tests against uapaot is not supported. The task is to make our DCS/DCJS test projects build and run against uapaot.
non_priority
run dcs dcjs tests in corefx against uapaot running tests against uapaot is not supported the task is to make our dcs dcjs test projects build and run against uapaot
0
43,740
5,558,873,992
IssuesEvent
2017-03-24 15:40:42
MitocGroup/deep-framework
https://api.github.com/repos/MitocGroup/deep-framework
closed
[deep-security] Move custom auth context from event data to lambda context object
test delayed
Move custom auth context `_deep_auth_context_` injected by api gateway integration template when using CUSOM authorizer from event to lambda context object.
1.0
[deep-security] Move custom auth context from event data to lambda context object - Move custom auth context `_deep_auth_context_` injected by api gateway integration template when using CUSOM authorizer from event to lambda context object.
non_priority
move custom auth context from event data to lambda context object move custom auth context deep auth context injected by api gateway integration template when using cusom authorizer from event to lambda context object
0
207,161
16,067,155,417
IssuesEvent
2021-04-23 21:11:48
pulibrary/dspace-development
https://api.github.com/repos/pulibrary/dspace-development
closed
Document the process of ingesting large-scale datasets into DataSpace on behalf of researchers
dataspace documentation research-data
@jrgriffiniii commented on [Mon May 04 2020](https://github.com/pulibrary/dspace-cli/issues/11) Currently we cannot support the ingestion of data sets beyond 250MB for public download into DataSpace collections due to storage and server configuration restrictions. As a consequence, we have recently started to use Dropbox as an alternative storage solution for larger scale datasets, providing public users with the ability to directly download the datasets for any given Item.
1.0
Document the process of ingesting large-scale datasets into DataSpace on behalf of researchers - @jrgriffiniii commented on [Mon May 04 2020](https://github.com/pulibrary/dspace-cli/issues/11) Currently we cannot support the ingestion of data sets beyond 250MB for public download into DataSpace collections due to storage and server configuration restrictions. As a consequence, we have recently started to use Dropbox as an alternative storage solution for larger scale datasets, providing public users with the ability to directly download the datasets for any given Item.
non_priority
document the process of ingesting large scale datasets into dataspace on behalf of researchers jrgriffiniii commented on currently we cannot support the ingestion of data sets beyond for public download into dataspace collections due to storage and server configuration restrictions as a consequence we have recently started to use dropbox as an alternative storage solution for larger scale datasets providing public users with the ability to directly download the datasets for any given item
0
78,806
9,797,058,808
IssuesEvent
2019-06-11 09:06:54
unee-t/frontend
https://api.github.com/repos/unee-t/frontend
closed
Notification email - first link to the case is not clickable
design/ux enhancement
# The problem: With the rollout of PR #414 we ## Expected result: The link to the case is clickable ## Actual result: The first link to the case is NOT clickable (but the second link is) ![image](https://user-images.githubusercontent.com/31331637/43349816-201924c8-9235-11e8-87fd-6aa8201d4c49.png)
1.0
Notification email - first link to the case is not clickable - # The problem: With the rollout of PR #414 we ## Expected result: The link to the case is clickable ## Actual result: The first link to the case is NOT clickable (but the second link is) ![image](https://user-images.githubusercontent.com/31331637/43349816-201924c8-9235-11e8-87fd-6aa8201d4c49.png)
non_priority
notification email first link to the case is not clickable the problem with the rollout of pr we expected result the link to the case is clickable actual result the first link to the case is not clickable but the second link is
0
45,917
11,758,502,304
IssuesEvent
2020-03-13 15:31:29
golang/go
https://api.github.com/repos/golang/go
closed
x/build: darwin builders are missing
Builders NeedsInvestigation
As per: https://farmer.golang.org/status/macs Multiple darwin builders are missing. ```# "macs" status: MacStadium Mac VMs # Notes: https://github.com/golang/build/tree/master/env/darwin/macstadium Warn: macstadium_host04a missing, not seen for 24m22s Warn: macstadium_host08a missing, not seen for 59m37s Warn: macstadium_host08b missing, not seen for 59m37s Warn: macstadium_host10b missing, not seen for 14m13s Error: 4 machines missing, 20% of capacity ```
1.0
x/build: darwin builders are missing - As per: https://farmer.golang.org/status/macs Multiple darwin builders are missing. ```# "macs" status: MacStadium Mac VMs # Notes: https://github.com/golang/build/tree/master/env/darwin/macstadium Warn: macstadium_host04a missing, not seen for 24m22s Warn: macstadium_host08a missing, not seen for 59m37s Warn: macstadium_host08b missing, not seen for 59m37s Warn: macstadium_host10b missing, not seen for 14m13s Error: 4 machines missing, 20% of capacity ```
non_priority
x build darwin builders are missing as per multiple darwin builders are missing macs status macstadium mac vms notes warn macstadium missing not seen for warn macstadium missing not seen for warn macstadium missing not seen for warn macstadium missing not seen for error machines missing of capacity
0
4,222
7,179,916,359
IssuesEvent
2018-01-31 21:20:35
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
NPE Failure in ChunkTopicParser
bug preprocess/chunking
## Expected Behavior Publication should be processed without failure, with chunking correctly reflected. ## Actual Behavior NPE: chunk: [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/Chunk591488048.dita [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-03.dita [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-04.dita [chunk] Writing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/Chunk1150021098.dita BUILD FAILED /Users/ekimber/dita-ot/dita-ot-2.5.3/build.xml:45: The following error occurred while executing this line: /Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.base/build_preprocess.xml:317: java.lang.NullPointerException at org.dita.dost.writer.ChunkTopicParser.processChunk(ChunkTopicParser.java:239) at org.dita.dost.writer.ChunkTopicParser.write(ChunkTopicParser.java:61) at org.dita.dost.reader.ChunkMapReader.processCombineChunk(ChunkMapReader.java:542) at org.dita.dost.reader.ChunkMapReader.processTopicref(ChunkMapReader.java:347) at org.dita.dost.reader.ChunkMapReader.processChildTopicref(ChunkMapReader.java:516) at org.dita.dost.reader.ChunkMapReader.processTopicref(ChunkMapReader.java:372) at org.dita.dost.reader.ChunkMapReader.process(ChunkMapReader.java:143) at org.dita.dost.writer.AbstractDomFilter.read(AbstractDomFilter.java:55) at org.dita.dost.reader.ChunkMapReader.read(ChunkMapReader.java:119) at org.dita.dost.module.ChunkModule.execute(ChunkModule.java:80) at org.dita.dost.pipeline.PipelineFacade.execute(PipelineFacade.java:80) at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:230) at 
org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at org.apache.tools.ant.Target.execute(Target.java:435) at org.apache.tools.ant.Target.performTasks(Target.java:456) at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405) at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38) at org.apache.tools.ant.Project.executeTargets(Project.java:1260) at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:441) at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at org.apache.tools.ant.Target.execute(Target.java:435) at org.apache.tools.ant.Target.performTasks(Target.java:456) at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405) at org.apache.tools.ant.Project.executeTarget(Project.java:1376) at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41) at org.apache.tools.ant.Project.executeTargets(Project.java:1260) at org.apache.tools.ant.Main.runBuild(Main.java:857) at org.apache.tools.ant.Main.startAnt(Main.java:236) at org.apache.tools.ant.launch.Launcher.run(Launcher.java:287) at org.apache.tools.ant.launch.Launcher.main(Launcher.java:113) ## Possible Solution None yet identified. 
## Steps to Reproduce <!-- Test case, Gist, set of files or steps required to reproduce the issue. --> 1. Apply any transform with unmodified preprocessing, e.g., xhtml, to root topic https://github.com/dita-community/dita-test-cases/blob/master/topichead-chunking/topichead-chunking-test-01.ditamap Test data set is in github here: https://github.com/dita-community/dita-test-cases/tree/master/topichead-chunking ## Copy of the error message, log file or stack trace Executing: "/Applications/Oxygen XML Editor/.install4j/jre.bundle/Contents/Home/jre/bin/java" -Dapple.awt.UIElement=true -Xmx384m "-Doxygen.org.apache.xerces.xni.parser.XMLParserConfiguration=org.ditang.relaxng.defaults.RelaxDefaultsParserConfiguration" -classpath "/Applications/Oxygen XML Editor/tools/ant/lib/ant-launcher.jar" "-Dant.home=/Applications/Oxygen XML Editor/tools/ant" org.apache.tools.ant.launch.Launcher -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/saxon-9.1.0.8-dom.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/saxon-9.1.0.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/commons-io-2.5.jar" -lib "/Applications/Oxygen XML Editor/classes" -lib "/Applications/Oxygen XML Editor/lib/oxygen-annotations.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-content-completion-api.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-css-flute-parser.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-css-pretty-printer.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-css-validator.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-emf.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-jfx-components.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-pdf-chemistry.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-text-search.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-token-markers.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-validation-api.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-xquery-pretty-printer.jar" -lib "/Applications/Oxygen XML 
Editor/lib/oxygen.jar" -lib "/Applications/Oxygen XML Editor/lib/resolver.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-token-markers.jar" -lib "/Applications/Oxygen XML Editor/lib/org.eclipse.wst.xml.xpath2.processor_1.2.0.jar" -lib "/Applications/Oxygen XML Editor/lib/xml-apis.jar" -lib "/Applications/Oxygen XML Editor/lib/xercesImpl.jar" -lib "/Applications/Oxygen XML Editor/lib/commons-io-1.3.1.jar" -lib "/Applications/Oxygen XML Editor/lib/commons-logging-1.2.jar" -lib "/Applications/Oxygen XML Editor/lib/log4j.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/commons-codec-1.10.jar" -lib "/Applications/Oxygen XML Editor/lib/jing.jar" -lib "/Applications/Oxygen XML Editor/lib/saxon9ee.jar" -lib "/Applications/Oxygen XML Editor/lib/saxon.jar" -lib "/Applications/Oxygen XML Editor/lib/xmlgraphics-commons-2.1.jar" -lib "/Applications/Oxygen XML Editor/lib/fop.jar" -lib "/Applications/Oxygen XML Editor/lib/fontbox-1.8.5.jar" -lib "/Applications/Oxygen XML Editor/lib/batik-all-1.8.jar" -lib "/Applications/Oxygen XML Editor/lib/js.jar" -lib "/Applications/Oxygen XML Editor/lib/poi-3.10-FINAL-20140208.jar" -lib "/Applications/Oxygen XML Editor/lib/nekohtml.jar" -lib "/Applications/Oxygen XML Editor/lib/xml-apis-ext.jar" -lib "/Applications/Oxygen XML Editor/lib/avalon-framework-api-4.3.1.jar" -lib "/Applications/Oxygen XML Editor/lib/avalon-framework-impl-4.3.1.jar" -lib "/Applications/Oxygen XML Editor/lib/jeuclid-core.jar" -lib "/Applications/Oxygen XML Editor/lib/jeuclid-fop.jar" -lib "/Applications/Oxygen XML Editor/lib/jai_tiff.jar" -lib "/Applications/Oxygen XML Editor/lib/jh.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/jsearch.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-analyzers-common-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-analyzers-kuromoji-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-core-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-misc-4.7.2.jar" -lib "/Applications/Oxygen XML 
Editor/lib/lucene-queries-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-queryparser-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-suggest-4.7.2.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/dost-patches.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/ant-apache-resolver-1.10.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/ant-launcher.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/ant.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/dost-configuration.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/dost.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/guava-19.0.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/icu4j-57.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/logback-classic-1.2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/logback-core-1.2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/slf4j-api-1.7.23.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/xercesImpl-2.11.0.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/xml-apis-1.4.01.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/xml-resolver-1.2.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2/lib/fo.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.axf/lib/axf.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/avalon-framework-api-4.3.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/avalon-framework-impl-4.3.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-anim-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-awt-util-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-bridge-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-css-1.8.jar" -lib 
"/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-dom-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-ext-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-extension-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-gvt-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-parser-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-script-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-svg-dom-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-svggen-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-transcoder-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-util-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-xml-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/commons-logging-1.0.4.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/fop-2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/xmlgraphics-commons-2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/xml-apis-ext-1.3.04.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.xep/lib/xep.jar" -f "/Users/ekimber/dita-ot/dita-ot-2.5.3/build.xml" "-Dtranstype=html5" "-Dbasedir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking" "-Doutput.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5" "-Ddita.temp.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5" "-Dargs.input=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap" 
"-Ddita.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3" "-DbaseJVMArgLine=-Xmx384m" Buildfile: /Users/ekimber/dita-ot/dita-ot-2.5.3/build.xml init: html5.init: init-properties: check-arg: [mkdir] Created dir: /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [mkdir] Created dir: /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5 log-arg: [echo] ***************************************************************** [echo] * basedir = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking [echo] * dita.dir = /Users/ekimber/dita-ot/dita-ot-2.5.3 [echo] * transtype = html5 [echo] * tempdir = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5 [echo] * outputdir = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [echo] * clean.temp = true [echo] * DITA-OT version = 2.5.3 [echo] * XML parser = Xerces [echo] * XSLT processor = Saxon [echo] * collator = ICU [echo] ***************************************************************** [echo] #Ant properties [echo] #Sun Sep 17 18:15:21 CDT 2017 [echo] args.css.file.temp=${args.css} [echo] args.css.real=${args.css} [echo] args.grammar.cache=yes [echo] args.input=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap [echo] args.logdir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [echo] args.xml.systemid.set=yes [echo] args.xsl=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.html5/xsl/dita2html5.xsl [echo] dita.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3 [echo] dita.html5.reloadstylesheet=false [echo] dita.output.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [echo] dita.plugin.com.sophos.tocjs.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/com.sophos.tocjs [echo] dita.plugin.org.dita.base.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3 [echo] 
dita.plugin.org.dita.eclipsehelp.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.eclipsehelp [echo] dita.plugin.org.dita.html5.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.html5 [echo] dita.plugin.org.dita.htmlhelp.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.htmlhelp [echo] dita.plugin.org.dita.javahelp.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.javahelp [echo] dita.plugin.org.dita.pdf2.axf.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.axf [echo] dita.plugin.org.dita.pdf2.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2 [echo] dita.plugin.org.dita.pdf2.fop.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop [echo] dita.plugin.org.dita.pdf2.xep.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.xep [echo] dita.plugin.org.dita.specialization.dita11.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.specialization.dita11 [echo] dita.plugin.org.dita.troff.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.troff [echo] dita.plugin.org.dita.xhtml.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.xhtml [echo] dita.plugin.org.oasis-open.dita.v1_2.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.oasis-open.dita.v1_2 [echo] dita.plugin.org.oasis-open.dita.v1_3.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.oasis-open.dita.v1_3 [echo] dita.temp.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5 [echo] ***************************************************************** build-init: preprocess.init: [echo] ***************************************************************** [echo] * input = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap [echo] ***************************************************************** ditaval-merge: gen-list: [gen-list] Using Xerces grammar pool for DTD and schema caching. 
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-01.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-02.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-03.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-04.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-05.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-06.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-07.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-08.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-09-chunk-root.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-10.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-11.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-12.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-13-chunk-root.dita [gen-list] Processing 
file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-14.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-15.dita [gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-16.dita [gen-list] Serializing job specification debug-filter: [filter] Using Xerces grammar pool for DTD and schema caching. [filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-16.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-16.dita [filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap [filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-05.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-05.dita [filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.dita [filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-02.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-02.dita [filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-15.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-15.dita [filter] Processing 
## Environment

* DITA-OT version: 2.5.3
* Operating system and version: macOS 10.12.6 (16G29)
* How did you run DITA-OT? oXygen
* Transformation type: HTML5
# NPE Failure in ChunkTopicParser

## Expected Behavior

The publication should be processed without failure, with chunking correctly reflected in the output.

## Actual Behavior

The build fails with a `NullPointerException` in the chunk step:

```
chunk:
    [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
    [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/Chunk591488048.dita
    [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-03.dita
    [chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-04.dita
    [chunk] Writing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/Chunk1150021098.dita

BUILD FAILED
/Users/ekimber/dita-ot/dita-ot-2.5.3/build.xml:45: The following error occurred while executing this line:
/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.base/build_preprocess.xml:317: java.lang.NullPointerException
    at org.dita.dost.writer.ChunkTopicParser.processChunk(ChunkTopicParser.java:239)
    at org.dita.dost.writer.ChunkTopicParser.write(ChunkTopicParser.java:61)
    at org.dita.dost.reader.ChunkMapReader.processCombineChunk(ChunkMapReader.java:542)
    at org.dita.dost.reader.ChunkMapReader.processTopicref(ChunkMapReader.java:347)
    at org.dita.dost.reader.ChunkMapReader.processChildTopicref(ChunkMapReader.java:516)
    at org.dita.dost.reader.ChunkMapReader.processTopicref(ChunkMapReader.java:372)
    at org.dita.dost.reader.ChunkMapReader.process(ChunkMapReader.java:143)
    at org.dita.dost.writer.AbstractDomFilter.read(AbstractDomFilter.java:55)
    at org.dita.dost.reader.ChunkMapReader.read(ChunkMapReader.java:119)
    at org.dita.dost.module.ChunkModule.execute(ChunkModule.java:80)
    at org.dita.dost.pipeline.PipelineFacade.execute(PipelineFacade.java:80)
    at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:230)
    at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
    at org.apache.tools.ant.Task.perform(Task.java:348)
    at org.apache.tools.ant.Target.execute(Target.java:435)
    at org.apache.tools.ant.Target.performTasks(Target.java:456)
    at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405)
    at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
    at org.apache.tools.ant.Project.executeTargets(Project.java:1260)
    at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:441)
    at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
    at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
    at org.apache.tools.ant.Task.perform(Task.java:348)
    at org.apache.tools.ant.Target.execute(Target.java:435)
    at org.apache.tools.ant.Target.performTasks(Target.java:456)
    at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405)
    at org.apache.tools.ant.Project.executeTarget(Project.java:1376)
    at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
    at org.apache.tools.ant.Project.executeTargets(Project.java:1260)
    at org.apache.tools.ant.Main.runBuild(Main.java:857)
    at org.apache.tools.ant.Main.startAnt(Main.java:236)
    at org.apache.tools.ant.launch.Launcher.run(Launcher.java:287)
    at org.apache.tools.ant.launch.Launcher.main(Launcher.java:113)
```

## Possible Solution

None yet identified.
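As a general illustration only, not a confirmed fix for this bug: the trace suggests a lookup inside the chunk writer returning `null` and then being dereferenced. Below is a minimal, hypothetical sketch of the defensive pattern that turns such an NPE into a diagnosable error. All names here (`ChunkGuardSketch`, `resolveStub`, `chunkTargets`) are invented for illustration and are not the DITA-OT API.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a null-guarded lookup: fail fast with a message naming the
// unresolved reference instead of letting an NPE surface deep in the writer.
public class ChunkGuardSketch {
    // Hypothetical map from topic href to the generated chunk stub file.
    final Map<String, String> chunkTargets = new HashMap<>();

    String resolveStub(String href) {
        String stub = chunkTargets.get(href);
        if (stub == null) {
            // A targeted error is far easier to report and triage than an NPE.
            throw new IllegalStateException("No chunk target recorded for " + href);
        }
        return stub;
    }

    public static void main(String[] args) {
        ChunkGuardSketch sketch = new ChunkGuardSketch();
        sketch.chunkTargets.put("topics/topic-04.dita", "Chunk1150021098.dita");
        System.out.println(sketch.resolveStub("topics/topic-04.dita"));
    }
}
```

A guard like this would not change the outcome of the build, but a failure message naming the unresolved href would make reports like this one much easier to diagnose than a bare `NullPointerException`.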
## Steps to Reproduce

1. Apply any transform that uses the unmodified preprocessing (e.g., `xhtml`) to the root map: https://github.com/dita-community/dita-test-cases/blob/master/topichead-chunking/topichead-chunking-test-01.ditamap

The test data set is on GitHub: https://github.com/dita-community/dita-test-cases/tree/master/topichead-chunking

## Copy of the error message, log file or stack trace

Executing: "/Applications/Oxygen XML Editor/.install4j/jre.bundle/Contents/Home/jre/bin/java" -Dapple.awt.UIElement=true -Xmx384m "-Doxygen.org.apache.xerces.xni.parser.XMLParserConfiguration=org.ditang.relaxng.defaults.RelaxDefaultsParserConfiguration" -classpath "/Applications/Oxygen XML Editor/tools/ant/lib/ant-launcher.jar" "-Dant.home=/Applications/Oxygen XML Editor/tools/ant" org.apache.tools.ant.launch.Launcher -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/saxon-9.1.0.8-dom.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/saxon-9.1.0.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/commons-io-2.5.jar" -lib "/Applications/Oxygen XML Editor/classes" -lib "/Applications/Oxygen XML Editor/lib/oxygen-annotations.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-content-completion-api.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-css-flute-parser.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-css-pretty-printer.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-css-validator.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-emf.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-jfx-components.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-pdf-chemistry.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-text-search.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-token-markers.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-validation-api.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-xquery-pretty-printer.jar" -lib "/Applications/Oxygen XML
Editor/lib/oxygen.jar" -lib "/Applications/Oxygen XML Editor/lib/resolver.jar" -lib "/Applications/Oxygen XML Editor/lib/oxygen-token-markers.jar" -lib "/Applications/Oxygen XML Editor/lib/org.eclipse.wst.xml.xpath2.processor_1.2.0.jar" -lib "/Applications/Oxygen XML Editor/lib/xml-apis.jar" -lib "/Applications/Oxygen XML Editor/lib/xercesImpl.jar" -lib "/Applications/Oxygen XML Editor/lib/commons-io-1.3.1.jar" -lib "/Applications/Oxygen XML Editor/lib/commons-logging-1.2.jar" -lib "/Applications/Oxygen XML Editor/lib/log4j.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/commons-codec-1.10.jar" -lib "/Applications/Oxygen XML Editor/lib/jing.jar" -lib "/Applications/Oxygen XML Editor/lib/saxon9ee.jar" -lib "/Applications/Oxygen XML Editor/lib/saxon.jar" -lib "/Applications/Oxygen XML Editor/lib/xmlgraphics-commons-2.1.jar" -lib "/Applications/Oxygen XML Editor/lib/fop.jar" -lib "/Applications/Oxygen XML Editor/lib/fontbox-1.8.5.jar" -lib "/Applications/Oxygen XML Editor/lib/batik-all-1.8.jar" -lib "/Applications/Oxygen XML Editor/lib/js.jar" -lib "/Applications/Oxygen XML Editor/lib/poi-3.10-FINAL-20140208.jar" -lib "/Applications/Oxygen XML Editor/lib/nekohtml.jar" -lib "/Applications/Oxygen XML Editor/lib/xml-apis-ext.jar" -lib "/Applications/Oxygen XML Editor/lib/avalon-framework-api-4.3.1.jar" -lib "/Applications/Oxygen XML Editor/lib/avalon-framework-impl-4.3.1.jar" -lib "/Applications/Oxygen XML Editor/lib/jeuclid-core.jar" -lib "/Applications/Oxygen XML Editor/lib/jeuclid-fop.jar" -lib "/Applications/Oxygen XML Editor/lib/jai_tiff.jar" -lib "/Applications/Oxygen XML Editor/lib/jh.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/jsearch.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-analyzers-common-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-analyzers-kuromoji-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-core-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-misc-4.7.2.jar" -lib "/Applications/Oxygen XML 
Editor/lib/lucene-queries-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-queryparser-4.7.2.jar" -lib "/Applications/Oxygen XML Editor/lib/lucene-suggest-4.7.2.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/dost-patches.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/ant-apache-resolver-1.10.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/ant-launcher.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/ant.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/dost-configuration.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/dost.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/guava-19.0.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/icu4j-57.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/logback-classic-1.2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/logback-core-1.2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/slf4j-api-1.7.23.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/xercesImpl-2.11.0.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/xml-apis-1.4.01.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/lib/xml-resolver-1.2.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2/lib/fo.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.axf/lib/axf.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/avalon-framework-api-4.3.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/avalon-framework-impl-4.3.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-anim-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-awt-util-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-bridge-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-css-1.8.jar" -lib 
"/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-dom-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-ext-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-extension-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-gvt-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-parser-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-script-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-svg-dom-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-svggen-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-transcoder-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-util-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/batik-xml-1.8.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/commons-logging-1.0.4.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/fop-2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/xmlgraphics-commons-2.1.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop/lib/xml-apis-ext-1.3.04.jar" -lib "/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.xep/lib/xep.jar" -f "/Users/ekimber/dita-ot/dita-ot-2.5.3/build.xml" "-Dtranstype=html5" "-Dbasedir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking" "-Doutput.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5" "-Ddita.temp.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5" "-Dargs.input=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap" 
"-Ddita.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3" "-DbaseJVMArgLine=-Xmx384m" Buildfile: /Users/ekimber/dita-ot/dita-ot-2.5.3/build.xml init: html5.init: init-properties: check-arg: [mkdir] Created dir: /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [mkdir] Created dir: /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5 log-arg: [echo] ***************************************************************** [echo] * basedir = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking [echo] * dita.dir = /Users/ekimber/dita-ot/dita-ot-2.5.3 [echo] * transtype = html5 [echo] * tempdir = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5 [echo] * outputdir = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [echo] * clean.temp = true [echo] * DITA-OT version = 2.5.3 [echo] * XML parser = Xerces [echo] * XSLT processor = Saxon [echo] * collator = ICU [echo] ***************************************************************** [echo] #Ant properties [echo] #Sun Sep 17 18:15:21 CDT 2017 [echo] args.css.file.temp=${args.css} [echo] args.css.real=${args.css} [echo] args.grammar.cache=yes [echo] args.input=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap [echo] args.logdir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [echo] args.xml.systemid.set=yes [echo] args.xsl=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.html5/xsl/dita2html5.xsl [echo] dita.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3 [echo] dita.html5.reloadstylesheet=false [echo] dita.output.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/out/html5 [echo] dita.plugin.com.sophos.tocjs.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/com.sophos.tocjs [echo] dita.plugin.org.dita.base.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3 [echo] 
dita.plugin.org.dita.eclipsehelp.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.eclipsehelp [echo] dita.plugin.org.dita.html5.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.html5 [echo] dita.plugin.org.dita.htmlhelp.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.htmlhelp [echo] dita.plugin.org.dita.javahelp.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.javahelp [echo] dita.plugin.org.dita.pdf2.axf.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.axf [echo] dita.plugin.org.dita.pdf2.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2 [echo] dita.plugin.org.dita.pdf2.fop.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.fop [echo] dita.plugin.org.dita.pdf2.xep.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.pdf2.xep [echo] dita.plugin.org.dita.specialization.dita11.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.specialization.dita11 [echo] dita.plugin.org.dita.troff.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.troff [echo] dita.plugin.org.dita.xhtml.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.xhtml [echo] dita.plugin.org.oasis-open.dita.v1_2.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.oasis-open.dita.v1_2 [echo] dita.plugin.org.oasis-open.dita.v1_3.dir=/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.oasis-open.dita.v1_3 [echo] dita.temp.dir=/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5 [echo] ***************************************************************** build-init: preprocess.init: [echo] ***************************************************************** [echo] * input = /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap [echo] ***************************************************************** ditaval-merge: gen-list: [gen-list] Using Xerces grammar pool for DTD and schema caching. 
```
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-01.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-02.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-03.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-04.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-05.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-06.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-07.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-08.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-09-chunk-root.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-10.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-11.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-12.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-13-chunk-root.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-14.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-15.dita
[gen-list] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-16.dita
[gen-list] Serializing job specification
debug-filter:
[filter] Using Xerces grammar pool for DTD and schema caching.
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-16.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-16.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.ditamap to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-05.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-05.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topichead-chunking-test-01.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-02.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-02.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-15.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-15.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-10.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-10.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-09-chunk-root.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-09-chunk-root.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-04.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-04.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-01.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-01.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-07.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-07.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-13-chunk-root.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-13-chunk-root.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-12.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-12.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-08.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-08.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-03.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-03.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-11.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-11.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-06.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-06.dita
[filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/topics/topic-14.dita to file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-14.dita
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/canditopics.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/conref.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/conreftargets.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/copytosource.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/fullditamap.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/fullditamapandtopic.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/fullditatopic.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/hrefditatopic.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/hreftargets.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/html.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/image.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/outditafiles.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/resourceonly.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/resourceonly.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/subjectscheme.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/subtargets.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/user.input.file.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
mapref-check:
mapref:
[mapref] Transforming into /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5
[mapref] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/preprocess/mapref.xsl
[mapref] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
branch-filter:
[branch-filter] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
copy-image:
copy-html:
copy-flag-check:
copy-flag:
copy-files:
keyref:
[keyref] Reading file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
[keyref] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
[keyref] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-11.dita
[keyref] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-15.dita
[mapref] Transforming into /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5
[mapref] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/preprocess/mapref.xsl
[mapref] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
copy-to:
conrefpush:
[job-helper] Processing /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/.job.xml to /Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/conref.list
[job-helper] Loading stylesheet /Users/ekimber/dita-ot/dita-ot-2.5.3/xsl/job-helper.xsl
conref-check:
conref:
profile-check:
profile:
topic-fragment:
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-16.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-05.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-02.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-15.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-10.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-09-chunk-root.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-04.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-01.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-07.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-13-chunk-root.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-12.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-08.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-03.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-11.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-06.dita
[topic-fragment] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-14.dita
chunk-check:
chunk:
[chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topichead-chunking-test-01.ditamap
[chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/Chunk591488048.dita
[chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-03.dita
[chunk] Processing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/topics/topic-04.dita
[chunk] Writing file:/Users/ekimber/workspace-dita-community/dita-test-cases/topichead-chunking/temp/html5/Chunk1150021098.dita

BUILD FAILED
/Users/ekimber/dita-ot/dita-ot-2.5.3/build.xml:45: The following error occurred while executing this line:
/Users/ekimber/dita-ot/dita-ot-2.5.3/plugins/org.dita.base/build_preprocess.xml:317: java.lang.NullPointerException
	at org.dita.dost.writer.ChunkTopicParser.processChunk(ChunkTopicParser.java:239)
	at org.dita.dost.writer.ChunkTopicParser.write(ChunkTopicParser.java:61)
	at org.dita.dost.reader.ChunkMapReader.processCombineChunk(ChunkMapReader.java:542)
	at org.dita.dost.reader.ChunkMapReader.processTopicref(ChunkMapReader.java:347)
	at org.dita.dost.reader.ChunkMapReader.processChildTopicref(ChunkMapReader.java:516)
	at org.dita.dost.reader.ChunkMapReader.processTopicref(ChunkMapReader.java:372)
	at org.dita.dost.reader.ChunkMapReader.process(ChunkMapReader.java:143)
	at org.dita.dost.writer.AbstractDomFilter.read(AbstractDomFilter.java:55)
	at org.dita.dost.reader.ChunkMapReader.read(ChunkMapReader.java:119)
	at org.dita.dost.module.ChunkModule.execute(ChunkModule.java:80)
	at org.dita.dost.pipeline.PipelineFacade.execute(PipelineFacade.java:80)
	at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:230)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:435)
	at org.apache.tools.ant.Target.performTasks(Target.java:456)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405)
	at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
	at org.apache.tools.ant.Project.executeTargets(Project.java:1260)
	at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:441)
	at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:435)
	at org.apache.tools.ant.Target.performTasks(Target.java:456)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405)
	at org.apache.tools.ant.Project.executeTarget(Project.java:1376)
	at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
	at org.apache.tools.ant.Project.executeTargets(Project.java:1260)
	at org.apache.tools.ant.Main.runBuild(Main.java:857)
	at org.apache.tools.ant.Main.startAnt(Main.java:236)
	at org.apache.tools.ant.launch.Launcher.run(Launcher.java:287)
	at org.apache.tools.ant.launch.Launcher.main(Launcher.java:113)

Total time: 4 seconds
The process finished with exit code: 1
```

## Environment

* DITA-OT version: 2.5.3
* Operating system and version: macOS 10.12.6 (16G29)
* How did you run DITA-OT? oXygen
* Transformation type: HTML5
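The trace bottoms out in a `NullPointerException` at `ChunkTopicParser.processChunk` (line 239), which suggests a lookup that returns null for the topichead-generated chunk entries in this map. Without the 2.5.3 sources at hand the exact field is unknown; the sketch below only illustrates the guard-and-report pattern a fix would likely take — every name in it is hypothetical, not DITA-OT API:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the chunk-target lookup that appears to return
// null for topichead-generated chunks (names are illustrative, not DITA-OT API).
class ChunkGuardDemo {
    static String resolveChunkTarget(Map<String, String> changeTable, String chunkKey) {
        String target = changeTable.get(chunkKey); // may be null for topichead chunks
        if (target == null) {
            // Fail with a diagnosable message instead of a bare NPE.
            throw new IllegalStateException("No chunk target recorded for " + chunkKey
                    + "; a topichead-generated chunk may not have been registered.");
        }
        return target;
    }

    public static void main(String[] args) {
        Map<String, String> changeTable = new HashMap<>();
        changeTable.put("Chunk591488048.dita", "topics/topic-03.dita");
        System.out.println(resolveChunkTarget(changeTable, "Chunk591488048.dita"));
        // prints "topics/topic-03.dita"
    }
}
```

Even if the real fix is to register the missing entry rather than to throw, converting the bare NPE into a message naming the offending chunk would make reports like this one much easier to diagnose.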
# NPE failure in ChunkTopicParser

## Expected behavior

The publication should be processed without failure, with chunking correctly reflected.

## Possible solution

None yet identified.

## Steps to reproduce

Apply any transform with unmodified preprocessing (e.g. XHTML) to the root topic; the failure occurs during chunk processing, as shown in the log and stack trace above. The test data set is in GitHub.
dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp fullditamap list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp fullditamapandtopic list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp fullditatopic list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp hrefditatopic list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp hreftargets list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp html list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp image list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber 
workspace dita community dita test cases topichead chunking temp outditafiles list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp resourceonly list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp resourceonly list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp subjectscheme list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp subtargets list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp user input file list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl mapref check mapref transforming into users ekimber workspace dita community dita test cases topichead chunking temp loading stylesheet users ekimber dita ot dita ot xsl preprocess mapref xsl processing users ekimber workspace dita community dita test cases topichead chunking temp topichead chunking test ditamap branch filter processing file users ekimber workspace dita community dita test cases topichead chunking temp topichead chunking test ditamap copy image copy html copy flag check copy flag copy files 
keyref reading file users ekimber workspace dita community dita test cases topichead chunking temp topichead chunking test ditamap processing file users ekimber workspace dita community dita test cases topichead chunking temp topichead chunking test ditamap processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita transforming into users ekimber workspace dita community dita test cases topichead chunking temp loading stylesheet users ekimber dita ot dita ot xsl preprocess mapref xsl processing users ekimber workspace dita community dita test cases topichead chunking temp topichead chunking test ditamap copy to conrefpush processing users ekimber workspace dita community dita test cases topichead chunking temp job xml to users ekimber workspace dita community dita test cases topichead chunking temp conref list loading stylesheet users ekimber dita ot dita ot xsl job helper xsl conref check conref profile check profile topic fragment processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topichead chunking test dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic chunk root dita processing file users ekimber workspace dita community dita test cases 
topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic chunk root dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita chunk check chunk processing file users ekimber workspace dita community dita test cases topichead chunking temp topichead chunking test ditamap processing file users ekimber workspace dita community dita test cases topichead chunking temp dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita processing file users ekimber workspace dita community dita test cases topichead chunking temp topics topic dita writing file users ekimber workspace dita community dita test cases topichead chunking temp dita build failed users ekimber dita ot dita ot build xml the following error occurred while executing this line users ekimber dita ot dita ot plugins org dita base build preprocess xml java lang nullpointerexception at org dita dost writer chunktopicparser processchunk chunktopicparser java at org dita dost writer chunktopicparser write chunktopicparser java at 
org dita dost reader chunkmapreader processcombinechunk chunkmapreader java at org dita dost reader chunkmapreader processtopicref chunkmapreader java at org dita dost reader chunkmapreader processchildtopicref chunkmapreader java at org dita dost reader chunkmapreader processtopicref chunkmapreader java at org dita dost reader chunkmapreader process chunkmapreader java at org dita dost writer abstractdomfilter read abstractdomfilter java at org dita dost reader chunkmapreader read chunkmapreader java at org dita dost module chunkmodule execute chunkmodule java at org dita dost pipeline pipelinefacade execute pipelinefacade java at org dita dost invoker extensibleantinvoker execute extensibleantinvoker java at org apache tools ant unknownelement execute unknownelement java at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org apache tools ant dispatch dispatchutils execute dispatchutils java at org apache tools ant task perform task java at org apache tools ant target execute target java at org apache tools ant target performtasks target java at org apache tools ant project executesortedtargets project java at org apache tools ant helper singlecheckexecutor executetargets singlecheckexecutor java at org apache tools ant project executetargets project java at org apache tools ant taskdefs ant execute ant java at org apache tools ant taskdefs calltarget execute calltarget java at org apache tools ant unknownelement execute unknownelement java at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org apache tools ant dispatch dispatchutils execute dispatchutils java at org apache tools ant task perform task java at org apache tools ant target execute target java at org apache tools ant target performtasks target java at org apache 
tools ant project executesortedtargets project java at org apache tools ant project executetarget project java at org apache tools ant helper defaultexecutor executetargets defaultexecutor java at org apache tools ant project executetargets project java at org apache tools ant main runbuild main java at org apache tools ant main startant main java at org apache tools ant launch launcher run launcher java at org apache tools ant launch launcher main launcher java total time seconds the process finished with exit code environment dita ot version operating system and version macos how did you run dita ot oxygen transformation type before submitting check the preview tab above to verify the xml markup appears correctly and remember you can edit the description later to add information
0
96,760
12,156,291,166
IssuesEvent
2020-04-25 16:42:09
DominicSherman/easy-budget-mobile
https://api.github.com/repos/DominicSherman/easy-budget-mobile
closed
Create UI animations for selection and loading
design enhancement
This is the definition of a nice-to-have, but minor animations on elements can really increase the pleasantness of the user's experience. I'll have to research what works with implementation as well.
1.0
Create UI animations for selection and loading - This is the definition of a nice-to-have, but minor animations on elements can really increase the pleasantness of the user's experience. I'll have to research what works with implementation as well.
non_priority
create ui animations for selection and loading this is the definition of a nice to have but minor animations on elements can really increase the pleasantness of the user s experience i ll have to research what works with implementation as well
0
91,725
26,473,006,246
IssuesEvent
2023-01-17 09:02:27
PaddlePaddle/Paddle
https://api.github.com/repos/PaddlePaddle/Paddle
closed
Building an aarch64 package from develop-branch source produces version 0.0.0; which part of the code needs to change to set the version?
status/new-issue type/build status/close
### Issue Description I built an aarch64 package from develop-branch source; after packaging, the package version shows 0.0.0. After installing the whl, `pip list` also shows paddle's version as 0.0.0. How should I resolve this? Thanks. ### Version & Environment Information Operating system: UnionTech UOS V20 Server Edition CPU: Kunpeng 920 CPU architecture: aarch64
1.0
Building an aarch64 package from develop-branch source produces version 0.0.0; which part of the code needs to change to set the version? - ### Issue Description I built an aarch64 package from develop-branch source; after packaging, the package version shows 0.0.0. After installing the whl, `pip list` also shows paddle's version as 0.0.0. How should I resolve this? Thanks. ### Version & Environment Information Operating system: UnionTech UOS V20 Server Edition CPU: Kunpeng 920 CPU architecture: aarch64
non_priority
which part of the code can be modified to change the version issue description after installing the whl pip how should i resolve this thanks version environment information operating system server edition cpu cpu architecture
0
133,141
18,836,477,323
IssuesEvent
2021-11-11 01:57:27
CSBSJU-CS330-F21/ChefsKiss
https://api.github.com/repos/CSBSJU-CS330-F21/ChefsKiss
opened
Design Pattern - Builder
Design Patterns
Using the Builder design pattern, we will be able to better organize our homepage. The organization of Android's activities is somewhat set up in a way that has depth from the homepage. <img width="433" alt="Screen Shot 2021-11-10 at 7 57 13 PM" src="https://user-images.githubusercontent.com/81580099/141223312-7437e747-f9d8-405b-864a-a8afac1837b6.png"> Link to more info on Builder: https://sourcemaking.com/design_patterns/builder
1.0
Design Pattern - Builder - Using the Builder design pattern, we will be able to better organize our homepage. The organization of Android's activities is somewhat set up in a way that has depth from the homepage. <img width="433" alt="Screen Shot 2021-11-10 at 7 57 13 PM" src="https://user-images.githubusercontent.com/81580099/141223312-7437e747-f9d8-405b-864a-a8afac1837b6.png"> Link to more info on Builder: https://sourcemaking.com/design_patterns/builder
non_priority
design pattern builder using the builder design pattern we will be able to better organize our homepage the organization of androids activities is somewhat setup in a way that has depth from the homepage img width alt screen shot at pm src link to more info on builder
0
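The Builder pattern linked in the record above can be sketched in a few lines of JavaScript. The `ScreenBuilder` name and its fields are invented for illustration; this is not code from the ChefsKiss app.

```javascript
// Minimal Builder sketch: assemble a screen description step by step,
// then produce a frozen result. All names here are illustrative only.
class ScreenBuilder {
  constructor(title) {
    this.screen = { title, sections: [] };
  }
  addSection(name) {
    this.screen.sections.push(name);
    return this; // returning `this` enables fluent chaining
  }
  build() {
    return Object.freeze({ ...this.screen });
  }
}

const home = new ScreenBuilder('Home')
  .addSection('budgets')
  .addSection('recent expenses')
  .build();

console.log(home.title, home.sections);
```

The fluent `addSection` chain is what makes the pattern attractive for a homepage with nested destinations: each screen's composition is stated in one readable expression instead of scattered setup code.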
226,716
24,996,511,976
IssuesEvent
2022-11-03 01:10:33
hzhaoCS/juice-shop
https://api.github.com/repos/hzhaoCS/juice-shop
opened
socket.io-2.1.13.tgz: 1 vulnerabilities (highest severity is: 9.8)
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-2.1.13.tgz</b></p></summary> <p></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p> <p> <p>Found in HEAD commit: <a href="https://github.com/hzhaoCS/juice-shop/commit/8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe">8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (socket.io version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-2421](https://www.mend.io/vulnerability-database/CVE-2022-2421) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | socket.io-parser-4.1.2.tgz | Transitive | N/A* | &#10060; | <p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. 
Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p> ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-2421</summary> ### Vulnerable Library - <b>socket.io-parser-4.1.2.tgz</b></p> <p>socket.io protocol parser</p> <p>Library home page: <a href="https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.1.2.tgz">https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.1.2.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p> <p> Dependency Hierarchy: - socket.io-2.1.13.tgz (Root Library) - socket.io-parser-3.0.0.tgz - :x: **socket.io-parser-4.1.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/hzhaoCS/juice-shop/commit/8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe">8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Due to improper type validation in attachment parsing the Socket.io js library, it is possible to overwrite the _placeholder object which allows an attacker to place references to functions at arbitrary places in the resulting query object. <p>Publish Date: 2022-10-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-2421>CVE-2022-2421</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>9.8</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://csirt.divd.nl/cases/DIVD-2022-00045/">https://csirt.divd.nl/cases/DIVD-2022-00045/</a></p> <p>Release Date: 2022-10-26</p> <p>Fix Resolution: socket.io-parser - 4.0.5,4.2.1</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
True
socket.io-2.1.13.tgz: 1 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-2.1.13.tgz</b></p></summary> <p></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p> <p> <p>Found in HEAD commit: <a href="https://github.com/hzhaoCS/juice-shop/commit/8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe">8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (socket.io version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-2421](https://www.mend.io/vulnerability-database/CVE-2022-2421) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | socket.io-parser-4.1.2.tgz | Transitive | N/A* | &#10060; | <p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. 
Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p> ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-2421</summary> ### Vulnerable Library - <b>socket.io-parser-4.1.2.tgz</b></p> <p>socket.io protocol parser</p> <p>Library home page: <a href="https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.1.2.tgz">https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.1.2.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p> <p> Dependency Hierarchy: - socket.io-2.1.13.tgz (Root Library) - socket.io-parser-3.0.0.tgz - :x: **socket.io-parser-4.1.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/hzhaoCS/juice-shop/commit/8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe">8c9b0228ffbc4bbd54967e5ee1dcb66ebde454fe</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Due to improper type validation in attachment parsing the Socket.io js library, it is possible to overwrite the _placeholder object which allows an attacker to place references to functions at arbitrary places in the resulting query object. <p>Publish Date: 2022-10-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-2421>CVE-2022-2421</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>9.8</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://csirt.divd.nl/cases/DIVD-2022-00045/">https://csirt.divd.nl/cases/DIVD-2022-00045/</a></p> <p>Release Date: 2022-10-26</p> <p>Fix Resolution: socket.io-parser - 4.0.5,4.2.1</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
non_priority
socket io tgz vulnerabilities highest severity is vulnerable library socket io tgz path to dependency file package json path to vulnerable library node modules socket io parser package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in socket io version remediation available high socket io parser tgz transitive n a for some transitive vulnerabilities there is no version of direct dependency with a fix check the section details below to see if there is a version of transitive dependency where vulnerability is fixed details cve vulnerable library socket io parser tgz socket io protocol parser library home page a href path to dependency file package json path to vulnerable library node modules socket io parser package json dependency hierarchy socket io tgz root library socket io parser tgz x socket io parser tgz vulnerable library found in head commit a href found in base branch master vulnerability details due to improper type validation in attachment parsing the socket io js library it is possible to overwrite the placeholder object which allows an attacker to place references to functions at arbitrary places in the resulting query object publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution socket io parser step up your open source security game with mend
0
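The vulnerability class described in CVE-2022-2421 above, an attachment placeholder whose index field is not type-checked before use, can be illustrated with a simplified reconstructor. This is a sketch of the bug pattern and its hardened check, not the actual socket.io-parser source.

```javascript
// Sketch: binary attachments arrive separately and each
// {_placeholder: true, num: i} marker in the packet is replaced by
// attachments[i]. If `num` is not validated as an in-range integer,
// a crafted string value can index arbitrary properties instead.
function reconstruct(packet, attachments) {
  if (packet && packet._placeholder === true) {
    // Hardened check: require an in-range integer index.
    if (typeof packet.num === 'number' && Number.isInteger(packet.num) &&
        packet.num >= 0 && packet.num < attachments.length) {
      return attachments[packet.num];
    }
    throw new Error('illegal attachment index');
  }
  if (packet && typeof packet === 'object') {
    for (const key of Object.keys(packet)) {
      packet[key] = reconstruct(packet[key], attachments);
    }
  }
  return packet;
}

console.log(reconstruct({ data: { _placeholder: true, num: 0 } }, ['att']));
```

The fixed socket.io-parser releases noted above (4.0.5 and 4.2.1) add validation of this kind at the parsing layer.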
15,350
8,852,562,976
IssuesEvent
2019-01-08 18:42:52
webhintio/hint
https://api.github.com/repos/webhintio/hint
reopened
Add rule(s) to check for redirects
area:hint help wanted hint-category:performance
Possible checks: - [X] Multiple redirect (bad for performance) (#641) - [ ] Redirects from HTTPS to HTTP - [ ] Client-side redirects (meta or JavaScript) - [ ] Other?
True
Add rule(s) to check for redirects - Possible checks: - [X] Multiple redirect (bad for performance) (#641) - [ ] Redirects from HTTPS to HTTP - [ ] Client-side redirects (meta or JavaScript) - [ ] Other?
non_priority
add rule s to check for redirects possible checks multiple redirect bad for performance redirects from https to http client side redirects meta or javascript other
0
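The proposed checks can be prototyped over a recorded redirect chain. This sketch is illustrative only and does not use webhint's actual hint API:

```javascript
// Inspect an ordered list of URLs visited while following redirects:
// flag chains longer than `maxHops` and any HTTPS -> HTTP downgrade.
function checkRedirectChain(urls, maxHops = 1) {
  const problems = [];
  if (urls.length - 1 > maxHops) {
    problems.push(`chain of ${urls.length - 1} redirects (max ${maxHops})`);
  }
  for (let i = 1; i < urls.length; i++) {
    const from = new URL(urls[i - 1]);
    const to = new URL(urls[i]);
    if (from.protocol === 'https:' && to.protocol === 'http:') {
      problems.push(`insecure redirect: ${urls[i - 1]} -> ${urls[i]}`);
    }
  }
  return problems;
}

console.log(checkRedirectChain(['https://a.example/', 'http://a.example/']));
```

Client-side redirects (meta refresh or JavaScript) would need a different detection path, since they never appear in the HTTP response chain.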
396,920
27,141,615,220
IssuesEvent
2023-02-16 16:43:59
chartjs/Chart.js
https://api.github.com/repos/chartjs/Chart.js
opened
Docs incomplete, no code examples.
type: documentation
### Documentation Is: - [X] Missing or needed? - [X] Confusing - [ ] Not sure? ### Please Explain in Detail... There are no full examples of how to use this library and the code that has been provided is incomplete. I am making a simple app and I just need to display a basic array as a line chart. None of the examples are copy and paste-able or really explain how to use Chart.js Example: https://www.chartjs.org/docs/latest/samples/line/stepped.html You've just got random code snippets on the page, this may make sense to the developers but it's useless for people who are new to this library. I've been writing JS for +10 years, what am I meant to do with this? This isn't a code example, you've just defined an object called config... How does this help me implement ChartJS into my project? ```js const config = { type: 'line', data: data, options: { responsive: true, interaction: { intersect: false, axis: 'x' }, plugins: { title: { display: true, text: (ctx) => 'Step ' + ctx.chart.data.datasets[0].stepped + ' Interpolation', } } } }; ``` ### Your Proposal for Changes Copy+Paste-able code examples. ### Example _No response_
1.0
Docs incomplete, no code examples. - ### Documentation Is: - [X] Missing or needed? - [X] Confusing - [ ] Not sure? ### Please Explain in Detail... There are no full examples of how to use this library and the code that has been provided is incomplete. I am making a simple app and I just need to display a basic array as a line chart. None of the examples are copy and paste-able or really explain how to use Chart.js Example: https://www.chartjs.org/docs/latest/samples/line/stepped.html You've just got random code snippets on the page, this may make sense to the developers but it's useless for people who are new to this library. I've been writing JS for +10 years, what am I meant to do with this? This isn't a code example, you've just defined an object called config... How does this help me implement ChartJS into my project? ```js const config = { type: 'line', data: data, options: { responsive: true, interaction: { intersect: false, axis: 'x' }, plugins: { title: { display: true, text: (ctx) => 'Step ' + ctx.chart.data.datasets[0].stepped + ' Interpolation', } } } }; ``` ### Your Proposal for Changes Copy+Paste-able code examples. ### Example _No response_
non_priority
docs incomplete no code examples documentation is missing or needed confusing not sure please explain in detail there are no full examples of how to use this library and the code that has been provided is incomplete i am making a simple app and i just need to display a basic array as a line chart none of the examples are copy and paste able or really explain how to use chart js example you ve just got random code snippets on the page this may make sense to the developers but it s useless for people who are new to this library i ve been writing js for years what am i meant to do with this this isn t a code example you ve just defined an object called config how does this help me implement chartjs into my project js const config type line data data options responsive true interaction intersect false axis x plugins title display true text ctx step ctx chart data datasets stepped interpolation your proposal for changes copy paste able code examples example no response
0
33,300
4,467,483,789
IssuesEvent
2016-08-25 05:02:35
owncloud/core
https://api.github.com/repos/owncloud/core
closed
favicon.ico from custom theme is not applied
bug design feature:theming
<!-- Thanks for reporting issues back to ownCloud! This is the issue tracker of ownCloud, if you have any support question please check out https://owncloud.org/support This is the bug tracker for the Server component. Find other components at https://github.com/owncloud/core/blob/master/CONTRIBUTING.md#guidelines To make it possible for us to help you please fill out below information carefully. --> ### Steps to reproduce 1. Edit config.php to specify 'theme' => 'example', ### Expected behaviour favicon will be served from "/themes/example/core/img/favicon.ico": `<link rel="shortcut icon" href="/themes/example/core/img/favicon.ico">` ### Actual behaviour favicon.ico is served from /core/img: `<link rel="shortcut icon" href="/core/img/favicon.ico">` ### Additional Troubleshooting steps / comments If I replace 'favicon.ico' with 'favicon.png' in core/templates/layout.user.php and refresh my browser, the correct file is served. `<link rel="shortcut icon" href="/themes/example/core/img/favicon.png">` I have confirmed that favicon.ico does exist in the example theme, and is accessible from the web: https://my-owncloud-server/themes/example/core/img/favicon.ico It seems that this line determines the correct theme path for for .png files but not for .ico files: `print_unescaped(image_path($_['appid'], 'favicon.ico')); /* IE11+ supports png */ ?>">` ### Server configuration **Operating system**: Ubuntu 14.04 **Web server:** Apache/2.4.7 (Ubuntu) **Database:** 5.5.47-0ubuntu0.14.04.1 (Ubuntu) **PHP version:** PHP 5.5.9-1ubuntu4.14 **ownCloud version:** (see ownCloud admin page) - version: 9.0.0.19 **Updated from an older ownCloud or fresh install:** - updated from 8.2 **Where did you install ownCloud from:** Repositories **Signing status (ownCloud 9.0 and above):** ``` Technical information ===================== The following list covers which files have failed the integrity check. 
Please read the previous linked documentation to learn more about the errors and how to fix them. Results ======= - core - INVALID_HASH - core/img/favicon.ico - lib/private/route/cachingrouter.php - EXTRA_FILE - lib/private/route/cachingrouter.php.org - files_external - INVALID_HASH - lib/smb.php - user_external - INVALID_HASH - lib/imap.php Raw output ========== Array ( [core] => Array ( [INVALID_HASH] => Array ( [core/img/favicon.ico] => Array ( [expected] => 9894ca649dbbf6bcda4d77965744c2695ad071570bbedb9509db9747747f6f139b6ed9483360c607feff8eeb105b323fb8089b2b9d067d77b91530d81b09b898 [current] => 17bc4bb185e4b9c077d6d4ac40876bf81c4e9a87badeb05c074e1ba5a15df5d6e2d129793dbc45640b87cabe42ba4a15c8131cc7ac5392d09b4c56ecf6efa061 ) [lib/private/route/cachingrouter.php] => Array ( [expected] => d246e73235b17d79ec37f9d4fa2110c3b68cc4584893fe0be5c983f9dc2a65ae6e2098cbd4df0e45b090296be9f583dfe8ca5ed47fb880277e44279c80f127e7 [current] => 2468462b71bbcdf4e7b5a161ab4d8ef0330e06275ea611064130f8f0e4881072041399d87a34c5e930daf586d50cfca89247d95c38137f03dec45287558e9982 ) ) [EXTRA_FILE] => Array ( [lib/private/route/cachingrouter.php.org] => Array ( [expected] => [current] => ) ) ) [files_external] => Array ( [INVALID_HASH] => Array ( [lib/smb.php] => Array ( [expected] => 823db849be4a8e785848d760fddd79c008eeebbf2390dc821b391a634d8e49db61a0e3f641eafe71dea2686d86189d8d80ff2f467f674e27990916c22d225ecd [current] => b29a2d69106e703f1eb92244fa0b6d0adc3048ca82497b9b215b3ad1d0f45fb0a644420689fbee9a6b8d37d2f352c234e3a613dc690e396d5383757305ba46f4 ) ) ) [user_external] => Array ( [INVALID_HASH] => Array ( [lib/imap.php] => Array ( [expected] => d4ebccdc4a2f2a97456c666ef6f6608ed631ceaee3f605e270df88be08a1c3e2c3b6944f07a271612bc6fa79b708c0d9a2ee54b310e68255c415f46904665efa [current] => 8c8bd0fb53e94920c1b773dfdafb7fea7d223b9600ebcb328d72cbf36f232b5bc19ac6f2f13e81e226569cfc3e26ea7a73eb00da32bc306b216204a140c4d152 ) ) ) ) ``` Explanations of signature failures: - files_external/lib/smb.php 
customized to remove '@domain.tld' from users before mounting external share - user_external/lib/imap.php customized to select IMAP server by '@domain.tld' and add new users group 'domain.tld' automatically - core/img/favicon.ico customized for my installation - lib/private/route/cachingrouter.php [patched](https://github.com/owncloud/core/pull/23214/commits/c3c491689b2243b88fe88f4ffb1f8274575d2029) to fix a problem with the activity app when using memcached **List of activated apps:** ``` sudo -u www-data ./occ app:list Enabled: - activity: 2.2.1 - comments: 0.2 - dav: 0.1.5 - federatedfilesharing: 0.1.0 - federation: 0.0.4 - files: 1.4.4 - files_external: 0.5.2 - files_pdfviewer: 0.8 - files_sharing: 0.9.1 - files_videoplayer: 0.9.8 - firstrunwizard: 1.1 - gallery: 14.5.0 - notifications: 0.2.3 - provisioning_api: 0.4.1 - systemtags: 0.2 - templateeditor: 0.1 - updatenotification: 0.1.0 - user_external: 0.4 Disabled: - encryption - external - files_texteditor - files_trashbin - files_versions - user_ldap ``` **The content of config/config.php:** ``` sudo -u www-data ./occ config:list system { "system": { "instanceid": "ocojcz4c8jtx", "passwordsalt": "***REMOVED SENSITIVE VALUE***", "secret": "***REMOVED SENSITIVE VALUE***", "trusted_domains": [ "192.168.1.7", "REMOVED" ], "datadirectory": "\/vcloud\/data", "overwrite.cli.url": "", "dbtype": "mysql", "version": "9.0.0.19", "dbname": "owncloud", "dbhost": "localhost", "dbtableprefix": "oc_", "dbuser": "***REMOVED SENSITIVE VALUE***", "dbpassword": "***REMOVED SENSITIVE VALUE***", "installed": true, "theme": "example", "user_backends": [ { "class": "OC_User_IMAP", "arguments": [ "{mail06.aicr.org:993\/imap\/ssl\/novalidate-cert\/notls\/readonly}" ] }, { "class": "OC_User_IMAP", "arguments": [ "{office.mmsionline.us:993\/imap\/ssl\/novalidate-cert\/notls\/readonly}" ] } ], "maintenance": false, "forcessl": true, "forceSSLforSubdomains": true, "mail_smtpmode": "smtp", "mail_from_address": "vcloud-noreply", 
"mail_domain": "aicr.org", "mail_smtphost": "mail06.aicr.org", "loglevel": 3, "logtimezone": "America\/New_York", "log_authfailip": true, "log_rotate_size": 104857600, "memcache.local": "\\OC\\Memcache\\Memcached", "memcache.distributed": "\\OC\\Memcache\\Memcached", "memcached_servers": [ [ "localhost", 11211 ] ], "cache_path": "", "knowledgebaseenabled": true, "enable_avatars": true, "allow_user_to_change_display_name": true, "preview_libreoffice_path": "\/usr\/bin\/libreoffice", "enabledPreviewProviders": [ "OC\\Preview\\PNG", "OC\\Preview\\JPEG", "OC\\Preview\\GIF", "OC\\Preview\\BMP", "OC\\Preview\\XBitmap", "OC\\Preview\\MP3", "OC\\Preview\\TXT", "OC\\Preview\\MarkDown", "OC\\Preview\\MSOfficeDoc", "OC\\Preview\\PDF", "OC\\Preview\\SVG", "OC\\Preview\\Photoshop" ], "asset-pipeline.enabled": true, "apps_paths": [ { "path": "\/var\/www\/owncloud\/apps", "url": "\/apps", "writable": true }, { "path": "\/var\/www\/owncloud\/apps-aicr", "url": "\/apps-aicr", "writable": false } ], "xframe_restriction": true, "mail_smtpsecure": "ssl", "mail_smtpauthtype": "PLAIN", "mail_smtpport": "465", "mail_smtpauth": 1, "mail_smtpname": "***REMOVED SENSITIVE VALUE***", "mail_smtppassword": "***REMOVED SENSITIVE VALUE***", "updatechecker": false, "activity_expire_days": 3652 } } ``` **Are you using external storage, if yes which one:** local/smb/sftp/... SFTP, FTP, SMB/CIFS **Are you using encryption:** yes/no NO **Are you using an external user-backend, if yes which one:** LDAP/ActiveDirectory/Webdav/... 
IMAP ### Client configuration **Browser:** - Firefox 45.0.1 - Internet Explorer 11.162.10586.0 - Microsoft Edge 25.10586 / Microsoft EdgeHTML 13.10586 **Operating system:** Windows 10 Version 10.0.10586 ### Logs #### Web server error log ``` [Sun Apr 03 08:58:17.954598 2016] [mpm_prefork:notice] [pid 3522] AH00163: Apache/2.4.7 (Ubuntu) PHP/5.5.9-1ubuntu4.14 OpenSSL/1.0.1f configured -- resuming normal operations [Sun Apr 03 08:58:17.954645 2016] [core:notice] [pid 3522] AH00094: Command line: '/usr/sbin/apache2' ``` #### ownCloud log (data/owncloud.log) not useful #### Browser log not useful
1.0
favicon.ico from custom theme is not applied - <!-- Thanks for reporting issues back to ownCloud! This is the issue tracker of ownCloud, if you have any support question please check out https://owncloud.org/support This is the bug tracker for the Server component. Find other components at https://github.com/owncloud/core/blob/master/CONTRIBUTING.md#guidelines To make it possible for us to help you please fill out below information carefully. --> ### Steps to reproduce 1. Edit config.php to specify 'theme' => 'example', ### Expected behaviour favicon will be served from "/themes/example/core/img/favicon.ico": `<link rel="shortcut icon" href="/themes/example/core/img/favicon.ico">` ### Actual behaviour favicon.ico is served from /core/img: `<link rel="shortcut icon" href="/core/img/favicon.ico">` ### Additional Troubleshooting steps / comments If I replace 'favicon.ico' with 'favicon.png' in core/templates/layout.user.php and refresh my browser, the correct file is served. `<link rel="shortcut icon" href="/themes/example/core/img/favicon.png">` I have confirmed that favicon.ico does exist in the example theme, and is accessible from the web: https://my-owncloud-server/themes/example/core/img/favicon.ico It seems that this line determines the correct theme path for for .png files but not for .ico files: `print_unescaped(image_path($_['appid'], 'favicon.ico')); /* IE11+ supports png */ ?>">` ### Server configuration **Operating system**: Ubuntu 14.04 **Web server:** Apache/2.4.7 (Ubuntu) **Database:** 5.5.47-0ubuntu0.14.04.1 (Ubuntu) **PHP version:** PHP 5.5.9-1ubuntu4.14 **ownCloud version:** (see ownCloud admin page) - version: 9.0.0.19 **Updated from an older ownCloud or fresh install:** - updated from 8.2 **Where did you install ownCloud from:** Repositories **Signing status (ownCloud 9.0 and above):** ``` Technical information ===================== The following list covers which files have failed the integrity check. 
Please read the previous linked documentation to learn more about the errors and how to fix them. Results ======= - core - INVALID_HASH - core/img/favicon.ico - lib/private/route/cachingrouter.php - EXTRA_FILE - lib/private/route/cachingrouter.php.org - files_external - INVALID_HASH - lib/smb.php - user_external - INVALID_HASH - lib/imap.php Raw output ========== Array ( [core] => Array ( [INVALID_HASH] => Array ( [core/img/favicon.ico] => Array ( [expected] => 9894ca649dbbf6bcda4d77965744c2695ad071570bbedb9509db9747747f6f139b6ed9483360c607feff8eeb105b323fb8089b2b9d067d77b91530d81b09b898 [current] => 17bc4bb185e4b9c077d6d4ac40876bf81c4e9a87badeb05c074e1ba5a15df5d6e2d129793dbc45640b87cabe42ba4a15c8131cc7ac5392d09b4c56ecf6efa061 ) [lib/private/route/cachingrouter.php] => Array ( [expected] => d246e73235b17d79ec37f9d4fa2110c3b68cc4584893fe0be5c983f9dc2a65ae6e2098cbd4df0e45b090296be9f583dfe8ca5ed47fb880277e44279c80f127e7 [current] => 2468462b71bbcdf4e7b5a161ab4d8ef0330e06275ea611064130f8f0e4881072041399d87a34c5e930daf586d50cfca89247d95c38137f03dec45287558e9982 ) ) [EXTRA_FILE] => Array ( [lib/private/route/cachingrouter.php.org] => Array ( [expected] => [current] => ) ) ) [files_external] => Array ( [INVALID_HASH] => Array ( [lib/smb.php] => Array ( [expected] => 823db849be4a8e785848d760fddd79c008eeebbf2390dc821b391a634d8e49db61a0e3f641eafe71dea2686d86189d8d80ff2f467f674e27990916c22d225ecd [current] => b29a2d69106e703f1eb92244fa0b6d0adc3048ca82497b9b215b3ad1d0f45fb0a644420689fbee9a6b8d37d2f352c234e3a613dc690e396d5383757305ba46f4 ) ) ) [user_external] => Array ( [INVALID_HASH] => Array ( [lib/imap.php] => Array ( [expected] => d4ebccdc4a2f2a97456c666ef6f6608ed631ceaee3f605e270df88be08a1c3e2c3b6944f07a271612bc6fa79b708c0d9a2ee54b310e68255c415f46904665efa [current] => 8c8bd0fb53e94920c1b773dfdafb7fea7d223b9600ebcb328d72cbf36f232b5bc19ac6f2f13e81e226569cfc3e26ea7a73eb00da32bc306b216204a140c4d152 ) ) ) ) ``` Explanations of signature failures: - files_external/lib/smb.php 
customized to remove '@domain.tld' from users before mounting external share - user_external/lib/imap.php customized to select IMAP server by '@domain.tld' and add new users group 'domain.tld' automatically - core/img/favicon.ico customized for my installation - lib/private/route/cachingrouter.php [patched](https://github.com/owncloud/core/pull/23214/commits/c3c491689b2243b88fe88f4ffb1f8274575d2029) to fix a problem with the activity app when using memcached **List of activated apps:** ``` sudo -u www-data ./occ app:list Enabled: - activity: 2.2.1 - comments: 0.2 - dav: 0.1.5 - federatedfilesharing: 0.1.0 - federation: 0.0.4 - files: 1.4.4 - files_external: 0.5.2 - files_pdfviewer: 0.8 - files_sharing: 0.9.1 - files_videoplayer: 0.9.8 - firstrunwizard: 1.1 - gallery: 14.5.0 - notifications: 0.2.3 - provisioning_api: 0.4.1 - systemtags: 0.2 - templateeditor: 0.1 - updatenotification: 0.1.0 - user_external: 0.4 Disabled: - encryption - external - files_texteditor - files_trashbin - files_versions - user_ldap ``` **The content of config/config.php:** ``` sudo -u www-data ./occ config:list system { "system": { "instanceid": "ocojcz4c8jtx", "passwordsalt": "***REMOVED SENSITIVE VALUE***", "secret": "***REMOVED SENSITIVE VALUE***", "trusted_domains": [ "192.168.1.7", "REMOVED" ], "datadirectory": "\/vcloud\/data", "overwrite.cli.url": "", "dbtype": "mysql", "version": "9.0.0.19", "dbname": "owncloud", "dbhost": "localhost", "dbtableprefix": "oc_", "dbuser": "***REMOVED SENSITIVE VALUE***", "dbpassword": "***REMOVED SENSITIVE VALUE***", "installed": true, "theme": "example", "user_backends": [ { "class": "OC_User_IMAP", "arguments": [ "{mail06.aicr.org:993\/imap\/ssl\/novalidate-cert\/notls\/readonly}" ] }, { "class": "OC_User_IMAP", "arguments": [ "{office.mmsionline.us:993\/imap\/ssl\/novalidate-cert\/notls\/readonly}" ] } ], "maintenance": false, "forcessl": true, "forceSSLforSubdomains": true, "mail_smtpmode": "smtp", "mail_from_address": "vcloud-noreply", 
"mail_domain": "aicr.org", "mail_smtphost": "mail06.aicr.org", "loglevel": 3, "logtimezone": "America\/New_York", "log_authfailip": true, "log_rotate_size": 104857600, "memcache.local": "\\OC\\Memcache\\Memcached", "memcache.distributed": "\\OC\\Memcache\\Memcached", "memcached_servers": [ [ "localhost", 11211 ] ], "cache_path": "", "knowledgebaseenabled": true, "enable_avatars": true, "allow_user_to_change_display_name": true, "preview_libreoffice_path": "\/usr\/bin\/libreoffice", "enabledPreviewProviders": [ "OC\\Preview\\PNG", "OC\\Preview\\JPEG", "OC\\Preview\\GIF", "OC\\Preview\\BMP", "OC\\Preview\\XBitmap", "OC\\Preview\\MP3", "OC\\Preview\\TXT", "OC\\Preview\\MarkDown", "OC\\Preview\\MSOfficeDoc", "OC\\Preview\\PDF", "OC\\Preview\\SVG", "OC\\Preview\\Photoshop" ], "asset-pipeline.enabled": true, "apps_paths": [ { "path": "\/var\/www\/owncloud\/apps", "url": "\/apps", "writable": true }, { "path": "\/var\/www\/owncloud\/apps-aicr", "url": "\/apps-aicr", "writable": false } ], "xframe_restriction": true, "mail_smtpsecure": "ssl", "mail_smtpauthtype": "PLAIN", "mail_smtpport": "465", "mail_smtpauth": 1, "mail_smtpname": "***REMOVED SENSITIVE VALUE***", "mail_smtppassword": "***REMOVED SENSITIVE VALUE***", "updatechecker": false, "activity_expire_days": 3652 } } ``` **Are you using external storage, if yes which one:** local/smb/sftp/... SFTP, FTP, SMB/CIFS **Are you using encryption:** yes/no NO **Are you using an external user-backend, if yes which one:** LDAP/ActiveDirectory/Webdav/... 
IMAP ### Client configuration **Browser:** - Firefox 45.0.1 - Internet Explorer 11.162.10586.0 - Microsoft Edge 25.10586 / Microsoft EdgeHTML 13.10586 **Operating system:** Windows 10 Version 10.0.10586 ### Logs #### Web server error log ``` [Sun Apr 03 08:58:17.954598 2016] [mpm_prefork:notice] [pid 3522] AH00163: Apache/2.4.7 (Ubuntu) PHP/5.5.9-1ubuntu4.14 OpenSSL/1.0.1f configured -- resuming normal operations [Sun Apr 03 08:58:17.954645 2016] [core:notice] [pid 3522] AH00094: Command line: '/usr/sbin/apache2' ``` #### ownCloud log (data/owncloud.log) not useful #### Browser log not useful
non_priority
favicon ico from custom theme is not applied thanks for reporting issues back to owncloud this is the issue tracker of owncloud if you have any support question please check out this is the bug tracker for the server component find other components at to make it possible for us to help you please fill out below information carefully steps to reproduce edit config php to specify theme example expected behaviour favicon will be served from themes example core img favicon ico actual behaviour favicon ico is served from core img additional troubleshooting steps comments if i replace favicon ico with favicon png in core templates layout user php and refresh my browser the correct file is served i have confirmed that favicon ico does exist in the example theme and is accessible from the web it seems that this line determines the correct theme path for for png files but not for ico files print unescaped image path favicon ico supports png server configuration operating system ubuntu web server apache ubuntu database ubuntu php version php owncloud version see owncloud admin page version updated from an older owncloud or fresh install updated from where did you install owncloud from repositories signing status owncloud and above technical information the following list covers which files have failed the integrity check please read the previous linked documentation to learn more about the errors and how to fix them results core invalid hash core img favicon ico lib private route cachingrouter php extra file lib private route cachingrouter php org files external invalid hash lib smb php user external invalid hash lib imap php raw output array array array array array array array array array array array array array explanations of signature failures files external lib smb php customized to remove domain tld from users before mounting external share user external lib imap php customized to select imap server by domain tld and add new users group domain tld automatically core 
img favicon ico customized for my installation lib private route cachingrouter php to fix a problem with the activity app when using memcached list of activated apps sudo u www data occ app list enabled activity comments dav federatedfilesharing federation files files external files pdfviewer files sharing files videoplayer firstrunwizard gallery notifications provisioning api systemtags templateeditor updatenotification user external disabled encryption external files texteditor files trashbin files versions user ldap the content of config config php sudo u www data occ config list system system instanceid passwordsalt removed sensitive value secret removed sensitive value trusted domains removed datadirectory vcloud data overwrite cli url dbtype mysql version dbname owncloud dbhost localhost dbtableprefix oc dbuser removed sensitive value dbpassword removed sensitive value installed true theme example user backends class oc user imap arguments aicr org imap ssl novalidate cert notls readonly class oc user imap arguments office mmsionline us imap ssl novalidate cert notls readonly maintenance false forcessl true forcesslforsubdomains true mail smtpmode smtp mail from address vcloud noreply mail domain aicr org mail smtphost aicr org loglevel logtimezone america new york log authfailip true log rotate size memcache local oc memcache memcached memcache distributed oc memcache memcached memcached servers localhost cache path knowledgebaseenabled true enable avatars true allow user to change display name true preview libreoffice path usr bin libreoffice enabledpreviewproviders oc preview png oc preview jpeg oc preview gif oc preview bmp oc preview xbitmap oc preview oc preview txt oc preview markdown oc preview msofficedoc oc preview pdf oc preview svg oc preview photoshop asset pipeline enabled true apps paths path var www owncloud apps url apps writable true path var www owncloud apps aicr url apps aicr writable false xframe restriction true mail smtpsecure ssl mail 
smtpauthtype plain mail smtpport mail smtpauth mail smtpname removed sensitive value mail smtppassword removed sensitive value updatechecker false activity expire days are you using external storage if yes which one local smb sftp sftp ftp smb cifs are you using encryption yes no no are you using an external user backend if yes which one ldap activedirectory webdav imap client configuration browser firefox internet explorer microsoft edge microsoft edgehtml operating system windows version logs web server error log apache ubuntu php openssl configured resuming normal operations command line usr sbin owncloud log data owncloud log not useful browser log not useful
0
34,697
4,940,295,649
IssuesEvent
2016-11-29 16:28:45
openbakery/gradle-xcodePlugin
https://api.github.com/repos/openbakery/gradle-xcodePlugin
closed
Unit tests without simulator
bug status:testing
It's possible to run Unit tests with a real device? configuration: ``` buildscript { repositories { maven { url 'https://plugins.gradle.org/m2/' } // Mirrors jcenter() and mavenCentral() maven { url 'http://repository.openbakery.org/' } } dependencies { // iOS gradle plugin classpath 'org.openbakery:xcode-plugin:0.13.0' // Code Coverage classpath 'org.kt3k.gradle.plugin:coveralls-gradle-plugin:2.6.3' // Check for plugin updates classpath 'com.github.ben-manes:gradle-versions-plugin:0.12.0' } } apply plugin: 'org.openbakery.xcode-plugin' apply plugin: 'com.github.kt3k.coveralls' apply plugin: 'com.github.ben-manes.versions' xcodebuild { scheme 'MyApp' target 'MyApp' configuration 'Debug' destination { platform = 'iOS' id = '33333333333333333333333' simulator = 'false' } signing { certificateURI = 'file:///Users/david/Desktop/Certifiicates.p12' certificatePassword = "realSafePassword" mobileProvisionURI = 'file:///Users/david/Desktop/Profile.mobileprovision' } coverage { outputFormat 'xml' exclude '.*h$|.*UnitTests.*m$' } } coveralls { coberturaReportPath './build/coverage/cobertura.xml' } ``` I'm trying but I have this result: ``` 17:59:03.911 [INFO] [org.openbakery.CommandRunner] 17:59:03.912 [INFO] [org.openbakery.CommandRunner] Testing failed: 17:59:03.912 [INFO] [org.openbakery.CommandRunner] Test target AppTests encountered an error (Early unexpected exit, operation never finished bootstrapping - no restart will be attempted) 17:59:03.913 [INFO] [org.openbakery.CommandRunner] ** TEST FAILED ** 17:59:03.913 [DEBUG] [org.gradle.api.Project] printFailureOutput 17:59:03.918 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] getAvailableDestinations 17:59:03.918 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] is a device build so add all given device destinations 17:59:03.919 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] availableDestinations: [Destination{platform='iOS', name='null', arch='null', id='6dca14b53667af96e9bd8e2278e732b361e196a2', os='null'}] 
17:59:03.926 [LIFECYCLE] [org.openbakery.XcodeBuildTask] 0 tests completed 17:59:03.927 [INFO] [org.openbakery.CommandRunner] 17:59:03.927 [INFO] [org.openbakery.CommandRunner] 17:59:03.927 [DEBUG] [org.openbakery.CommandRunner] Exit Code: 65 17:59:03.934 [DEBUG] [org.gradle.api.Task] parse result from: /Users/david/Desktop/HeatHub_test/HeatHubiOS/build/test/xcodebuild-output.txt 17:59:03.946 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] getAvailableDestinations 17:59:03.947 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] is a device build so add all given device destinations 17:59:03.947 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] availableDestinations: [Destination{platform='iOS', name='null', arch='null', id='6dca14b53667af96e9bd8e2278e732b361e196a2', os='null'}] 17:59:03.949 [DEBUG] [org.gradle.api.Task] store to test-result.xml 17:59:04.267 [LIFECYCLE] [org.gradle.api.Task] 17:59:04.270 [LIFECYCLE] [org.gradle.api.Task] 0 tests were successful, and 0 failed 17:59:04.270 [DEBUG] [org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter] Finished executing task ':xcodetest' 17:59:04.271 [LIFECYCLE] [class org.gradle.TaskExecutionLogger] :xcodetest FAILED 17:59:04.271 [INFO] [org.gradle.execution.taskgraph.AbstractTaskPlanExecutor] :xcodetest (Thread[main,5,main]) completed. Took 6.341 secs. 17:59:04.271 [DEBUG] [org.gradle.execution.taskgraph.AbstractTaskPlanExecutor] Task worker [Thread[main,5,main]] finished, busy: 6.823 secs, idle: 0.001 secs 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] FAILURE: Build failed with an exception. 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] * What went wrong: 17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] Execution failed for task ':xcodetest'. 
17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] > java.lang.Exception: Not all unit tests are successful! 17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] 17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] * Try: 17:59:04.280 [ERROR] [org.gradle.BuildExceptionReporter] Run with --stacktrace option to get the stack trace. 17:59:04.280 [LIFECYCLE] [org.gradle.BuildResultLogger] 17:59:04.280 [LIFECYCLE] [org.gradle.BuildResultLogger] BUILD FAILED 17:59:04.280 [LIFECYCLE] [org.gradle.BuildResultLogger] 17:59:04.281 [LIFECYCLE] [org.gradle.BuildResultLogger] Total time: 10.482 secs ```
1.0
Unit tests without simulator - It's possible to run Unit tests with a real device? configuration: ``` buildscript { repositories { maven { url 'https://plugins.gradle.org/m2/' } // Mirrors jcenter() and mavenCentral() maven { url 'http://repository.openbakery.org/' } } dependencies { // iOS gradle plugin classpath 'org.openbakery:xcode-plugin:0.13.0' // Code Coverage classpath 'org.kt3k.gradle.plugin:coveralls-gradle-plugin:2.6.3' // Check for plugin updates classpath 'com.github.ben-manes:gradle-versions-plugin:0.12.0' } } apply plugin: 'org.openbakery.xcode-plugin' apply plugin: 'com.github.kt3k.coveralls' apply plugin: 'com.github.ben-manes.versions' xcodebuild { scheme 'MyApp' target 'MyApp' configuration 'Debug' destination { platform = 'iOS' id = '33333333333333333333333' simulator = 'false' } signing { certificateURI = 'file:///Users/david/Desktop/Certifiicates.p12' certificatePassword = "realSafePassword" mobileProvisionURI = 'file:///Users/david/Desktop/Profile.mobileprovision' } coverage { outputFormat 'xml' exclude '.*h$|.*UnitTests.*m$' } } coveralls { coberturaReportPath './build/coverage/cobertura.xml' } ``` I'm trying but I have this result: ``` 17:59:03.911 [INFO] [org.openbakery.CommandRunner] 17:59:03.912 [INFO] [org.openbakery.CommandRunner] Testing failed: 17:59:03.912 [INFO] [org.openbakery.CommandRunner] Test target AppTests encountered an error (Early unexpected exit, operation never finished bootstrapping - no restart will be attempted) 17:59:03.913 [INFO] [org.openbakery.CommandRunner] ** TEST FAILED ** 17:59:03.913 [DEBUG] [org.gradle.api.Project] printFailureOutput 17:59:03.918 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] getAvailableDestinations 17:59:03.918 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] is a device build so add all given device destinations 17:59:03.919 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] availableDestinations: [Destination{platform='iOS', name='null', arch='null', 
id='6dca14b53667af96e9bd8e2278e732b361e196a2', os='null'}] 17:59:03.926 [LIFECYCLE] [org.openbakery.XcodeBuildTask] 0 tests completed 17:59:03.927 [INFO] [org.openbakery.CommandRunner] 17:59:03.927 [INFO] [org.openbakery.CommandRunner] 17:59:03.927 [DEBUG] [org.openbakery.CommandRunner] Exit Code: 65 17:59:03.934 [DEBUG] [org.gradle.api.Task] parse result from: /Users/david/Desktop/HeatHub_test/HeatHubiOS/build/test/xcodebuild-output.txt 17:59:03.946 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] getAvailableDestinations 17:59:03.947 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] is a device build so add all given device destinations 17:59:03.947 [DEBUG] [org.openbakery.XcodeBuildPluginExtension] availableDestinations: [Destination{platform='iOS', name='null', arch='null', id='6dca14b53667af96e9bd8e2278e732b361e196a2', os='null'}] 17:59:03.949 [DEBUG] [org.gradle.api.Task] store to test-result.xml 17:59:04.267 [LIFECYCLE] [org.gradle.api.Task] 17:59:04.270 [LIFECYCLE] [org.gradle.api.Task] 0 tests were successful, and 0 failed 17:59:04.270 [DEBUG] [org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter] Finished executing task ':xcodetest' 17:59:04.271 [LIFECYCLE] [class org.gradle.TaskExecutionLogger] :xcodetest FAILED 17:59:04.271 [INFO] [org.gradle.execution.taskgraph.AbstractTaskPlanExecutor] :xcodetest (Thread[main,5,main]) completed. Took 6.341 secs. 17:59:04.271 [DEBUG] [org.gradle.execution.taskgraph.AbstractTaskPlanExecutor] Task worker [Thread[main,5,main]] finished, busy: 6.823 secs, idle: 0.001 secs 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] FAILURE: Build failed with an exception. 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] 17:59:04.278 [ERROR] [org.gradle.BuildExceptionReporter] * What went wrong: 17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] Execution failed for task ':xcodetest'. 
17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] > java.lang.Exception: Not all unit tests are successful! 17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] 17:59:04.279 [ERROR] [org.gradle.BuildExceptionReporter] * Try: 17:59:04.280 [ERROR] [org.gradle.BuildExceptionReporter] Run with --stacktrace option to get the stack trace. 17:59:04.280 [LIFECYCLE] [org.gradle.BuildResultLogger] 17:59:04.280 [LIFECYCLE] [org.gradle.BuildResultLogger] BUILD FAILED 17:59:04.280 [LIFECYCLE] [org.gradle.BuildResultLogger] 17:59:04.281 [LIFECYCLE] [org.gradle.BuildResultLogger] Total time: 10.482 secs ```
non_priority
unit tests without simulator it s possible to run unit tests with a real device configuration buildscript repositories maven url mirrors jcenter and mavencentral maven url dependencies ios gradle plugin classpath org openbakery xcode plugin code coverage classpath org gradle plugin coveralls gradle plugin check for plugin updates classpath com github ben manes gradle versions plugin apply plugin org openbakery xcode plugin apply plugin com github coveralls apply plugin com github ben manes versions xcodebuild scheme myapp target myapp configuration debug destination platform ios id simulator false signing certificateuri file users david desktop certifiicates certificatepassword realsafepassword mobileprovisionuri file users david desktop profile mobileprovision coverage outputformat xml exclude h unittests m coveralls coberturareportpath build coverage cobertura xml i m trying but i have this result testing failed test target apptests encountered an error early unexpected exit operation never finished bootstrapping no restart will be attempted test failed printfailureoutput getavailabledestinations is a device build so add all given device destinations availabledestinations tests completed exit code parse result from users david desktop heathub test heathubios build test xcodebuild output txt getavailabledestinations is a device build so add all given device destinations availabledestinations store to test result xml tests were successful and failed finished executing task xcodetest xcodetest failed xcodetest thread completed took secs task worker finished busy secs idle secs failure build failed with an exception what went wrong execution failed for task xcodetest java lang exception not all unit tests are successful try run with stacktrace option to get the stack trace build failed total time secs
0
19,599
11,254,098,455
IssuesEvent
2020-01-11 20:56:26
cityofaustin/atd-data-tech
https://api.github.com/repos/cityofaustin/atd-data-tech
closed
[Bug] Task Orders Status is Not Current in Data Tracker
Project: Warehouse Inventory Service: Dev Type: Bug Report Workgroup: AMD
The `ACTIVE` field for task orders in the Data Tracker is not being updated. The issue is that we're not detecting this change in the integration script. [Here](https://github.com/cityofaustin/atd-data-publishing/blob/bf44df63f3f9e7c63b878667b65aa290fe9d9f74/transportation-data-publishing/data_tracker/task_orders.py#L48) is where the script needs to be updated to check for changes in status.
1.0
[Bug] Task Orders Status is Not Current in Data Tracker - The `ACTIVE` field for task orders in the Data Tracker is not being updated. The issue is that we're not detecting this change in the integration script. [Here](https://github.com/cityofaustin/atd-data-publishing/blob/bf44df63f3f9e7c63b878667b65aa290fe9d9f74/transportation-data-publishing/data_tracker/task_orders.py#L48) is where the script needs to be updated to check for changes in status.
non_priority
task orders status is not current in data tracker the active field for task orders in the data tracker are not being updated the issue is that we re not detecting this change in the integration script is where the script needs to be updated to check for changes in status
0
327,726
24,149,892,043
IssuesEvent
2022-09-21 22:49:29
airbytehq/airbyte
https://api.github.com/repos/airbytehq/airbyte
opened
Firestore Connector: Documentation link broken
type/bug team/databases team/documentation
In the firestore connector the link to the documentation is broken and does not exist: https://airbyte.gitbook.io/airbyte/integrations/destinations/firestore
1.0
Firestore Connector: Documentation link broken - In the firestore connector the link to the documentation is broken and does not exist: https://airbyte.gitbook.io/airbyte/integrations/destinations/firestore
non_priority
firestore connector documentation link broken in the firestore connector the link to the documentation is broken and does not exist
0
284,993
31,023,087,298
IssuesEvent
2023-08-10 07:14:46
whitesource-ps/ws-ignore-alerts
https://api.github.com/repos/whitesource-ps/ws-ignore-alerts
closed
requests-2.27.1-py2.py3-none-any.whl: 1 vulnerabilities (highest severity is: 6.1) - autoclosed
Mend: dependency security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>requests-2.27.1-py2.py3-none-any.whl</b></p></summary> <p>Python HTTP for Humans.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl</a></p> <p> <p>Found in HEAD commit: <a href="https://github.com/whitesource-ps/ws-ignore-alerts/commit/f6d200a42276eabb72eb0817c525335804f501ae">f6d200a42276eabb72eb0817c525335804f501ae</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (requests version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2023-32681](https://www.mend.io/vulnerability-database/CVE-2023-32681) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Medium | 6.1 | requests-2.27.1-py2.py3-none-any.whl | Direct | requests -2.31.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' 
width=19 height=20> CVE-2023-32681</summary> ### Vulnerable Library - <b>requests-2.27.1-py2.py3-none-any.whl</b></p> <p>Python HTTP for Humans.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl</a></p> <p> Dependency Hierarchy: - :x: **requests-2.27.1-py2.py3-none-any.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/whitesource-ps/ws-ignore-alerts/commit/f6d200a42276eabb72eb0817c525335804f501ae">f6d200a42276eabb72eb0817c525335804f501ae</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Requests is a HTTP library. Since Requests 2.3.0, Requests has been leaking Proxy-Authorization headers to destination servers when redirected to an HTTPS endpoint. This is a product of how we use `rebuild_proxies` to reattach the `Proxy-Authorization` header to requests. For HTTP connections sent through the tunnel, the proxy will identify the header in the request itself and remove it prior to forwarding to the destination server. However when sent over HTTPS, the `Proxy-Authorization` header must be sent in the CONNECT request as the proxy has no visibility into the tunneled request. This results in Requests forwarding proxy credentials to the destination server unintentionally, allowing a malicious actor to potentially exfiltrate sensitive information. This issue has been patched in version 2.31.0. 
<p>Publish Date: 2023-05-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-32681>CVE-2023-32681</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.1</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-j8r2-6x86-q33q">https://github.com/advisories/GHSA-j8r2-6x86-q33q</a></p> <p>Release Date: 2023-05-26</p> <p>Fix Resolution: requests -2.31.0</p> </p> <p></p> </details>
True
requests-2.27.1-py2.py3-none-any.whl: 1 vulnerabilities (highest severity is: 6.1) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>requests-2.27.1-py2.py3-none-any.whl</b></p></summary> <p>Python HTTP for Humans.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl</a></p> <p> <p>Found in HEAD commit: <a href="https://github.com/whitesource-ps/ws-ignore-alerts/commit/f6d200a42276eabb72eb0817c525335804f501ae">f6d200a42276eabb72eb0817c525335804f501ae</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (requests version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2023-32681](https://www.mend.io/vulnerability-database/CVE-2023-32681) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Medium | 6.1 | requests-2.27.1-py2.py3-none-any.whl | Direct | requests -2.31.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' 
width=19 height=20> CVE-2023-32681</summary> ### Vulnerable Library - <b>requests-2.27.1-py2.py3-none-any.whl</b></p> <p>Python HTTP for Humans.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/2d/61/08076519c80041bc0ffa1a8af0cbd3bf3e2b62af10435d269a9d0f40564d/requests-2.27.1-py2.py3-none-any.whl</a></p> <p> Dependency Hierarchy: - :x: **requests-2.27.1-py2.py3-none-any.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/whitesource-ps/ws-ignore-alerts/commit/f6d200a42276eabb72eb0817c525335804f501ae">f6d200a42276eabb72eb0817c525335804f501ae</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Requests is a HTTP library. Since Requests 2.3.0, Requests has been leaking Proxy-Authorization headers to destination servers when redirected to an HTTPS endpoint. This is a product of how we use `rebuild_proxies` to reattach the `Proxy-Authorization` header to requests. For HTTP connections sent through the tunnel, the proxy will identify the header in the request itself and remove it prior to forwarding to the destination server. However when sent over HTTPS, the `Proxy-Authorization` header must be sent in the CONNECT request as the proxy has no visibility into the tunneled request. This results in Requests forwarding proxy credentials to the destination server unintentionally, allowing a malicious actor to potentially exfiltrate sensitive information. This issue has been patched in version 2.31.0. 
<p>Publish Date: 2023-05-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-32681>CVE-2023-32681</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.1</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-j8r2-6x86-q33q">https://github.com/advisories/GHSA-j8r2-6x86-q33q</a></p> <p>Release Date: 2023-05-26</p> <p>Fix Resolution: requests -2.31.0</p> </p> <p></p> </details>
non_priority
requests none any whl vulnerabilities highest severity is autoclosed vulnerable library requests none any whl python http for humans library home page a href found in head commit a href vulnerabilities cve severity cvss dependency type fixed in requests version remediation available medium requests none any whl direct requests details cve vulnerable library requests none any whl python http for humans library home page a href dependency hierarchy x requests none any whl vulnerable library found in head commit a href found in base branch master vulnerability details requests is a http library since requests requests has been leaking proxy authorization headers to destination servers when redirected to an https endpoint this is a product of how we use rebuild proxies to reattach the proxy authorization header to requests for http connections sent through the tunnel the proxy will identify the header in the request itself and remove it prior to forwarding to the destination server however when sent over https the proxy authorization header must be sent in the connect request as the proxy has no visibility into the tunneled request this results in requests forwarding proxy credentials to the destination server unintentionally allowing a malicious actor to potentially exfiltrate sensitive information this issue has been patched in version publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution requests
0
158,826
12,426,108,820
IssuesEvent
2020-05-24 19:32:06
FriendsOfREDAXO/tricks
https://api.github.com/repos/FriendsOfREDAXO/tricks
closed
Ansatz für ein simples Rewrite-Schema mit Parameter für rex_getUrl
question testing needed
Hintergrund: Ich will, dass in einer schönen Url verschiedene Get-Parameter auf dem einen Artikel abgebildet werden Thomas Blum [1 month ago] Du könntest dir da was selber umsetzen … Hänge dich an den `URL_REWRITE` und wandel dann die Params um. Thomas Blum [1 month ago] Das ist der Part wie es früher in R4 passierte https://github.com/redaxo/redaxo4/blob/24d34c7b95a9cb948dde5c5d630eb09d9ee3008a/redaxo/include/addons/url_rewrite/classes/class.rewrite_fullnames.inc.php#L110-L127 ``` // konvertiert params zu GET/REQUEST Variablen if($this->use_params_rewrite) { if(strstr($path,'/+/')) { $tmp = explode('/+/',$path); $path = $tmp[0].'/'; $vars = explode('/',$tmp[1]); for($c=0;$c<count($vars);$c+=2) { if($vars[$c]!='') { $_GET[$vars[$c]] = $vars[$c+1]; $_REQUEST[$vars[$c]] = $vars[$c+1]; } } } } ``` Cukabeka [1 month ago] Danke Dir - aber ich brauche ja 2 stellen, einmal bei rex_getUrl, damit die ++ in die URL reinkommen, und dann beim Interpretieren des Aufrufes.. Leider weiß ich das Snippet nicht so recht zu deuten, wie ich das damit hinbekommen könnte @thomas.blum Cukabeka [1 month ago] Und wohin damit dann? project addon als YREWRITE klasse? :thinking_face: Thomas Blum [1 month ago] Wieso Yrewrite? > aber ich brauche ja 2 stellen, einmal bei rex_getUrl Das ist der EP `URL_REWRITE` die zweite Stelle machst du wie das Url addon da hängst du dich an den EP `YREWRITE_PREPARE` vom yrewrite addon. der wird aufgerufen, falls es zu der url keinen artikel gibt. 
an der stelle gibst du dann ein array zurück [‘article_id’ => deine-id, ‘clang’ => deine-clang]; die restlichen params an der url setzt du dann in das $_GET Thomas Blum [1 month ago] das yrewrite addon selber brauchst du nicht zu erweitern Thomas Blum [1 month ago] `URL_REWRITE` > Url als `/artikel/key/value/` zurückgeben `YREWRITE_PREPARE` > die Url auflösen und dann article-id und clang-id zurückgeben, damit redaxo weiß welcher artikel aufgerufen werden soll Stefan Cukabeka [1 month ago] Danke Dir nochmal Thomas - ich merke, beim Thema EP habe ich null Ahnung. Wüsste nicht, wo ich jetzt mit den Infos starten sollte. Eine Klasse in den Project-Ordner legen? Thomas Blum [1 month ago] Ich schreib da nachher oder morgen mal was Thomas Blum [1 month ago] Das nachfolgende kannst du in die boot.php des project addons legen. Jetzt müsstest du das Auswerten und das Schreiben der Urls umsetzen ``` <?php \rex_extension::register('YREWRITE_PREPARE', function (\rex_extension_point $ep) { // aufgerufene Url holen und auswerten // wenn übergebene params enthalten sind, diese in $_GET und $_REQUEST speichern // diese stehen dann in deinen Skripten über rex_get() bzw. rex_request() zur Verfügung dump($ep); $articleId = null; $clangId = null; return ['article_id' => $articleId, 'clang' => $clangId]; }, \rex_extension::EARLY); rex_extension::register('PACKAGES_INCLUDED', function (\rex_extension_point $epPackagesIncluded) { rex_extension::register('URL_REWRITE', function (\rex_extension_point $ep) { // params auswerten // und deine Url /article/param-key/param-value/ zurückgeben dump($ep); $url = ''; return $url; }, rex_extension::EARLY); }, rex_extension::EARLY); ``` Noch nicht fertig, aber ein Ansatz, um dieses Thema zu lösen (und ggf in ein Addon zu stecken).
1.0
Ansatz für ein simples Rewrite-Schema mit Parameter für rex_getUrl - Hintergrund: Ich will, dass in einer schönen Url verschiedene Get-Parameter auf dem einen Artikel abgebildet werden Thomas Blum [1 month ago] Du könntest dir da was selber umsetzen … Hänge dich an den `URL_REWRITE` und wandel dann die Params um. Thomas Blum [1 month ago] Das ist der Part wie es früher in R4 passierte https://github.com/redaxo/redaxo4/blob/24d34c7b95a9cb948dde5c5d630eb09d9ee3008a/redaxo/include/addons/url_rewrite/classes/class.rewrite_fullnames.inc.php#L110-L127 ``` // konvertiert params zu GET/REQUEST Variablen if($this->use_params_rewrite) { if(strstr($path,'/+/')) { $tmp = explode('/+/',$path); $path = $tmp[0].'/'; $vars = explode('/',$tmp[1]); for($c=0;$c<count($vars);$c+=2) { if($vars[$c]!='') { $_GET[$vars[$c]] = $vars[$c+1]; $_REQUEST[$vars[$c]] = $vars[$c+1]; } } } } ``` Cukabeka [1 month ago] Danke Dir - aber ich brauche ja 2 stellen, einmal bei rex_getUrl, damit die ++ in die URL reinkommen, und dann beim Interpretieren des Aufrufes.. Leider weiß ich das Snippet nicht so recht zu deuten, wie ich das damit hinbekommen könnte @thomas.blum Cukabeka [1 month ago] Und wohin damit dann? project addon als YREWRITE klasse? :thinking_face: Thomas Blum [1 month ago] Wieso Yrewrite? > aber ich brauche ja 2 stellen, einmal bei rex_getUrl Das ist der EP `URL_REWRITE` die zweite Stelle machst du wie das Url addon da hängst du dich an den EP `YREWRITE_PREPARE` vom yrewrite addon. der wird aufgerufen, falls es zu der url keinen artikel gibt. 
an der stelle gibst du dann ein array zurück [‘article_id’ => deine-id, ‘clang’ => deine-clang]; die restlichen params an der url setzt du dann in das $_GET Thomas Blum [1 month ago] das yrewrite addon selber brauchst du nicht zu erweitern Thomas Blum [1 month ago] `URL_REWRITE` > Url als `/artikel/key/value/` zurückgeben `YREWRITE_PREPARE` > die Url auflösen und dann article-id und clang-id zurückgeben, damit redaxo weiß welcher artikel aufgerufen werden soll Stefan Cukabeka [1 month ago] Danke Dir nochmal Thomas - ich merke, beim Thema EP habe ich null Ahnung. Wüsste nicht, wo ich jetzt mit den Infos starten sollte. Eine Klasse in den Project-Ordner legen? Thomas Blum [1 month ago] Ich schreib da nachher oder morgen mal was Thomas Blum [1 month ago] Das nachfolgende kannst du in die boot.php des project addons legen. Jetzt müsstest du das Auswerten und das Schreiben der Urls umsetzen ``` <?php \rex_extension::register('YREWRITE_PREPARE', function (\rex_extension_point $ep) { // aufgerufene Url holen und auswerten // wenn übergebene params enthalten sind, diese in $_GET und $_REQUEST speichern // diese stehen dann in deinen Skripten über rex_get() bzw. rex_request() zur Verfügung dump($ep); $articleId = null; $clangId = null; return ['article_id' => $articleId, 'clang' => $clangId]; }, \rex_extension::EARLY); rex_extension::register('PACKAGES_INCLUDED', function (\rex_extension_point $epPackagesIncluded) { rex_extension::register('URL_REWRITE', function (\rex_extension_point $ep) { // params auswerten // und deine Url /article/param-key/param-value/ zurückgeben dump($ep); $url = ''; return $url; }, rex_extension::EARLY); }, rex_extension::EARLY); ``` Noch nicht fertig, aber ein Ansatz, um dieses Thema zu lösen (und ggf in ein Addon zu stecken).
non_priority
ansatz für ein simples rewrite schema mit parameter für rex geturl hintergrund ich will dass in einer schönen url verschiedene get parameter auf dem einen artikel abgebildet werden thomas blum du könntest dir da was selber umsetzen … hänge dich an den url rewrite und wandel dann die params um thomas blum das ist der part wie es früher in passierte konvertiert params zu get request variablen if this use params rewrite if strstr path tmp explode path path tmp vars explode tmp for c c count vars c if vars get vars request vars cukabeka danke dir aber ich brauche ja stellen einmal bei rex geturl damit die in die url reinkommen und dann beim interpretieren des aufrufes leider weiß ich das snippet nicht so recht zu deuten wie ich das damit hinbekommen könnte thomas blum cukabeka und wohin damit dann project addon als yrewrite klasse thinking face thomas blum wieso yrewrite aber ich brauche ja stellen einmal bei rex geturl das ist der ep url rewrite die zweite stelle machst du wie das url addon da hängst du dich an den ep yrewrite prepare vom yrewrite addon der wird aufgerufen falls es zu der url keinen artikel gibt an der stelle gibst du dann ein array zurück die restlichen params an der url setzt du dann in das get thomas blum das yrewrite addon selber brauchst du nicht zu erweitern thomas blum url rewrite url als artikel key value zurückgeben yrewrite prepare die url auflösen und dann article id und clang id zurückgeben damit redaxo weiß welcher artikel aufgerufen werden soll stefan cukabeka danke dir nochmal thomas ich merke beim thema ep habe ich null ahnung wüsste nicht wo ich jetzt mit den infos starten sollte eine klasse in den project ordner legen thomas blum ich schreib da nachher oder morgen mal was thomas blum das nachfolgende kannst du in die boot php des project addons legen jetzt müsstest du das auswerten und das schreiben der urls umsetzen php rex extension register yrewrite prepare function rex extension point ep aufgerufene url holen und auswerten wenn 
übergebene params enthalten sind diese in get und request speichern diese stehen dann in deinen skripten über rex get bzw rex request zur verfügung dump ep articleid null clangid null return rex extension early rex extension register packages included function rex extension point eppackagesincluded rex extension register url rewrite function rex extension point ep params auswerten und deine url article param key param value zurückgeben dump ep url return url rex extension early rex extension early noch nicht fertig aber ein ansatz um dieses thema zu lösen und ggf in ein addon zu stecken
0
55,362
30,713,991,660
IssuesEvent
2023-07-27 11:49:36
davidhalter/jedi
https://api.github.com/repos/davidhalter/jedi
closed
iPython Jedi completion result timing depends sensitively on CWD
performance high-prio
At ipython/ipython#13866 I reported an issue in which completions inside an `astropy.units` object (`x.unit.[Tab]`) produce results in widely varying amounts of time, depending on the _current working directory_. If iPython is in a deep directory (like `~/`) Jedi actually crashes. Not-so-deep but significant (`~/code`) takes 30 or 40s, and a shallow directory at the end of the tree structure completes in about a second, with the same results. So it seems a static analysis of the entire directory tree, starting at iPython's current working directory, is occurring. It's not clear why. Reproduction recipe in the above. See also #1446.
True
iPython Jedi completion result timing depends sensitively on CWD - At ipython/ipython#13866 I reported an issue in which completions inside an `astropy.units` object (`x.unit.[Tab]`) produce results in widely varying amounts of time, depending on the _current working directory_. If iPython is in a deep directory (like `~/`) Jedi actually crashes. Not-so-deep but significant (`~/code`) takes 30 or 40s, and a shallow directory at the end of the tree structure completes in about a second, with the same results. So it seems a static analysis of the entire directory tree, starting at iPython's current working directory, is occurring. It's not clear why. Reproduction recipe in the above. See also #1446.
non_priority
ipython jedi completion result timing depends sensitively on cwd at ipython ipython i reported an issue in which completions inside an astropy units object x unit produce results in widely varying amounts of time depending on the current working directory if ipython is in a deep directory like jedi actually crashes not so deep but significant code takes or and a shallow directory at the end of the tree structure completes in about a second with the same results so it seems a static analysis of the entire directory tree starting at ipython s current working directory is occurring it s not clear why reproduction recipe in the above see also
0
169,937
14,236,906,175
IssuesEvent
2020-11-18 16:33:11
SAP/openui5
https://api.github.com/repos/SAP/openui5
closed
API reference: constructor of all View classes should be hidden
documentation in progress
## URL (minimal example if possible) * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.View#constructor * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.HTMLView#constructor * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.JSONView#constructor * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.XMLView#constructor According to View's API reference: > Applications should **not** call the constructor directly, but use one of the view factories instead, e.g. `View.create`. Accordingly, shouldn't they all apply [`@hideconstructor` JSDoc tag](https://jsdoc.app/tags-hideconstructor.html)? I've seen many times in the wild app developers using `new XMLView(/*...*/)` instead of `XMLView.create(/*...*/)`.
1.0
API reference: constructor of all View classes should be hidden - ## URL (minimal example if possible) * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.View#constructor * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.HTMLView#constructor * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.JSONView#constructor * https://openui5nightly.hana.ondemand.com/api/sap.ui.core.mvc.XMLView#constructor According to View's API reference: > Applications should **not** call the constructor directly, but use one of the view factories instead, e.g. `View.create`. Accordingly, shouldn't they all apply [`@hideconstructor` JSDoc tag](https://jsdoc.app/tags-hideconstructor.html)? I've seen many times in the wild app developers using `new XMLView(/*...*/)` instead of `XMLView.create(/*...*/)`.
non_priority
api reference constructor of all view classes should be hidden url minimal example if possible according to view s api reference applications should not call the constructor directly but use one of the view factories instead e g view create accordingly shouldn t they all apply i ve seen many times in the wild app developers using new xmlview instead of xmlview create
0
2,865
3,671,962,832
IssuesEvent
2016-02-22 10:19:17
owncloud/client
https://api.github.com/repos/owncloud/client
closed
Client "exploring" unselected folders
Performance
Hi, I am using version 2.0.1 of the windows sync client and I am syncing my lightroom catalogue to a local OC server. Lightroom creates previews of every photo you have in your catalogue and saves them in a lot of directories. In my case these are 15 GB in 52'000 directories holding 120'000 files. I have deselected these subfolders from the sync, but the client is always exploring them. As this process is quite CPU intensive and takes a while the complete, all other folders are not synced in time any more. Please let me know if you need more information.
True
Client "exploring" unselected folders - Hi, I am using version 2.0.1 of the windows sync client and I am syncing my lightroom catalogue to a local OC server. Lightroom creates previews of every photo you have in your catalogue and saves them in a lot of directories. In my case these are 15 GB in 52'000 directories holding 120'000 files. I have deselected these subfolders from the sync, but the client is always exploring them. As this process is quite CPU intensive and takes a while the complete, all other folders are not synced in time any more. Please let me know if you need more information.
non_priority
client exploring unselected folders hi i am using version of the windows sync client and i am syncing my lightroom catalogue to a local oc server lightroom creates previews of every photo you have in your catalogue and saves them in a lot of directories in my case these are gb in directories holding files i have deselected these subfolders from the sync but the client is always exploring them as this process is quite cpu intensive and takes a while the complete all other folders are not synced in time any more please let me know if you need more information
0
22,555
15,269,210,596
IssuesEvent
2021-02-22 12:29:00
conan-io/conan-center-index
https://api.github.com/repos/conan-io/conan-center-index
closed
[service] Build packages for Clang 10
infrastructure
Now where https://github.com/conan-io/conan-docker-tools/pull/193 is merged, the service should generate packages for Clang 10. If added, the wiki should be updated as well: https://github.com/conan-io/conan-center-index/wiki/Supported-Platforms-And-Configurations
1.0
[service] Build packages for Clang 10 - Now where https://github.com/conan-io/conan-docker-tools/pull/193 is merged, the service should generate packages for Clang 10. If added, the wiki should be updated as well: https://github.com/conan-io/conan-center-index/wiki/Supported-Platforms-And-Configurations
non_priority
build packages for clang now where is merged the service should generate packages for clang if added the wiki should be updated as well
0
196,093
14,798,810,687
IssuesEvent
2021-01-13 00:42:22
Slimefun/Slimefun4
https://api.github.com/repos/Slimefun/Slimefun4
closed
LiteXpansion metal forge issues when crafting.
🎯 Needs testing 🐞 Bug Report
<!-- FILL IN THE FORM BELOW --> ## :round_pushpin: Description (REQUIRED) <!-- A clear and detailed description of what went wrong. --> <!-- The more information you can provide, the easier we can handle this problem. --> <!-- Start writing below this line --> When attempting to make Mixed Metal Ingot in the metal forge from LiteXpansion WITHOUT using an output chest, Each of these combinations give the message 'Sorry, my Inventory is too full!' message: ![image](https://user-images.githubusercontent.com/58162831/104387065-fc383a80-54fb-11eb-9591-3c071c76eeee.png) If an output chest IS used, the Mixed metal ingot IS created and placed into the output chest properly however, the forge uses three ingots from each of the left side slots as seen in these three examples: ![image](https://user-images.githubusercontent.com/58162831/104389528-4c65cb80-5501-11eb-9b1b-5bd6beaf44ab.png) If THIS configuration is used: ![image](https://user-images.githubusercontent.com/58162831/104390151-a74bf280-5502-11eb-9ac1-f30eec00b47b.png) Then this is the result: ![image](https://user-images.githubusercontent.com/58162831/104390204-c3e82a80-5502-11eb-8507-ac3efef9c95e.png) Which shows that the forge is consuming ingots from the left side first, instead of uniformly. ## :bookmark_tabs: Steps to reproduce the Issue (REQUIRED) <!-- Tell us the exact steps to reproduce this issue, the more detailed the easier we can reproduce it. --> <!-- Youtube Videos and Screenshots are recommended!!! --> <!-- Start writing below this line --> 1. Place the ingots for the recipe 'Mixed Metal Ingot' into the metal forge. 2. Click the forge to create the mixed metal ingot. 3. if no output chest is used, you will get the 'Sorry, my Inventory is too full!' message. 4. The mixed metal ingot will be created if an output chest is used, but the consumed ingots will vary depending on the amount used in each slot. 
When I first encountered this issue, I happened to be using 3 of each ingot in each slot and Became concerned that it was simply consuming ALL of the ingots from the slots on the left side, but, after testing this thoroughly, I've realized this is mostly a cosmetic issue, since the proper amount of each ingredient are being consumed, but, I went though the trouble to document my findings that I'll go ahead and just submit it so that I don't feel like I wasted my afternoon. Thank you. ## :bulb: Expected behavior (REQUIRED) <!-- What were you expecting to happen? --> <!-- What do you think would have been the correct behaviour? --> <!-- Start writing below this line --> If I used this recipe without an output chest and one of the slots had a single ingot, I would expect the resulting mixed metal ingot to be placed into that slot when crafted, instead of the inventory too full message. If I were using the output chest, I would expect the mixed metal ingot to be placed into the chest, but with only a single ingot ingredient taken from each slot. -Perhaps the fact that this Multiblock machine requires a diamond block as 'fuel' for each combination has something to with this. ## :scroll: Server Log <!-- Take a look at your Server Log and post any errors you can find via https://pastebin.com/ --> <!-- If you are unsure about it, post your full log, you can find it under /logs/latest.log --> <!-- Paste your link(s) below this line --> No access to server logs. ## :open_file_folder: /error-reports/ Folder <!-- Check the folder /plugins/Slimefun/error-reports/ and upload any files inside that folder. --> <!-- You can also post these files via https://pastebin.com/ --> <!-- Paste your link(s) below this line --> No access to the error reports folder. ## :compass: Environment (REQUIRED) <!-- Any info without the exact version numbers will be closed! --> <!-- "latest" IS NOT A VERSION NUMBER. --> <!-- We recommend running "/sf versions" and showing us a screenshot of that. 
--> <!-- Make sure that the screenshot covers the entire output of that command. --> <!-- If your issue is related to other plugins, make sure to include the versions of these plugins too! --> I do not have the permissions to use the /sf versions command, however, I have tried this on two different servers with the same results on both. ![image](https://user-images.githubusercontent.com/58162831/104386107-0ce7b100-54fa-11eb-8ff2-6af88f45c9f0.png)
1.0
LiteXpansion metal forge issues when crafting. - <!-- FILL IN THE FORM BELOW --> ## :round_pushpin: Description (REQUIRED) <!-- A clear and detailed description of what went wrong. --> <!-- The more information you can provide, the easier we can handle this problem. --> <!-- Start writing below this line --> When attempting to make Mixed Metal Ingot in the metal forge from LiteXpansion WITHOUT using an output chest, Each of these combinations give the message 'Sorry, my Inventory is too full!' message: ![image](https://user-images.githubusercontent.com/58162831/104387065-fc383a80-54fb-11eb-9591-3c071c76eeee.png) If an output chest IS used, the Mixed metal ingot IS created and placed into the output chest properly however, the forge uses three ingots from each of the left side slots as seen in these three examples: ![image](https://user-images.githubusercontent.com/58162831/104389528-4c65cb80-5501-11eb-9b1b-5bd6beaf44ab.png) If THIS configuration is used: ![image](https://user-images.githubusercontent.com/58162831/104390151-a74bf280-5502-11eb-9ac1-f30eec00b47b.png) Then this is the result: ![image](https://user-images.githubusercontent.com/58162831/104390204-c3e82a80-5502-11eb-8507-ac3efef9c95e.png) Which shows that the forge is consuming ingots from the left side first, instead of uniformly. ## :bookmark_tabs: Steps to reproduce the Issue (REQUIRED) <!-- Tell us the exact steps to reproduce this issue, the more detailed the easier we can reproduce it. --> <!-- Youtube Videos and Screenshots are recommended!!! --> <!-- Start writing below this line --> 1. Place the ingots for the recipe 'Mixed Metal Ingot' into the metal forge. 2. Click the forge to create the mixed metal ingot. 3. if no output chest is used, you will get the 'Sorry, my Inventory is too full!' message. 4. The mixed metal ingot will be created if an output chest is used, but the consumed ingots will vary depending on the amount used in each slot. 
When I first encountered this issue, I happened to be using 3 of each ingot in each slot and Became concerned that it was simply consuming ALL of the ingots from the slots on the left side, but, after testing this thoroughly, I've realized this is mostly a cosmetic issue, since the proper amount of each ingredient are being consumed, but, I went though the trouble to document my findings that I'll go ahead and just submit it so that I don't feel like I wasted my afternoon. Thank you. ## :bulb: Expected behavior (REQUIRED) <!-- What were you expecting to happen? --> <!-- What do you think would have been the correct behaviour? --> <!-- Start writing below this line --> If I used this recipe without an output chest and one of the slots had a single ingot, I would expect the resulting mixed metal ingot to be placed into that slot when crafted, instead of the inventory too full message. If I were using the output chest, I would expect the mixed metal ingot to be placed into the chest, but with only a single ingot ingredient taken from each slot. -Perhaps the fact that this Multiblock machine requires a diamond block as 'fuel' for each combination has something to with this. ## :scroll: Server Log <!-- Take a look at your Server Log and post any errors you can find via https://pastebin.com/ --> <!-- If you are unsure about it, post your full log, you can find it under /logs/latest.log --> <!-- Paste your link(s) below this line --> No access to server logs. ## :open_file_folder: /error-reports/ Folder <!-- Check the folder /plugins/Slimefun/error-reports/ and upload any files inside that folder. --> <!-- You can also post these files via https://pastebin.com/ --> <!-- Paste your link(s) below this line --> No access to the error reports folder. ## :compass: Environment (REQUIRED) <!-- Any info without the exact version numbers will be closed! --> <!-- "latest" IS NOT A VERSION NUMBER. --> <!-- We recommend running "/sf versions" and showing us a screenshot of that. 
--> <!-- Make sure that the screenshot covers the entire output of that command. --> <!-- If your issue is related to other plugins, make sure to include the versions of these plugins too! --> I do not have the permissions to use the /sf versions command, however, I have tried this on two different servers with the same results on both. ![image](https://user-images.githubusercontent.com/58162831/104386107-0ce7b100-54fa-11eb-8ff2-6af88f45c9f0.png)
non_priority
litexpansion metal forge issues when crafting round pushpin description required when attempting to make mixed metal ingot in the metal forge from litexpansion without using an output chest each of these combinations give the message sorry my inventory is too full message if an output chest is used the mixed metal ingot is created and placed into the output chest properly however the forge uses three ingots from each of the left side slots as seen in these three examples if this configuration is used then this is the result which shows that the forge is consuming ingots from the left side first instead of uniformly bookmark tabs steps to reproduce the issue required place the ingots for the recipe mixed metal ingot into the metal forge click the forge to create the mixed metal ingot if no output chest is used you will get the sorry my inventory is too full message the mixed metal ingot will be created if an output chest is used but the consumed ingots will vary depending on the amount used in each slot when i first encountered this issue i happened to be using of each ingot in each slot and became concerned that it was simply consuming all of the ingots from the slots on the left side but after testing this thoroughly i ve realized this is mostly a cosmetic issue since the proper amount of each ingredient are being consumed but i went though the trouble to document my findings that i ll go ahead and just submit it so that i don t feel like i wasted my afternoon thank you bulb expected behavior required if i used this recipe without an output chest and one of the slots had a single ingot i would expect the resulting mixed metal ingot to be placed into that slot when crafted instead of the inventory too full message if i were using the output chest i would expect the mixed metal ingot to be placed into the chest but with only a single ingot ingredient taken from each slot perhaps the fact that this multiblock machine requires a diamond block as fuel for each 
combination has something to with this scroll server log no access to server logs open file folder error reports folder no access to the error reports folder compass environment required i do not have the permissions to use the sf versions command however i have tried this on two different servers with the same results on both
0
126,244
26,809,316,877
IssuesEvent
2023-02-01 20:55:43
appsmithorg/appsmith
https://api.github.com/repos/appsmithorg/appsmith
closed
[Bug]: Custom JS Lib return 401 error for anonymous user
Bug High Needs Triaging Mongo BE Coders Pod Integrations Pod Custom JS Libraries
### Is there an existing issue for this? - [X] I have searched the existing issues ### Description https://theappsmith.slack.com/archives/CGBPVEJ5C/p1674729997721399 ### Steps To Reproduce https://theappsmith.slack.com/archives/CGBPVEJ5C/p1674729997721399 ### Public Sample App _No response_ ### Issue video log _No response_ ### Version cloud, self hosted
1.0
[Bug]: Custom JS Lib return 401 error for anonymous user - ### Is there an existing issue for this? - [X] I have searched the existing issues ### Description https://theappsmith.slack.com/archives/CGBPVEJ5C/p1674729997721399 ### Steps To Reproduce https://theappsmith.slack.com/archives/CGBPVEJ5C/p1674729997721399 ### Public Sample App _No response_ ### Issue video log _No response_ ### Version cloud, self hosted
non_priority
custom js lib return error for anonymous user is there an existing issue for this i have searched the existing issues description steps to reproduce public sample app no response issue video log no response version cloud self hosted
0
91,504
10,721,988,189
IssuesEvent
2019-10-27 08:22:14
swsnu/swpp2019-team12
https://api.github.com/repos/swsnu/swpp2019-team12
opened
API 수정사항
Backend documentation
1. /profile/:id/ 추가 - GET: 특정 프로필 반환 - PATCH: 특정 프로필 정보 수정 2. /user/:id/ 삭제 - /profile/:id/로 통합
1.0
API 수정사항 - 1. /profile/:id/ 추가 - GET: 특정 프로필 반환 - PATCH: 특정 프로필 정보 수정 2. /user/:id/ 삭제 - /profile/:id/로 통합
non_priority
api 수정사항 profile id 추가 get 특정 프로필 반환 patch 특정 프로필 정보 수정 user id 삭제 profile id 로 통합
0
18,543
10,253,218,649
IssuesEvent
2019-08-21 10:44:39
HumanCellAtlas/ingest-central
https://api.github.com/repos/HumanCellAtlas/ingest-central
closed
NGINX HTTP/2 vulnerability
security
This was [brought to our attention](https://humancellatlas.slack.com/archives/GBCT6QECF/p1564063446022500) by the security team. The NGINX version Ingest uses as ingress controller has a known vulnerability with its implementation of HTTP/2. Any version after 1.15.6 is said to have already the fix for this.
True
NGINX HTTP/2 vulnerability - This was [brought to our attention](https://humancellatlas.slack.com/archives/GBCT6QECF/p1564063446022500) by the security team. The NGINX version Ingest uses as ingress controller has a known vulnerability with its implementation of HTTP/2. Any version after 1.15.6 is said to have already the fix for this.
non_priority
nginx http vulnerability this was by the security team the nginx version ingest uses as ingress controller has a known vulnerability with its implementation of http any version after is said to have already the fix for this
0
412,842
27,876,351,788
IssuesEvent
2023-03-21 16:15:12
xarray-contrib/xpublish
https://api.github.com/repos/xarray-contrib/xpublish
closed
Add methods to API auto-documentation
documentation
Right now methods like `Rest.register_plugin()` don't get full documentation, and only the summary shows up (first line of the doc string). I noticed this when looking for the expanded docstring from the changes in #158 . One method would be to explicitly add these methods to `api.rst` the same way `Rest.serve()` is. Then they will show with summary on both the [`Rest` class page](https://xpublish.readthedocs.io/en/latest/generated/xpublish.Rest.html), and have their own subpage ([`Rest.serve`](https://xpublish.readthedocs.io/en/latest/generated/xpublish.Rest.serve.html#xpublish.Rest.serve)). https://github.com/xarray-contrib/xpublish/blob/8a2e5d3482e54b426b8371c23b333d96deae6541/docs/source/api.rst?plain=1#L13-L19
1.0
Add methods to API auto-documentation - Right now methods like `Rest.register_plugin()` don't get full documentation, and only the summary shows up (first line of the doc string). I noticed this when looking for the expanded docstring from the changes in #158 . One method would be to explicitly add these methods to `api.rst` the same way `Rest.serve()` is. Then they will show with summary on both the [`Rest` class page](https://xpublish.readthedocs.io/en/latest/generated/xpublish.Rest.html), and have their own subpage ([`Rest.serve`](https://xpublish.readthedocs.io/en/latest/generated/xpublish.Rest.serve.html#xpublish.Rest.serve)). https://github.com/xarray-contrib/xpublish/blob/8a2e5d3482e54b426b8371c23b333d96deae6541/docs/source/api.rst?plain=1#L13-L19
non_priority
add methods to api auto documentation right now methods like rest register plugin don t get full documentation and only the summary shows up first line of the doc string i noticed this when looking for the expanded docstring from the changes in one method would be to explicitly add these methods to api rst the same way rest serve is then they will show with summary on both the and have their own subpage
0
123,277
10,261,689,094
IssuesEvent
2019-08-22 10:33:22
viszerale-therapie/vt.at-drupal
https://api.github.com/repos/viszerale-therapie/vt.at-drupal
closed
Menüveränderungen css
ready for test
*Sent by Ursula Feuerherdt. Created by [fire](https://fire.fundersclub.com/).* --- Lieber Andreas, wir haben jetzt auch gleich noch ein paar Usability-Optimierungen geplant. Könntest Du die bitte erledigen laut Screenshots anbei. Wenn was unklar ist melde Dich bitte. Vielen lieben Dank! Liebe Grüße, Ursula ![](https://firebot-prod-media.s3.amazonaws.com:443/email-attachments/5cd1fae2-09eb-40f5-83fc-adcb502c59c8/Left_Menu_style.jpg)![](https://firebot-prod-media.s3.amazonaws.com:443/email-attachments/5cd1fae2-09eb-40f5-83fc-adcb502c59c8/Left_Menu_style2.jpg)![](https://firebot-prod-media.s3.amazonaws.com:443/email-attachments/5cd1fae2-09eb-40f5-83fc-adcb502c59c8/Main_Menu_style.jpg)
1.0
Menüveränderungen css - *Sent by Ursula Feuerherdt. Created by [fire](https://fire.fundersclub.com/).* --- Lieber Andreas, wir haben jetzt auch gleich noch ein paar Usability-Optimierungen geplant. Könntest Du die bitte erledigen laut Screenshots anbei. Wenn was unklar ist melde Dich bitte. Vielen lieben Dank! Liebe Grüße, Ursula ![](https://firebot-prod-media.s3.amazonaws.com:443/email-attachments/5cd1fae2-09eb-40f5-83fc-adcb502c59c8/Left_Menu_style.jpg)![](https://firebot-prod-media.s3.amazonaws.com:443/email-attachments/5cd1fae2-09eb-40f5-83fc-adcb502c59c8/Left_Menu_style2.jpg)![](https://firebot-prod-media.s3.amazonaws.com:443/email-attachments/5cd1fae2-09eb-40f5-83fc-adcb502c59c8/Main_Menu_style.jpg)
non_priority
menüveränderungen css sent by ursula feuerherdt created by lieber andreas wir haben jetzt auch gleich noch ein paar usability optimierungen geplant könntest du die bitte erledigen laut screenshots anbei wenn was unklar ist melde dich bitte vielen lieben dank liebe grüße ursula
0
350,385
31,883,286,854
IssuesEvent
2023-09-16 16:39:33
lake-wg/edhoc
https://api.github.com/repos/lake-wg/edhoc
closed
Difference betwen Test Vector 1 and Test Vector 2
Close? test vectors
What is the difference between the two test vectors - Vector 1 and Vector 2 in https://github.com/lake-wg/edhoc/blob/master/test-vectors-15/vectors-p256.txt? Does the value of COSE header vary and thus the sizes of all the EDHOC messages?
1.0
Difference betwen Test Vector 1 and Test Vector 2 - What is the difference between the two test vectors - Vector 1 and Vector 2 in https://github.com/lake-wg/edhoc/blob/master/test-vectors-15/vectors-p256.txt? Does the value of COSE header vary and thus the sizes of all the EDHOC messages?
non_priority
difference betwen test vector and test vector what is the difference between the two test vectors vector and vector in does the value of cose header vary and thus the sizes of all the edhoc messages
0
96,644
10,958,417,736
IssuesEvent
2019-11-27 09:22:05
Jogans/Gruppe1Semester4
https://api.github.com/repos/Jogans/Gruppe1Semester4
opened
Overfører / oprette et statisk klassediagram
documentation
Her tænkes at lave et stort klassediagram, som kombinere de funde klasser
1.0
Overfører / oprette et statisk klassediagram - Her tænkes at lave et stort klassediagram, som kombinere de funde klasser
non_priority
overfører oprette et statisk klassediagram her tænkes at lave et stort klassediagram som kombinere de funde klasser
0
139,769
12,879,457,858
IssuesEvent
2020-07-11 22:20:24
rrousselGit/river_pod
https://api.github.com/repos/rrousselGit/river_pod
closed
example for StreamProvider
documentation
I would like to see a more detailed example with StreamProvider in the docs. I think in the flutter / firebase context, a really nice example would be the equivalent of this from the provider way of things: ``` StreamProvider<FirebaseUser>.value( value: FirebaseAuth.instance.onAuthStateChanged); ``` translated to River Pod, it might look like this, but i could be wrong (since i am trying to clean up here, its not 1:1 translation as is): ``` final firebaseAuthProvider = StreamProvider<FirebaseUser>((ref) { ref.onDispose(() { // Closes the StreamController when the state of this provider is destroyed. FirebaseAuth.instance.signOut(); }); return FirebaseAuth.instance.onAuthStateChanged; }); ``` Please correct me if i am wrong on this.
1.0
example for StreamProvider - I would like to see a more detailed example with StreamProvider in the docs. I think in the flutter / firebase context, a really nice example would be the equivalent of this from the provider way of things: ``` StreamProvider<FirebaseUser>.value( value: FirebaseAuth.instance.onAuthStateChanged); ``` translated to River Pod, it might look like this, but i could be wrong (since i am trying to clean up here, its not 1:1 translation as is): ``` final firebaseAuthProvider = StreamProvider<FirebaseUser>((ref) { ref.onDispose(() { // Closes the StreamController when the state of this provider is destroyed. FirebaseAuth.instance.signOut(); }); return FirebaseAuth.instance.onAuthStateChanged; }); ``` Please correct me if i am wrong on this.
non_priority
example for streamprovider i would like to see a more detailed example with streamprovider in the docs i think in the flutter firebase context a really nice example would be the equivalent of this from the provider way of things streamprovider value value firebaseauth instance onauthstatechanged translated to river pod it might look like this but i could be wrong since i am trying to clean up here its not translation as is final firebaseauthprovider streamprovider ref ref ondispose closes the streamcontroller when the state of this provider is destroyed firebaseauth instance signout return firebaseauth instance onauthstatechanged please correct me if i am wrong on this
0
229,764
25,368,496,330
IssuesEvent
2022-11-21 08:47:36
jmservera/opendonita-fork
https://api.github.com/repos/jmservera/opendonita-fork
closed
Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl: 1 vulnerabilities (highest severity is: 7.5) - autoclosed
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</b></p></summary> <p>Python Imaging Library (Fork)</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl">https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/jmservera/opendonita-fork/commit/1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66">1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (Pillow version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-45199](https://www.mend.io/vulnerability-database/CVE-2022-45199) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | Direct | Pillow - 9.3.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-45199</summary> ### Vulnerable Library - <b>Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</b></p> <p>Python Imaging Library (Fork)</p> <p>Library home page: <a 
href="https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl">https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/jmservera/opendonita-fork/commit/1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66">1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Pillow before 9.3.0 allows denial of service via SAMPLESPERPIXEL. <p>Publish Date: 2022-11-14 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-45199>CVE-2022-45199</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-11-14</p> <p>Fix Resolution: Pillow - 9.3.0</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
True
Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl: 1 vulnerabilities (highest severity is: 7.5) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</b></p></summary> <p>Python Imaging Library (Fork)</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl">https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/jmservera/opendonita-fork/commit/1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66">1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (Pillow version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-45199](https://www.mend.io/vulnerability-database/CVE-2022-45199) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | Direct | Pillow - 9.3.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-45199</summary> ### Vulnerable Library - <b>Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</b></p> <p>Python Imaging Library (Fork)</p> 
<p>Library home page: <a href="https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl">https://files.pythonhosted.org/packages/86/d2/ca178ad71dcd1dcddbe2a3f7983639d2f8a20e723d9a978ab978ed08c874/Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/jmservera/opendonita-fork/commit/1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66">1abe7c53b187ec5c5e16cd37e058b4e62ea6cd66</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Pillow before 9.3.0 allows denial of service via SAMPLESPERPIXEL. <p>Publish Date: 2022-11-14 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-45199>CVE-2022-45199</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-11-14</p> <p>Fix Resolution: Pillow - 9.3.0</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
non_priority
pillow manylinux whl vulnerabilities highest severity is autoclosed vulnerable library pillow manylinux whl python imaging library fork library home page a href path to dependency file requirements txt path to vulnerable library requirements txt found in head commit a href vulnerabilities cve severity cvss dependency type fixed in pillow version remediation available high pillow manylinux whl direct pillow details cve vulnerable library pillow manylinux whl python imaging library fork library home page a href path to dependency file requirements txt path to vulnerable library requirements txt dependency hierarchy x pillow manylinux whl vulnerable library found in head commit a href found in base branch master vulnerability details pillow before allows denial of service via samplesperpixel publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution pillow step up your open source security game with mend
0
207,366
23,441,862,263
IssuesEvent
2022-08-15 15:36:51
KaterinaOrg/WebGoat
https://api.github.com/repos/KaterinaOrg/WebGoat
opened
CVE-2021-42550 (Medium) detected in logback-classic-1.2.3.jar, logback-core-1.2.3.jar
security vulnerability
## CVE-2021-42550 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>logback-classic-1.2.3.jar</b>, <b>logback-core-1.2.3.jar</b></p></summary> <p> <details><summary><b>logback-classic-1.2.3.jar</b></p></summary> <p>logback-classic module</p> <p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: /webgoat-lessons/webgoat-introduction/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classi
c-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-validation-2.4.3.jar (Root Library) - spring-boot-starter-2.4.3.jar - spring-boot-starter-logging-2.4.3.jar - :x: **logback-classic-1.2.3.jar** (Vulnerable Library) </details> <details><summary><b>logback-core-1.2.3.jar</b></p></summary> <p>logback-core module</p> <p>Library home page: <a 
href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: /webgoat-lessons/auth-bypass/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/l
ogback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-validation-2.4.3.jar (Root Library) - spring-boot-starter-2.4.3.jar - spring-boot-starter-logging-2.4.3.jar - logback-classic-1.2.3.jar - :x: **logback-core-1.2.3.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/KaterinaOrg/WebGoat/commit/f18e43fbc2d56c28b38b6d440d202f7327efd240">f18e43fbc2d56c28b38b6d440d202f7327efd240</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers. 
<p>Publish Date: 2021-12-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550>CVE-2021-42550</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550</a></p> <p>Release Date: 2021-12-16</p> <p>Fix Resolution (ch.qos.logback:logback-classic): 1.2.8</p> <p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.5.8</p><p>Fix Resolution (ch.qos.logback:logback-core): 1.2.8</p> <p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-validation): 2.5.8</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue
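The suggested fix in the report above amounts to a version bump in the affected pom.xml. A minimal sketch of how that could look, assuming a standard Maven project layout (the coordinates and versions are taken directly from the report's "Fix Resolution" lines; everything else is illustrative):

```xml
<!-- Option 1: bump the direct dependency named in the report, which pulls
     in logback-classic/logback-core >= 1.2.8 transitively. -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-validation</artifactId>
    <version>2.5.8</version>
</dependency>

<!-- Option 2: pin the vulnerable transitive artifacts explicitly. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.2.8</version>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-core</artifactId>
            <version>1.2.8</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Either approach should satisfy the scanner, since the fixed version for CVE-2021-42550 is logback 1.2.8 in both the classic and core modules.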
True
non_priority
0
130,446
10,616,223,327
IssuesEvent
2019-10-12 10:05:33
rust-lang/rust
https://api.github.com/repos/rust-lang/rust
closed
Returning Self in impl Trait does not resolve member types (+ Compiler Crash)
C-bug E-needstest I-ICE T-compiler
A code sample demonstrating the problem: ``` trait T { type T; } impl T for i32 { type T = u32; } struct S<A> { a: A, } /*// Works impl From<u32> for S<<i32 as T>::T> { fn from(a: u32) -> S<<i32 as T>::T> { S::<<i32 as T>::T> {a} } }*/ /*// Works impl From<u32> for S<<i32 as T>::T> { fn from(a: u32) -> Self { S::<<i32 as T>::T> {a} } }*/ /*// Fails impl From<u32> for S<<i32 as T>::T> { fn from(a: u32) -> Self { Self {a} } }*/ // Fails impl From<u32> for S<<i32 as T>::T> { fn from(a: u32) -> S<<i32 as T>::T> { Self {a} } } fn main() { } ``` [Playground](https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=2928e352133d80b14a80b74fe2f81091) The error: ```error[E0308]: mismatched types --> src/main.rs:37:9 | 36 | fn from(a: u32) -> S<<i32 as T>::T> { | ---------------- expected `S<u32>` because of return type 37 | Self {a} | ^^^^^^^^ expected u32, found associated type | = note: expected type `S<u32>` found type `S<<i32 as T>::T>` ``` The Observed behaviour occurs on stable 1.31.1. I would expect this to compile, as `Self` is just an alias for `S<<i32 as T>::T>` in this case. 
On the current nightly (`nightly-x86_64-unknown-linux-gnu unchanged - rustc 1.33.0-nightly (b92552d55 2019-01-06)`), using the same construction in a more complicated environment leads to a compiler crash: ```error: internal compiler error: broken MIR in DefId(0/0:390 ~ proof_of_concept[a4c9]::graph[0]::td_cch[0]::{{impl}}[6]::from[0]) (_103 = graph::td_cch::TDCCH<<plf::PLFunctionBuilder as graph::WeightFunctionBuilder>::Weight, plf::PLFunctionBuilder> { source_nodes: move _104, first_in: move _105, first_out: move _106, target_nodes: move _107, out_to_in_edges: move _108, in_to_out_edges: move _109, weight_builders: move _110, weights: move _111 }): bad user type on rvalue (TypeOf(DefId(0/0:1848 ~ proof_of_concept[a4c9]::graph[0]::td_cch[0]::TDCCH[0]), UserSubsts { substs: [<plf::PLFunctionBuilder as graph::WeightFunctionBuilder>::Weight, plf::PLFunctionBuilder], user_self_ty: None }) = graph::td_cch::TDCCH<<plf::PLFunctionBuilder as graph::WeightFunctionBuilder>::Weight, plf::PLFunctionBuilder>): NoSolution --> src/graph/td_cch/mod.rs:973:26 | 973 | let mut result = Self { | __________________________^ 974 | | source_nodes, 975 | | first_in, 976 | | first_out, ... | 982 | | weights: functions, 983 | | }; | |_________^ error: internal compiler error: broken MIR in DefId(0/0:390 ~ proof_of_concept[a4c9]::graph[0]::td_cch[0]::{{impl}}[6]::from[0]) (_0 = move _103): bad assignment (graph::td_cch::TDCCH<plf::Point, plf::PLFunctionBuilder> = graph::td_cch::TDCCH<<plf::PLFunctionBuilder as graph::WeightFunctionBuilder>::Weight, plf::PLFunctionBuilder>): NoSolution --> src/graph/td_cch/mod.rs:984:9 | 984 | result | ^^^^^^ thread 'rustc' panicked at 'no errors encountered even though `delay_span_bug` issued', src/librustc_errors/lib.rs:324:17 note: Run with `RUST_BACKTRACE=1` environment variable to display a backtrace. error: internal compiler error: unexpected panic note: the compiler unexpectedly panicked. this is a bug. 
note: we would appreciate a bug report: https://github.com/rust-lang/rust/blob/master/CONTRIBUTING.md#bug-reports note: rustc 1.33.0-nightly (b92552d55 2019-01-06) running on x86_64-unknown-linux-gnu note: compiler flags: -C debuginfo=2 -C incremental -C target-cpu=native --crate-type bin note: some of the compiler flags provided by cargo are hidden error: Could not compile `proof_of_concept`. ``` I cannot disclose the full code, but here are some samples I think are relevant: The method around the crash: ``` impl From<EdgeList<Point, ()>> for PLTDCCH { fn from(edge_list: EdgeList<Point, ()>) -> Self { let edges = edge_list.edges; let functions = edge_list.weights; let mut source_nodes = vec![0; edges.len()]; let mut first_in = vec![0; edge_list.node_count as usize + 2]; let mut first_out = vec![0; edge_list.node_count as usize + 1]; let mut target_nodes = Vec::with_capacity(edges.len()); let mut weight_builders = Vec::with_capacity(edges.len()); for edge in &edges { first_in[edge.end as usize + 1] += 1; first_out[edge.start as usize] += 1; target_nodes.push(edge.end); weight_builders.push(PLFunctionBuilder::new(edge.weight_start, edge.weight_end)); } first_in.prefix_sum(); first_out.prefix_sum(); for edge in &edges { let index = &mut first_in[edge.end as usize + 1]; source_nodes[*index as usize] = edge.start; let weights = &functions[edge.weight_start as usize..edge.weight_end as usize]; *index += 1; } first_in.pop(); let mut result = Self { source_nodes, first_in, first_out, target_nodes, out_to_in_edges: Vec::new(), in_to_out_edges: Vec::new(), weight_builders: weight_builders, weights: functions, }; result } } ``` The PLTDCCH type: ``` #[derive(Clone, PartialEq, Eq, Serialize, Deserialize)] pub struct TDCCH<W, WFB> { source_nodes: Vec<NodeId>, first_in: Vec<EdgeId>, first_out: Vec<EdgeId>, target_nodes: Vec<NodeId>, out_to_in_edges: Vec<EdgeId>, in_to_out_edges: Vec<EdgeId>, weight_builders: Vec<WFB>, weights: Vec<W>, } pub type PLTDCCH = 
TDCCH<<PLFunctionBuilder as WeightFunctionBuilder>::Weight, PLFunctionBuilder>; ``` The WeightFunctionBuilder trait: ``` pub trait WeightFunctionBuilder: Link<Self> + Merge<Self> + Clone + Serialize + for<'de> Deserialize<'de> { type Weight: Clone + Debug; // [...] } ```
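For reference, a minimal standalone version of the workaround the report itself identifies: writing the concrete struct path in the function body instead of `Self`, so the associated-type projection `<i32 as T>::T` is normalized to `u32`. This mirrors the reporter's reduced sample (same trait, struct, and impl names); the `main` below is added only to make it runnable:

```rust
// Same reduced shape as in the report.
trait T {
    type T;
}

impl T for i32 {
    type T = u32;
}

struct S<A> {
    a: A,
}

impl From<u32> for S<<i32 as T>::T> {
    fn from(a: u32) -> Self {
        // Workaround from the report: name the concrete type here rather
        // than `Self`, so the projection `<i32 as T>::T` normalizes to u32.
        S::<<i32 as T>::T> { a }
    }
}

fn main() {
    let s: S<u32> = S::from(7u32);
    assert_eq!(s.a, 7);
}
```

This is the variant the reporter marked "Works" on stable 1.31.1; only the two variants that put `Self { a }` in the body failed to type-check (and, in the larger codebase, triggered the "broken MIR" ICE on nightly).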
1.0
non_priority
0
56,112
13,757,598,811
IssuesEvent
2020-10-06 21:58:28
spack/spack
https://api.github.com/repos/spack/spack
opened
Installation issue: atlas
build-error
From `UNKNOWN COMPILER '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gcc' for ICC: you must also supply flags!` It seems like atlas is refusing to accept spack provided compiler as `gcc` ... ### Steps to reproduce the issue ```console $ spack install atlas@3.11.41 ==> Installing atlas ==> No binary for atlas found: installing from source ==> Moving resource stage source : /tmp/clusterbuild/spack-stage/resource-lapack-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/ destination : /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-resource-lapack/lapack-3.5.0 ==> atlas: Executing phase: 'install' ==> Error: ProcessError: Command exited with status 2: 'make' 13 errors found in build log: 52 /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build/..//CONF IG/include/atlas_sys.h:224: warning: the use of `tmpnam' is dangerous, better use `mkstemp' 53 rm -f config1.out 54 make atlas_run atldir=/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-sr c/spack-build exe=xprobe_comp redir=config1.out \ 55 args="-v 0 -o atlconf.txt -O 1 -A 35 -Si nof77 0 -V 1968 -C ic '/srv/beegfs/cluster/home/clusterbuild/ spack/lib/spack/env/gcc/gcc' -C if '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gfortran' -b 64 -d b /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build" 56 make[1]: Entering directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/s pack-src/spack-build' 57 cd /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build ; ./x probe_comp -v 0 -o atlconf.txt -O 1 -A 35 -Si nof77 0 -V 1968 -C ic '/srv/beegfs/cluster/home/clusterbuild/spack/lib/s pack/env/gcc/gcc' -C if '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gfortran' -b 64 -d 
b /tmp/cluste rbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build > config1.out >> 58 find: '/opt/slurm/20.02/bin': No such file or directory >> 59 find: '/usr/users/clusterbuild/local': No such file or directory >> 60 find: '/opt/bin': No such file or directory >> 61 find: '/opt/sbin': No such file or directory 62 UNKNOWN COMPILER '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gcc' for ICC: you must also supply flag s! 63 Makefile:109: recipe for target 'atlas_run' failed >> 64 make[1]: *** [atlas_run] Error 1 65 make[1]: Leaving directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/sp ack-src/spack-build' 66 Makefile:123: recipe for target 'IRun_comp' failed >> 67 make: *** [IRun_comp] Error 2 68 ERROR 512 IN SYSCMND: 'make IRun_comp args="-v 0 -o atlconf.txt -O 1 -A 35 -Si nof77 0 -V 1968 -C ic '/srv/beegfs/clus ter/home/clusterbuild/spack/lib/spack/env/gcc/gcc' -C if '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc /gfortran' -b 64"' 69 70 OS configured as Linux (1) 71 72 Assembly configured as GAS_x8664 (2) 73 ... 105 cd interfaces/blas/F77 ; mkdir src testing 106 cd interfaces/lapack ; mkdir C2F 107 cd interfaces/lapack/C2F ; mkdir src 108 mkdir ARCHS 109 make -f Make.top startup 110 make[1]: Entering directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/s pack-src/spack-build' >> 111 Make.top:1: Make.inc: No such file or directory >> 112 make[1]: *** No rule to make target 'Make.inc'. Stop. 
113 make[1]: Leaving directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/sp ack-src/spack-build' 114 Makefile:560: recipe for target 'startup' failed >> 115 make: *** [startup] Error 2 >> 116 mv: cannot stat 'lib/Makefile': No such file or directory 117 ../configure: 573: ../configure: cannot create lib/Makefile: Directory nonexistent 118 ../configure: 574: ../configure: cannot create lib/Makefile: Directory nonexistent 119 ../configure: 575: ../configure: cannot create lib/Makefile: Directory nonexistent 120 ../configure: 576: ../configure: cannot create lib/Makefile: Directory nonexistent 121 ../configure: 580: ../configure: cannot create include/atlas_maxmalloc.h: Directory nonexistent 122 ../configure: 585: ../configure: cannot create include/atlas_maxmalloc.h: Directory nonexistent ... 127 ../configure: 643: ../configure: cannot create lib/Makefile: Directory nonexistent 128 ../configure: 661: ../configure: cannot create /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmu chd3ebxixecv3znu/spack-src/spack-build/tune/threads/res/aff.h: Directory nonexistent 129 DONE configure 130 ==> [2020-10-06-23:44:30.540475] 'make' 131 make -f Make.top build 132 make[1]: Entering directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/s pack-src/spack-build' >> 133 Make.top:1: Make.inc: No such file or directory >> 134 make[1]: *** No rule to make target 'Make.inc'. Stop. 135 make[1]: Leaving directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/sp ack-src/spack-build' 136 Makefile:555: recipe for target 'build' failed >> 137 make: *** [build] Error 2 See build log for details: /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-build-out.txt ... 
``` ### Information on your system <!-- Please include the output of `spack debug report` --> ```console clusterbuild@r3d3:~/local_modules/Modules$ spack debug report * **Spack:** 0.15.4 * **Python:** 3.6.9 * **Platform:** linux-ubuntu18.04-cascadelake ``` ### Additional information * [spack-build-out.txt](https://github.com/spack/spack/files/5337026/spack-build-out.txt) * [spack-build-env.txt](https://github.com/spack/spack/files/5337027/spack-build-env.txt)
1.0
Installation issue: atlas - From `UNKNOWN COMPILER '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gcc' for ICC: you must also supply flags!` It seems like atlas is refusing to accept spack provided compiler as `gcc` ... ### Steps to reproduce the issue ```console $ spack install atlas@3.11.41 ==> Installing atlas ==> No binary for atlas found: installing from source ==> Moving resource stage source : /tmp/clusterbuild/spack-stage/resource-lapack-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/ destination : /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-resource-lapack/lapack-3.5.0 ==> atlas: Executing phase: 'install' ==> Error: ProcessError: Command exited with status 2: 'make' 13 errors found in build log: 52 /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build/..//CONF IG/include/atlas_sys.h:224: warning: the use of `tmpnam' is dangerous, better use `mkstemp' 53 rm -f config1.out 54 make atlas_run atldir=/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-sr c/spack-build exe=xprobe_comp redir=config1.out \ 55 args="-v 0 -o atlconf.txt -O 1 -A 35 -Si nof77 0 -V 1968 -C ic '/srv/beegfs/cluster/home/clusterbuild/ spack/lib/spack/env/gcc/gcc' -C if '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gfortran' -b 64 -d b /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build" 56 make[1]: Entering directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/s pack-src/spack-build' 57 cd /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build ; ./x probe_comp -v 0 -o atlconf.txt -O 1 -A 35 -Si nof77 0 -V 1968 -C ic '/srv/beegfs/cluster/home/clusterbuild/spack/lib/s pack/env/gcc/gcc' -C if 
'/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gfortran' -b 64 -d b /tmp/cluste rbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-src/spack-build > config1.out >> 58 find: '/opt/slurm/20.02/bin': No such file or directory >> 59 find: '/usr/users/clusterbuild/local': No such file or directory >> 60 find: '/opt/bin': No such file or directory >> 61 find: '/opt/sbin': No such file or directory 62 UNKNOWN COMPILER '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc/gcc' for ICC: you must also supply flag s! 63 Makefile:109: recipe for target 'atlas_run' failed >> 64 make[1]: *** [atlas_run] Error 1 65 make[1]: Leaving directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/sp ack-src/spack-build' 66 Makefile:123: recipe for target 'IRun_comp' failed >> 67 make: *** [IRun_comp] Error 2 68 ERROR 512 IN SYSCMND: 'make IRun_comp args="-v 0 -o atlconf.txt -O 1 -A 35 -Si nof77 0 -V 1968 -C ic '/srv/beegfs/clus ter/home/clusterbuild/spack/lib/spack/env/gcc/gcc' -C if '/srv/beegfs/cluster/home/clusterbuild/spack/lib/spack/env/gcc /gfortran' -b 64"' 69 70 OS configured as Linux (1) 71 72 Assembly configured as GAS_x8664 (2) 73 ... 105 cd interfaces/blas/F77 ; mkdir src testing 106 cd interfaces/lapack ; mkdir C2F 107 cd interfaces/lapack/C2F ; mkdir src 108 mkdir ARCHS 109 make -f Make.top startup 110 make[1]: Entering directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/s pack-src/spack-build' >> 111 Make.top:1: Make.inc: No such file or directory >> 112 make[1]: *** No rule to make target 'Make.inc'. Stop. 
113 make[1]: Leaving directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/sp ack-src/spack-build' 114 Makefile:560: recipe for target 'startup' failed >> 115 make: *** [startup] Error 2 >> 116 mv: cannot stat 'lib/Makefile': No such file or directory 117 ../configure: 573: ../configure: cannot create lib/Makefile: Directory nonexistent 118 ../configure: 574: ../configure: cannot create lib/Makefile: Directory nonexistent 119 ../configure: 575: ../configure: cannot create lib/Makefile: Directory nonexistent 120 ../configure: 576: ../configure: cannot create lib/Makefile: Directory nonexistent 121 ../configure: 580: ../configure: cannot create include/atlas_maxmalloc.h: Directory nonexistent 122 ../configure: 585: ../configure: cannot create include/atlas_maxmalloc.h: Directory nonexistent ... 127 ../configure: 643: ../configure: cannot create lib/Makefile: Directory nonexistent 128 ../configure: 661: ../configure: cannot create /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmu chd3ebxixecv3znu/spack-src/spack-build/tune/threads/res/aff.h: Directory nonexistent 129 DONE configure 130 ==> [2020-10-06-23:44:30.540475] 'make' 131 make -f Make.top build 132 make[1]: Entering directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/s pack-src/spack-build' >> 133 Make.top:1: Make.inc: No such file or directory >> 134 make[1]: *** No rule to make target 'Make.inc'. Stop. 135 make[1]: Leaving directory '/tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/sp ack-src/spack-build' 136 Makefile:555: recipe for target 'build' failed >> 137 make: *** [build] Error 2 See build log for details: /tmp/clusterbuild/spack-stage/spack-stage-atlas-3.11.41-txlzl2b3typr2xmuchd3ebxixecv3znu/spack-build-out.txt ... 
``` ### Information on your system <!-- Please include the output of `spack debug report` --> ```console clusterbuild@r3d3:~/local_modules/Modules$ spack debug report * **Spack:** 0.15.4 * **Python:** 3.6.9 * **Platform:** linux-ubuntu18.04-cascadelake ``` ### Additional information * [spack-build-out.txt](https://github.com/spack/spack/files/5337026/spack-build-out.txt) * [spack-build-env.txt](https://github.com/spack/spack/files/5337027/spack-build-env.txt)
non_priority
installation issue atlas from unknown compiler srv beegfs cluster home clusterbuild spack lib spack env gcc gcc for icc you must also supply flags it seems like atlas is refusing to accept spack provided compiler as gcc steps to reproduce the issue console spack install atlas installing atlas no binary for atlas found installing from source moving resource stage source tmp clusterbuild spack stage resource lapack spack src destination tmp clusterbuild spack stage spack stage atlas spack src spack resource lapack lapack atlas executing phase install error processerror command exited with status make errors found in build log tmp clusterbuild spack stage spack stage atlas spack src spack build conf ig include atlas sys h warning the use of tmpnam is dangerous better use mkstemp rm f out make atlas run atldir tmp clusterbuild spack stage spack stage atlas spack sr c spack build exe xprobe comp redir out args v o atlconf txt o a si v c ic srv beegfs cluster home clusterbuild spack lib spack env gcc gcc c if srv beegfs cluster home clusterbuild spack lib spack env gcc gfortran b d b tmp clusterbuild spack stage spack stage atlas spack src spack build make entering directory tmp clusterbuild spack stage spack stage atlas s pack src spack build cd tmp clusterbuild spack stage spack stage atlas spack src spack build x probe comp v o atlconf txt o a si v c ic srv beegfs cluster home clusterbuild spack lib s pack env gcc gcc c if srv beegfs cluster home clusterbuild spack lib spack env gcc gfortran b d b tmp cluste rbuild spack stage spack stage atlas spack src spack build out find opt slurm bin no such file or directory find usr users clusterbuild local no such file or directory find opt bin no such file or directory find opt sbin no such file or directory unknown compiler srv beegfs cluster home clusterbuild spack lib spack env gcc gcc for icc you must also supply flag s makefile recipe for target atlas run failed make error make leaving directory tmp clusterbuild spack 
stage spack stage atlas sp ack src spack build makefile recipe for target irun comp failed make error error in syscmnd make irun comp args v o atlconf txt o a si v c ic srv beegfs clus ter home clusterbuild spack lib spack env gcc gcc c if srv beegfs cluster home clusterbuild spack lib spack env gcc gfortran b os configured as linux assembly configured as gas cd interfaces blas mkdir src testing cd interfaces lapack mkdir cd interfaces lapack mkdir src mkdir archs make f make top startup make entering directory tmp clusterbuild spack stage spack stage atlas s pack src spack build make top make inc no such file or directory make no rule to make target make inc stop make leaving directory tmp clusterbuild spack stage spack stage atlas sp ack src spack build makefile recipe for target startup failed make error mv cannot stat lib makefile no such file or directory configure configure cannot create lib makefile directory nonexistent configure configure cannot create lib makefile directory nonexistent configure configure cannot create lib makefile directory nonexistent configure configure cannot create lib makefile directory nonexistent configure configure cannot create include atlas maxmalloc h directory nonexistent configure configure cannot create include atlas maxmalloc h directory nonexistent configure configure cannot create lib makefile directory nonexistent configure configure cannot create tmp clusterbuild spack stage spack stage atlas spack src spack build tune threads res aff h directory nonexistent done configure make make f make top build make entering directory tmp clusterbuild spack stage spack stage atlas s pack src spack build make top make inc no such file or directory make no rule to make target make inc stop make leaving directory tmp clusterbuild spack stage spack stage atlas sp ack src spack build makefile recipe for target build failed make error see build log for details tmp clusterbuild spack stage spack stage atlas spack build out txt 
information on your system console clusterbuild local modules modules spack debug report spack python platform linux cascadelake additional information
0
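The Spack record above reports "13 errors found in build log" and marks the matching lines with `>>`, which implies a prefilter that scans the log for error-like lines. A rough sketch of such a scan — the patterns below are assumptions loosely modeled on the excerpt, not Spack's actual pattern list:

```python
import re

# Hypothetical error patterns; Spack's real log prefilter is more involved.
ERROR_PATTERNS = [
    re.compile(r"\berror\b", re.IGNORECASE),
    re.compile(r"No such file or directory"),
    re.compile(r"\*\*\*"),  # make's fatal-error marker
]

def find_error_lines(log_text):
    """Return (line_number, line) pairs that match any error pattern."""
    hits = []
    for number, line in enumerate(log_text.splitlines(), start=1):
        if any(p.search(line) for p in ERROR_PATTERNS):
            hits.append((number, line))
    return hits

sample = (
    "make -f Make.top startup\n"
    "Make.top:1: Make.inc: No such file or directory\n"
    "make[1]: *** No rule to make target 'Make.inc'. Stop.\n"
)
for number, line in find_error_lines(sample):
    print(number, line)  # flags lines 2 and 3, as in the excerpt above
```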
443,312
30,884,250,551
IssuesEvent
2023-08-03 20:14:12
orcasound/.github
https://api.github.com/repos/orcasound/.github
opened
2023 UX/Design strategy
documentation
### Discussed in https://github.com/orgs/orcasound/discussions/12 <div type='discussions-op-text'> <sup>Originally posted by **scottveirs** December 13, 2022</sup> Met with @UXBrendan and Oya this evening to discuss UX/Design strategy. I'm not sure we stuck with it, but Brendan started us off with a great aspiration for managing the steady stream of UX/design volunteers in 2023: "focusing on quality rather than quantity, and finding a simple management solution that doesn't require extended time and effort, would be best." Brendan also did a nice job of summarizing our conversation for the rest of the UX management team, including @shaneuxd: **UX strategy for 2023, there was some clarity about what projects to focus on:** - Expanding the Professional Marine Scientist persona - May include subpersonae - Researching and designing the OrcaLearn app - Running research and design projects for the bioacoustic dashboard - Integrating Acartia ([acartia.io](http://acartia.io/)) into the Orcasound ecosystem - Understanding why, how, cadence, and where to implement notifications - Beyond hydrophone orca listening events - Moving forward with the Shipnoise app My main suggestion regarding a management solution was (mostly my gut feeling) that it could help to move more and more of the UX/design process (e.g. asynchronous collaboration of different teams spread across many time zones) and products (mockups, feature requests, user data analyses) closer to the dev teams by using Github's growing suite of tools. Maybe mockups can live within a repo's `Wiki`, feature requests can be labeled as such within `Issues,` key threads from Slack can be archived and discussed more deeply in `Discusssions,` some project management can migrate (from Trello, Asana, etc) towards `Projects,` and perhaps even repos could be established for versioning key assets (e.g. logos, audio libraries, graphics, email templates) ?</div>
1.0
2023 UX/Design strategy - ### Discussed in https://github.com/orgs/orcasound/discussions/12 <div type='discussions-op-text'> <sup>Originally posted by **scottveirs** December 13, 2022</sup> Met with @UXBrendan and Oya this evening to discuss UX/Design strategy. I'm not sure we stuck with it, but Brendan started us off with a great aspiration for managing the steady stream of UX/design volunteers in 2023: "focusing on quality rather than quantity, and finding a simple management solution that doesn't require extended time and effort, would be best." Brendan also did a nice job of summarizing our conversation for the rest of the UX management team, including @shaneuxd: **UX strategy for 2023, there was some clarity about what projects to focus on:** - Expanding the Professional Marine Scientist persona - May include subpersonae - Researching and designing the OrcaLearn app - Running research and design projects for the bioacoustic dashboard - Integrating Acartia ([acartia.io](http://acartia.io/)) into the Orcasound ecosystem - Understanding why, how, cadence, and where to implement notifications - Beyond hydrophone orca listening events - Moving forward with the Shipnoise app My main suggestion regarding a management solution was (mostly my gut feeling) that it could help to move more and more of the UX/design process (e.g. asynchronous collaboration of different teams spread across many time zones) and products (mockups, feature requests, user data analyses) closer to the dev teams by using Github's growing suite of tools. Maybe mockups can live within a repo's `Wiki`, feature requests can be labeled as such within `Issues,` key threads from Slack can be archived and discussed more deeply in `Discusssions,` some project management can migrate (from Trello, Asana, etc) towards `Projects,` and perhaps even repos could be established for versioning key assets (e.g. logos, audio libraries, graphics, email templates) ?</div>
non_priority
ux design strategy discussed in originally posted by scottveirs december met with uxbrendan and oya this evening to discuss ux design strategy i m not sure we stuck with it but brendan started us off with a great aspiration for managing the steady stream of ux design volunteers in focusing on quality rather than quantity and finding a simple management solution that doesn t require extended time and effort would be best brendan also did a nice job of summarizing our conversation for the rest of the ux management team including shaneuxd ux strategy for there was some clarity about what projects to focus on expanding the professional marine scientist persona may include subpersonae researching and designing the orcalearn app running research and design projects for the bioacoustic dashboard integrating acartia into the orcasound ecosystem understanding why how cadence and where to implement notifications beyond hydrophone orca listening events moving forward with the shipnoise app my main suggestion regarding a management solution was mostly my gut feeling that it could help to move more and more of the ux design process e g asynchronous collaboration of different teams spread across many time zones and products mockups feature requests user data analyses closer to the dev teams by using github s growing suite of tools maybe mockups can live within a repo s wiki feature requests can be labeled as such within issues key threads from slack can be archived and discussed more deeply in discusssions some project management can migrate from trello asana etc towards projects and perhaps even repos could be established for versioning key assets e g logos audio libraries graphics email templates
0
29,058
13,041,319,659
IssuesEvent
2020-07-28 20:09:34
Azure/azure-sdk-for-net
https://api.github.com/repos/Azure/azure-sdk-for-net
closed
[QUERY] Close Message Session in the middle of processing messages for the session
Client Service Bus customer-reported question
**Query/Question** How can we help? Currently I am using the new Service Bus library and am using message sessions. I am using the message sessions to separate out other clients data in a topic without multiple subscriptions and have run into an issue I'm not sure exactly how to solve. If one of the messages in the session doesn't get processed for a particular reason I need to close the session and have it pull in the first message it failed to process again. I don't want to abandon because that increments the delivery count which in my specific case I don't want to do because I don't want the message to be put into dead letter queue. According to documentation delivery count won't be incremented if the messages within the session aren't completed but there doesn't appear to be a good way to close the session. ![image](https://user-images.githubusercontent.com/40215033/87485598-29686680-c607-11ea-9d21-e43e0d35379a.png) I am using the session processor currently for my implementation but there doesn't seem to be a way to close a session unless you move through all messages in the session and then have the library close the receiver. It would be helpful to have the ability to close the session receiver while processing messages so that it can be spun back up and process the first message in the session again without incrementing the delivery count. I've thought about abandoning messages but I don't want the delivery count to increment in this situation and currently the only way to have a session closed is by processing all messages in the session. I've come up with a bit of a hacky solution involving storing the sessionId as the key of a dictionary along with the messageId of the message that failed first. This way I can check the messageId and skip all messages if this scenario occurs and have the session receiver close and spin up again and pull in that failed message. 
There is quite a bit of overhead in this method so I was wondering if there was a recommendation as to how to close sessions before all messages in the session is processed? **Environment:** - Name and version of the Library package used: Azure.Messaging.ServiceBus preview-4 - Hosting platform or OS and .NET runtime version .net core 3.1
1.0
[QUERY] Close Message Session in the middle of processing messages for the session - **Query/Question** How can we help? Currently I am using the new Service Bus library and am using message sessions. I am using the message sessions to separate out other clients data in a topic without multiple subscriptions and have run into an issue I'm not sure exactly how to solve. If one of the messages in the session doesn't get processed for a particular reason I need to close the session and have it pull in the first message it failed to process again. I don't want to abandon because that increments the delivery count which in my specific case I don't want to do because I don't want the message to be put into dead letter queue. According to documentation delivery count won't be incremented if the messages within the session aren't completed but there doesn't appear to be a good way to close the session. ![image](https://user-images.githubusercontent.com/40215033/87485598-29686680-c607-11ea-9d21-e43e0d35379a.png) I am using the session processor currently for my implementation but there doesn't seem to be a way to close a session unless you move through all messages in the session and then have the library close the receiver. It would be helpful to have the ability to close the session receiver while processing messages so that it can be spun back up and process the first message in the session again without incrementing the delivery count. I've thought about abandoning messages but I don't want the delivery count to increment in this situation and currently the only way to have a session closed is by processing all messages in the session. I've come up with a bit of a hacky solution involving storing the sessionId as the key of a dictionary along with the messageId of the message that failed first. This way I can check the messageId and skip all messages if this scenario occurs and have the session receiver close and spin up again and pull in that failed message. 
There is quite a bit of overhead in this method so I was wondering if there was a recommendation as to how to close sessions before all messages in the session is processed? **Environment:** - Name and version of the Library package used: Azure.Messaging.ServiceBus preview-4 - Hosting platform or OS and .NET runtime version .net core 3.1
non_priority
close message session in the middle of processing messages for the session query question how can we help currently i am using the new service bus library and am using message sessions i am using the message sessions to separate out other clients data in a topic without multiple subscriptions and have run into an issue i m not sure exactly how to solve if one of the messages in the session doesn t get processed for a particular reason i need to close the session and have it pull in the first message it failed to process again i don t want to abandon because that increments the delivery count which in my specific case i don t want to do because i don t want the message to be put into dead letter queue according to documentation delivery count won t be incremented if the messages within the session aren t completed but there doesn t appear to be a good way to close the session i am using the session processor currently for my implementation but there doesn t seem to be a way to close a session unless you move through all messages in the session and then have the library close the receiver it would be helpful to have the ability to close the session receiver while processing messages so that it can be spun back up and process the first message in the session again without incrementing the delivery count i ve thought about abandoning messages but i don t want the delivery count to increment in this situation and currently the only way to have a session closed is by processing all messages in the session i ve come up with a bit of a hacky solution involving storing the sessionid as the key of a dictionary along with the messageid of the message that failed first this way i can check the messageid and skip all messages if this scenario occurs and have the session receiver close and spin up again and pull in that failed message there is quite a bit of overhead in this method so i was wondering if there was a recommendation as to how to close sessions before all messages 
in the session is processed environment name and version of the library package used azure messaging servicebus preview hosting platform or os and net runtime version net core
0
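The Service Bus record above describes a workaround: store each sessionId as a dictionary key alongside the messageId of the first message that failed, skip the session's remaining messages, then close and reopen the session receiver to retry from that message without incrementing the delivery count. The issue concerns the .NET `Azure.Messaging.ServiceBus` client; the sketch below only models that bookkeeping in Python, with hypothetical function names:

```python
# Per-session bookkeeping for the "skip until the session closes" workaround.
failed_first_message = {}  # session_id -> message_id of the first failure

def record_failure(session_id, message_id):
    # Only the FIRST failure per session matters: reprocessing restarts there.
    failed_first_message.setdefault(session_id, message_id)

def should_skip(session_id):
    """Once a session has a recorded failure, skip its remaining messages."""
    return session_id in failed_first_message

def on_session_closed(session_id):
    """On reopening the session, retry from the recorded message."""
    return failed_first_message.pop(session_id, None)

record_failure("tenant-a", "msg-7")
record_failure("tenant-a", "msg-8")   # ignored: msg-7 failed first
print(should_skip("tenant-a"))        # -> True
print(on_session_closed("tenant-a"))  # -> msg-7
```

As the reporter notes, this adds real overhead: every message in the session is still delivered and must be inspected before the receiver can be torn down.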
5,203
5,544,745,662
IssuesEvent
2017-03-22 19:54:48
girder/girder
https://api.github.com/repos/girder/girder
closed
girder_client: Fix InsecurePlatformWarning
security
Uploading to a secured server (https://data.kitware.com) using `girder_client` returns the following warning: ``` /home/jcfr/.virtualenvs/ipython/local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning. InsecurePlatformWarning ``` The client should be updated to depend on `certifi`. See https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.InsecurePlatformWarning
True
girder_client: Fix InsecurePlatformWarning - Uploading to a secured server (https://data.kitware.com) using `girder_client` returns the following warning: ``` /home/jcfr/.virtualenvs/ipython/local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning. InsecurePlatformWarning ``` The client should be updated to depend on `certifi`. See https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.InsecurePlatformWarning
non_priority
girder client fix insecureplatformwarning uploading to an secured server using girder client returns the following warning home jcfr virtualenvs ipython local lib site packages requests packages util ssl py insecureplatformwarning a true sslcontext object is not available this prevents from configuring ssl appropriately and may cause certain ssl connections to fail for more information see insecureplatformwarning client should be updated to depend on certifi see
0
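The girder_client record above proposes fixing the InsecurePlatformWarning by making the client depend on `certifi`, so that requests/urllib3 can locate a modern CA bundle on old Python 2.7 SSL stacks. A hedged sketch of how a client might select a CA bundle while degrading gracefully when `certifi` is absent — the helper name is hypothetical, not the actual girder_client fix:

```python
import os

def pick_ca_bundle():
    """Prefer certifi's CA bundle path; otherwise return None so the caller
    can fall back to the HTTP library's default verification. Hypothetical
    helper sketching the suggested certifi dependency."""
    try:
        import certifi
    except ImportError:
        return None  # caller passes verify=True and accepts platform defaults
    path = certifi.where()
    return path if os.path.exists(path) else None

bundle = pick_ca_bundle()
# Usage sketch: requests.get(url, verify=bundle if bundle else True)
print(bundle is None or isinstance(bundle, str))  # -> True
```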
92,814
8,378,803,396
IssuesEvent
2018-10-06 17:57:13
istio/istio
https://api.github.com/repos/istio/istio
closed
[test-framework] Add support to connect to istio-policy backend to DeployedMixer instances
area/test and release
The DeployedMixer facade currently only connects to istio-telemetry backend. We should add support to connect to the istio-policy backend as well, to perform quota/check related operations.
1.0
[test-framework] Add support to connect to istio-policy backend to DeployedMixer instances - The DeployedMixer facade currently only connects to istio-telemetry backend. We should add support to connect to the istio-policy backend as well, to perform quota/check related operations.
non_priority
add support to connect to istio policy backend to deployedmixer instances the deployedmixer facade currently only connects to istio telemetry backend we should add support to connect to the istio policy backend as well to perform quota check related operations
0
10,309
7,147,336,353
IssuesEvent
2018-01-25 00:11:19
tensorflow/tensorflow
https://api.github.com/repos/tensorflow/tensorflow
closed
could not set cudnn filter descriptor: CUDNN_STATUS_BAD_PARAM
stat:awaiting tensorflower type:bug/performance
The version of cuda and cudnn meets the requirement, but still cannot use cudnn properly. ### What related GitHub issues or StackOverflow threads have you found by searching the web for your problem? ### Environment info Operating System: Linux version 3.16.0-30-generic (buildd@kissel) (gcc version 4.8.2 (Ubuntu 4.8.2-19ubuntu1) ) #40~14.04.1-Ubuntu Installed version of CUDA and cuDNN: (please attach the output of `ls -l /path/to/cuda/lib/libcud*`): -rw-r--r-- 1 root root 558720 Sep 15 07:02 /usr/local/cuda/lib64/libcudadevrt.a lrwxrwxrwx 1 root root 16 Sep 15 07:05 /usr/local/cuda/lib64/libcudart.so -> libcudart.so.8.0 lrwxrwxrwx 1 root root 19 Sep 15 07:05 /usr/local/cuda/lib64/libcudart.so.8.0 -> libcudart.so.8.0.44 -rw-r--r-- 1 root root 415432 Sep 15 07:02 /usr/local/cuda/lib64/libcudart.so.8.0.44 -rw-r--r-- 1 root root 775162 Sep 15 07:02 /usr/local/cuda/lib64/libcudart_static.a lrwxrwxrwx 1 root root 13 Nov 22 10:55 /usr/local/cuda/lib64/libcudnn.so -> libcudnn.so.5 lrwxrwxrwx 1 root root 17 Nov 22 10:55 /usr/local/cuda/lib64/libcudnn.so.5 -> libcudnn.so.5.1.5 -rw-r--r-- 1 root root 78065952 Nov 22 10:09 /usr/local/cuda/lib64/libcudnn.so.5.0.5 -rw-r--r-- 1 root root 79337624 Nov 22 10:17 /usr/local/cuda/lib64/libcudnn.so.5.1.5 -rw-r--r-- 1 root root 69756172 Nov 22 10:17 /usr/local/cuda/lib64/libcudnn_static.a If installed from binary pip package, provide: 1. A link to the pip package you installed: export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/gpu/tensorflow-0.11.0-cp27-none-linux_x86_64.whl 2. The output from `python -c "import tensorflow; print(tensorflow.__version__)"`. 
I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcublas.so locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcudnn.so locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcufft.so locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcuda.so.1 locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcurand.so locally 0.11.0 ### If possible, provide a minimal reproducible example (We usually don't have time to read hundreds of lines of your code) when trying to call a function that is only supported by cudnn, for example conv2d
True
could not set cudnn filter descriptor: CUDNN_STATUS_BAD_PARAM - The version of cuda and cudnn meets the requirement, but still cannot use cudnn properly. ### What related GitHub issues or StackOverflow threads have you found by searching the web for your problem? ### Environment info Operating System: Linux version 3.16.0-30-generic (buildd@kissel) (gcc version 4.8.2 (Ubuntu 4.8.2-19ubuntu1) ) #40~14.04.1-Ubuntu Installed version of CUDA and cuDNN: (please attach the output of `ls -l /path/to/cuda/lib/libcud*`): -rw-r--r-- 1 root root 558720 Sep 15 07:02 /usr/local/cuda/lib64/libcudadevrt.a lrwxrwxrwx 1 root root 16 Sep 15 07:05 /usr/local/cuda/lib64/libcudart.so -> libcudart.so.8.0 lrwxrwxrwx 1 root root 19 Sep 15 07:05 /usr/local/cuda/lib64/libcudart.so.8.0 -> libcudart.so.8.0.44 -rw-r--r-- 1 root root 415432 Sep 15 07:02 /usr/local/cuda/lib64/libcudart.so.8.0.44 -rw-r--r-- 1 root root 775162 Sep 15 07:02 /usr/local/cuda/lib64/libcudart_static.a lrwxrwxrwx 1 root root 13 Nov 22 10:55 /usr/local/cuda/lib64/libcudnn.so -> libcudnn.so.5 lrwxrwxrwx 1 root root 17 Nov 22 10:55 /usr/local/cuda/lib64/libcudnn.so.5 -> libcudnn.so.5.1.5 -rw-r--r-- 1 root root 78065952 Nov 22 10:09 /usr/local/cuda/lib64/libcudnn.so.5.0.5 -rw-r--r-- 1 root root 79337624 Nov 22 10:17 /usr/local/cuda/lib64/libcudnn.so.5.1.5 -rw-r--r-- 1 root root 69756172 Nov 22 10:17 /usr/local/cuda/lib64/libcudnn_static.a If installed from binary pip package, provide: 1. A link to the pip package you installed: export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/gpu/tensorflow-0.11.0-cp27-none-linux_x86_64.whl 2. The output from `python -c "import tensorflow; print(tensorflow.__version__)"`. 
I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcublas.so locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcudnn.so locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcufft.so locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcuda.so.1 locally I tensorflow/stream_executor/dso_loader.cc:111] successfully opened CUDA library libcurand.so locally 0.11.0 ### If possible, provide a minimal reproducible example (We usually don't have time to read hundreds of lines of your code) when trying to call a function that is only supported by cudnn, for example conv2d
non_priority
could not set cudnn filter descriptor cudnn status bad param the version of cuda and cudnn meets the requirement but still cannot use cudnn properly what related github issues or stackoverflow threads have you found by searching the web for your problem environment info operating system linux version generic buildd kissel gcc version ubuntu ubuntu installed version of cuda and cudnn please attach the output of ls l path to cuda lib libcud rw r r root root sep usr local cuda libcudadevrt a lrwxrwxrwx root root sep usr local cuda libcudart so libcudart so lrwxrwxrwx root root sep usr local cuda libcudart so libcudart so rw r r root root sep usr local cuda libcudart so rw r r root root sep usr local cuda libcudart static a lrwxrwxrwx root root nov usr local cuda libcudnn so libcudnn so lrwxrwxrwx root root nov usr local cuda libcudnn so libcudnn so rw r r root root nov usr local cuda libcudnn so rw r r root root nov usr local cuda libcudnn so rw r r root root nov usr local cuda libcudnn static a if installed from binary pip package provide a link to the pip package you installed export tf binary url the output from python c import tensorflow print tensorflow version i tensorflow stream executor dso loader cc successfully opened cuda library libcublas so locally i tensorflow stream executor dso loader cc successfully opened cuda library libcudnn so locally i tensorflow stream executor dso loader cc successfully opened cuda library libcufft so locally i tensorflow stream executor dso loader cc successfully opened cuda library libcuda so locally i tensorflow stream executor dso loader cc successfully opened cuda library libcurand so locally if possible provide a minimal reproducible example we usually don t have time to read hundreds of lines of your code when trying to call a function that is only supported by cudnn for example
0
190,785
22,157,023,373
IssuesEvent
2022-06-04 00:59:42
ibm-cio-vulnerability-scanning/insomnia
https://api.github.com/repos/ibm-cio-vulnerability-scanning/insomnia
closed
CVE-2021-3757 (High) detected in immer-8.0.4.tgz - autoclosed
security vulnerability
## CVE-2021-3757 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>immer-8.0.4.tgz</b></p></summary> <p>Create your next immutable state by mutating the current one</p> <p>Library home page: <a href="https://registry.npmjs.org/immer/-/immer-8.0.4.tgz">https://registry.npmjs.org/immer/-/immer-8.0.4.tgz</a></p> <p> Dependency Hierarchy: - spectral-5.9.1.tgz (Root Library) - json-ref-resolver-3.1.1.tgz - :x: **immer-8.0.4.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ibm-cio-vulnerability-scanning/insomnia/commit/6584b84b5580d875cdc382437add0bd24e27b39e">6584b84b5580d875cdc382437add0bd24e27b39e</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> immer is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution') <p>Publish Date: 2021-09-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3757>CVE-2021-3757</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/">https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/</a></p> <p>Release Date: 2021-09-02</p> <p>Fix Resolution (immer): 9.0.6</p> <p>Direct dependency fix Resolution (@stoplight/spectral): 6.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-3757 (High) detected in immer-8.0.4.tgz - autoclosed - ## CVE-2021-3757 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>immer-8.0.4.tgz</b></p></summary> <p>Create your next immutable state by mutating the current one</p> <p>Library home page: <a href="https://registry.npmjs.org/immer/-/immer-8.0.4.tgz">https://registry.npmjs.org/immer/-/immer-8.0.4.tgz</a></p> <p> Dependency Hierarchy: - spectral-5.9.1.tgz (Root Library) - json-ref-resolver-3.1.1.tgz - :x: **immer-8.0.4.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ibm-cio-vulnerability-scanning/insomnia/commit/6584b84b5580d875cdc382437add0bd24e27b39e">6584b84b5580d875cdc382437add0bd24e27b39e</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> immer is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution') <p>Publish Date: 2021-09-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3757>CVE-2021-3757</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/">https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/</a></p> <p>Release Date: 2021-09-02</p> <p>Fix Resolution (immer): 9.0.6</p> <p>Direct dependency fix Resolution (@stoplight/spectral): 6.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in immer tgz autoclosed cve high severity vulnerability vulnerable library immer tgz create your next immutable state by mutating the current one library home page a href dependency hierarchy spectral tgz root library json ref resolver tgz x immer tgz vulnerable library found in head commit a href found in base branch develop vulnerability details immer is vulnerable to improperly controlled modification of object prototype attributes prototype pollution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution immer direct dependency fix resolution stoplight spectral step up your open source security game with mend
0
44,772
7,127,437,699
IssuesEvent
2018-01-20 21:40:02
lee-dohm/staff-notes
https://api.github.com/repos/lee-dohm/staff-notes
closed
Make documentation available via GitHub Pages
documentation enhancement
* [ ] Stop ignoring docs output * [ ] Change documentation output directory to `docs/` * [ ] Enable GitHub Pages for the repo using `docs/` directory on `master`
1.0
Make documentation available via GitHub Pages - * [ ] Stop ignoring docs output * [ ] Change documentation output directory to `docs/` * [ ] Enable GitHub Pages for the repo using `docs/` directory on `master`
non_priority
make documentation available via github pages stop ignoring docs output change documentation output directory to docs enable github pages for the repo using docs directory on master
0
4,413
6,950,254,214
IssuesEvent
2017-12-06 10:07:22
clarkj93/Automatic-Review-Console
https://api.github.com/repos/clarkj93/Automatic-Review-Console
opened
Added new reviewedfeedback
requirement change
I've added a new reviewfeedback which indicates it was a potential hack, called SuspectHack, value of 4. https://github.com/clarkj93/Automatic-Review-Console/blob/38daf3946b9a8bfdf5a750790f4249db00e9728a/Automatic%20Review/ReviewTools.cs#L50
1.0
Added new reviewedfeedback - I've added a new reviewfeedback which indicates it was a potential hack, called SuspectHack, value of 4. https://github.com/clarkj93/Automatic-Review-Console/blob/38daf3946b9a8bfdf5a750790f4249db00e9728a/Automatic%20Review/ReviewTools.cs#L50
non_priority
added new reviewedfeedback i ve added a new reviewfeedback which indicates it was a potential hack called suspecthack value of
0
106,728
13,357,994,140
IssuesEvent
2020-08-31 10:51:03
sourcegraph/sourcegraph
https://api.github.com/repos/sourcegraph/sourcegraph
closed
'About Sourcegraph' and 'Help' pages should be opened in the separate tab
design estimate/0.5d team/web webapp
Current state: - Authorized users: 'Help' and 'About Sourcegraph' links, accessible from the 'Profile' dropdown, opens in the new tab - Un-authorized users: 'Help' and 'About' links, accessible from the top navigation, opens in the same tab. Proposed changes: - **Open Help and About links in the separate tab for all users.** We would avoid taking them out of the context of the app. - **Add 'open in the new tab' icon** in the dropdown menu visible for authorized users. @rrhyne @felixfbecker please, review. Thank you! See in [Figma](https://www.figma.com/file/iXNTQ4rsggVtH8ASe3Vnzg/About-Sourcegraph-and-Help-pages-should-be-opened-in-the-separate-tab?node-id=0%3A1) ![Menu-icon](https://user-images.githubusercontent.com/20326070/87757861-ae23d200-c80b-11ea-94cf-ea03752295b8.png)
1.0
'About Sourcegraph' and 'Help' pages should be opened in the separate tab - Current state: - Authorized users: 'Help' and 'About Sourcegraph' links, accessible from the 'Profile' dropdown, opens in the new tab - Un-authorized users: 'Help' and 'About' links, accessible from the top navigation, opens in the same tab. Proposed changes: - **Open Help and About links in the separate tab for all users.** We would avoid taking them out of the context of the app. - **Add 'open in the new tab' icon** in the dropdown menu visible for authorized users. @rrhyne @felixfbecker please, review. Thank you! See in [Figma](https://www.figma.com/file/iXNTQ4rsggVtH8ASe3Vnzg/About-Sourcegraph-and-Help-pages-should-be-opened-in-the-separate-tab?node-id=0%3A1) ![Menu-icon](https://user-images.githubusercontent.com/20326070/87757861-ae23d200-c80b-11ea-94cf-ea03752295b8.png)
non_priority
about sourcegraph and help pages should be opened in the separate tab current state authorized users help and about sourcegraph links accessible from the profile dropdown opens in the new tab un authorized users help and about links accessible from the top navigation opens in the same tab proposed changes open help and about links in the separate tab for all users we would avoid taking them out of the context of the app add open in the new tab icon in the dropdown menu visible for authorized users rrhyne felixfbecker please review thank you see in
0
27,878
8,050,150,339
IssuesEvent
2018-08-01 12:36:34
ShaikASK/Testing
https://api.github.com/repos/ShaikASK/Testing
closed
Candidate Dashboard : C2C / 1099 : (Intermittently) user is able to navigate to "Dashboard" screen without filling all the web forms
Candidate Dashboard Candidate Module Defect P3 Release #3 Build #16
Steps To Replicate : 1. Launch the URL 2. Sign in as HR candidate credentials (c2c / 1099 Hire category) 3. Click on Get Started form 4. Sign the Offer letter 5. Navigate to common details web from 6. Click on submit navigate to contractor web form click on next Experienced Behavior : Observed that user is able to navigate to "Dashboard" screen without filling all the web forms when check with C2C/1099 category Expected Behavior : Ensure that application should not allow the user to navigate to "Dashboard" screen without filling all the web forms
1.0
Candidate Dashboard : C2C / 1099 : (Intermittently) user is able to navigate to "Dashboard" screen without filling all the web forms - Steps To Replicate : 1. Launch the URL 2. Sign in as HR candidate credentials (c2c / 1099 Hire category) 3. Click on Get Started form 4. Sign the Offer letter 5. Navigate to common details web from 6. Click on submit navigate to contractor web form click on next Experienced Behavior : Observed that user is able to navigate to "Dashboard" screen without filling all the web forms when check with C2C/1099 category Expected Behavior : Ensure that application should not allow the user to navigate to "Dashboard" screen without filling all the web forms
non_priority
candidate dashboard intermittently user is able to navigate to dashboard screen without filling all the web forms steps to replicate launch the url sign in as hr candidate credentials hire category click on get started form sign the offer letter navigate to common details web from click on submit navigate to contractor web form click on next experienced behavior observed that user is able to navigate to dashboard screen without filling all the web forms when check with category expected behavior ensure that application should not allow the user to navigate to dashboard screen without filling all the web forms
0
83,905
24,167,005,795
IssuesEvent
2022-09-22 15:47:25
elastic/elastic-agent
https://api.github.com/repos/elastic/elastic-agent
closed
Build 50 for 8.3 with status FAILURE
Team:Elastic-Agent-Control-Plane ci-reported automation build-failures
## :broken_heart: Tests Failed <!-- BUILD BADGES--> > _the below badges are clickable and redirect to their specific view in the CI or DOCS_ [![Pipeline View](https://img.shields.io/badge/pipeline-pipeline%20-green)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//pipeline) [![Test View](https://img.shields.io/badge/test-test-green)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//tests) [![Changes](https://img.shields.io/badge/changes-changes-green)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//changes) [![Artifacts](https://img.shields.io/badge/artifacts-artifacts-yellow)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//artifacts) [![preview](https://img.shields.io/badge/docs-preview-yellowgreen)](http://elastic-agent_null.docs-preview.app.elstc.co/diff) [![preview](https://img.shields.io/badge/elastic-observability-blue)](https://ci-stats.elastic.co/app/apm/services/fleet-ci/transactions/view?rangeFrom=2022-07-14T03:28:28.489Z&rangeTo=2022-07-14T03:48:28.489Z&transactionName=elastic-agent/elastic-agent-mbp/8.3&transactionType=job&latencyAggregationType=avg&traceId=92398573f15185256141bc13c255e139&transactionId=dbd189d7281c6f99) <!-- BUILD SUMMARY--> <details><summary>Expand to view the summary</summary> <p> #### Build stats * Start Time: 2022-07-14T03:38:28.489+0000 * Duration: 33 min 26 sec #### Test stats :test_tube: | Test | Results | | ------------ | :-----------------------------: | | Failed | 1 | | Passed | 6010 | | Skipped | 23 | | Total | 6034 | </p> </details> <!-- TEST RESULTS IF ANY--> ### Test errors [![1](https://img.shields.io/badge/1%20-red)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//tests) <details><summary>Expand to view the tests 
failures</summary><p> ##### `Test / Matrix - PLATFORM = "windows-2022 && windows-immutable" / Test / TestDownloadBodyError – github.com/elastic/elastic-agent/internal/pkg/artifact/download/http` <ul> <details><summary>Expand to view the error details</summary><p> ``` Failed ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` === RUN TestDownloadBodyError downloader_test.go:67: Error Trace: downloader_test.go:67 Error: Not equal: expected: "download progress from %s has fetched %s @ %sps" actual : "download from %s failed at %s @ %sps: %s" Diff: --- Expected +++ Actual @@ -1 +1 @@ -download progress from %s has fetched %s @ %sps +download from %s failed at %s @ %sps: %s Test: TestDownloadBodyError downloader_test.go:69: Error Trace: downloader_test.go:69 Error: Not equal: expected: "download progress from %s has fetched %s @ %sps" actual : "download from %s failed at %s @ %sps: %s" Diff: --- Expected +++ Actual @@ -1 +1 @@ -download progress from %s has fetched %s @ %sps +download from %s failed at %s @ %sps: %s Test: TestDownloadBodyError --- FAIL: TestDownloadBodyError (0.02s) ``` </p></details> </ul> </p></details> <!-- STEPS ERRORS IF ANY --> ### Steps errors [![2](https://img.shields.io/badge/2%20-red)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//pipeline) <details><summary>Expand to view the steps failures</summary> <p> ##### `Go unitTest` <ul> <li>Took 2 min 28 sec . View more details <a href="https://fleet-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/elastic-agent/pipelines/elastic-agent-mbp/pipelines/8.3/runs/50/steps/810/log/?start=0">here</a></li> <li>Description: <code>mage unitTest</code></l1> </ul> ##### `Checks if running on a Unix-like node` <ul> <li>Took 0 min 0 sec . 
View more details <a href="https://fleet-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/elastic-agent/pipelines/elastic-agent-mbp/pipelines/8.3/runs/50/steps/1076/log/?start=0">here</a></li> <li>Description: <code>script returned exit code 1</code></l1> </ul> </p> </details>
1.0
Build 50 for 8.3 with status FAILURE - ## :broken_heart: Tests Failed <!-- BUILD BADGES--> > _the below badges are clickable and redirect to their specific view in the CI or DOCS_ [![Pipeline View](https://img.shields.io/badge/pipeline-pipeline%20-green)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//pipeline) [![Test View](https://img.shields.io/badge/test-test-green)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//tests) [![Changes](https://img.shields.io/badge/changes-changes-green)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//changes) [![Artifacts](https://img.shields.io/badge/artifacts-artifacts-yellow)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//artifacts) [![preview](https://img.shields.io/badge/docs-preview-yellowgreen)](http://elastic-agent_null.docs-preview.app.elstc.co/diff) [![preview](https://img.shields.io/badge/elastic-observability-blue)](https://ci-stats.elastic.co/app/apm/services/fleet-ci/transactions/view?rangeFrom=2022-07-14T03:28:28.489Z&rangeTo=2022-07-14T03:48:28.489Z&transactionName=elastic-agent/elastic-agent-mbp/8.3&transactionType=job&latencyAggregationType=avg&traceId=92398573f15185256141bc13c255e139&transactionId=dbd189d7281c6f99) <!-- BUILD SUMMARY--> <details><summary>Expand to view the summary</summary> <p> #### Build stats * Start Time: 2022-07-14T03:38:28.489+0000 * Duration: 33 min 26 sec #### Test stats :test_tube: | Test | Results | | ------------ | :-----------------------------: | | Failed | 1 | | Passed | 6010 | | Skipped | 23 | | Total | 6034 | </p> </details> <!-- TEST RESULTS IF ANY--> ### Test errors [![1](https://img.shields.io/badge/1%20-red)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//tests) 
<details><summary>Expand to view the tests failures</summary><p> ##### `Test / Matrix - PLATFORM = "windows-2022 && windows-immutable" / Test / TestDownloadBodyError – github.com/elastic/elastic-agent/internal/pkg/artifact/download/http` <ul> <details><summary>Expand to view the error details</summary><p> ``` Failed ``` </p></details> <details><summary>Expand to view the stacktrace</summary><p> ``` === RUN TestDownloadBodyError downloader_test.go:67: Error Trace: downloader_test.go:67 Error: Not equal: expected: "download progress from %s has fetched %s @ %sps" actual : "download from %s failed at %s @ %sps: %s" Diff: --- Expected +++ Actual @@ -1 +1 @@ -download progress from %s has fetched %s @ %sps +download from %s failed at %s @ %sps: %s Test: TestDownloadBodyError downloader_test.go:69: Error Trace: downloader_test.go:69 Error: Not equal: expected: "download progress from %s has fetched %s @ %sps" actual : "download from %s failed at %s @ %sps: %s" Diff: --- Expected +++ Actual @@ -1 +1 @@ -download progress from %s has fetched %s @ %sps +download from %s failed at %s @ %sps: %s Test: TestDownloadBodyError --- FAIL: TestDownloadBodyError (0.02s) ``` </p></details> </ul> </p></details> <!-- STEPS ERRORS IF ANY --> ### Steps errors [![2](https://img.shields.io/badge/2%20-red)](https://fleet-ci.elastic.co/blue/organizations/jenkins/elastic-agent%2Felastic-agent-mbp%2F8.3/detail/8.3/50//pipeline) <details><summary>Expand to view the steps failures</summary> <p> ##### `Go unitTest` <ul> <li>Took 2 min 28 sec . View more details <a href="https://fleet-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/elastic-agent/pipelines/elastic-agent-mbp/pipelines/8.3/runs/50/steps/810/log/?start=0">here</a></li> <li>Description: <code>mage unitTest</code></l1> </ul> ##### `Checks if running on a Unix-like node` <ul> <li>Took 0 min 0 sec . 
View more details <a href="https://fleet-ci.elastic.co//blue/rest/organizations/jenkins/pipelines/elastic-agent/pipelines/elastic-agent-mbp/pipelines/8.3/runs/50/steps/1076/log/?start=0">here</a></li> <li>Description: <code>script returned exit code 1</code></l1> </ul> </p> </details>
non_priority
build for with status failure broken heart tests failed the below badges are clickable and redirect to their specific view in the ci or docs expand to view the summary build stats start time duration min sec test stats test tube test results failed passed skipped total test errors expand to view the tests failures test matrix platform windows windows immutable test testdownloadbodyerror – github com elastic elastic agent internal pkg artifact download http expand to view the error details failed expand to view the stacktrace run testdownloadbodyerror downloader test go error trace downloader test go error not equal expected download progress from s has fetched s sps actual download from s failed at s sps s diff expected actual download progress from s has fetched s sps download from s failed at s sps s test testdownloadbodyerror downloader test go error trace downloader test go error not equal expected download progress from s has fetched s sps actual download from s failed at s sps s diff expected actual download progress from s has fetched s sps download from s failed at s sps s test testdownloadbodyerror fail testdownloadbodyerror steps errors expand to view the steps failures go unittest took min sec view more details a href description mage unittest checks if running on a unix like node took min sec view more details a href description script returned exit code
0
294,536
22,154,950,655
IssuesEvent
2022-06-03 21:17:37
cyberartclub/cyberartclub-html-site
https://api.github.com/repos/cyberartclub/cyberartclub-html-site
closed
First build
documentation
First build tasks left: - [ ] #5 - [ ] #6 - [ ] #7 - [ ] #8 - [ ] #9 - [ ] #10 - [ ] #11 - [ ] #12 - [ ] #13 - [ ] #14 - [ ] #15 - [ ] #16 - [ ] #17 - [ ] #18 _Originally posted by @cyberartclub in https://github.com/cyberartclub/cyberartclub-html-site/issues/2#issuecomment-1144039106_
1.0
First build - First build tasks left: - [ ] #5 - [ ] #6 - [ ] #7 - [ ] #8 - [ ] #9 - [ ] #10 - [ ] #11 - [ ] #12 - [ ] #13 - [ ] #14 - [ ] #15 - [ ] #16 - [ ] #17 - [ ] #18 _Originally posted by @cyberartclub in https://github.com/cyberartclub/cyberartclub-html-site/issues/2#issuecomment-1144039106_
non_priority
first build first build tasks left originally posted by cyberartclub in
0
162,439
25,538,282,923
IssuesEvent
2022-11-29 13:37:32
ProjectSidewalk/SidewalkWebpage
https://api.github.com/repos/ProjectSidewalk/SidewalkWebpage
opened
One digit of precision accuracy in sidebar
Easy Fix UI Design Gamified Sidebar
Don't need two digits of precision for accuracy. Let's just have %.1f for float ![image](https://user-images.githubusercontent.com/1621749/204543149-4fc4c6bb-6a54-4835-b3a1-ce38a808b3dd.png)
1.0
One digit of precision accuracy in sidebar - Don't need two digits of precision for accuracy. Let's just have %.1f for float ![image](https://user-images.githubusercontent.com/1621749/204543149-4fc4c6bb-6a54-4835-b3a1-ce38a808b3dd.png)
non_priority
one digit of precision accuracy in sidebar don t need two digits of precision for accuracy let s just have for float
0
163,424
20,363,752,131
IssuesEvent
2022-02-21 01:23:35
howlr-me/howlr-front
https://api.github.com/repos/howlr-me/howlr-front
opened
CVE-2021-27515 (Medium) detected in url-parse-1.4.7.tgz
security vulnerability
## CVE-2021-27515 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary> <p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p> <p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p> <p>Path to dependency file: /howlr-front/package.json</p> <p>Path to vulnerable library: /node_modules/url-parse/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.0.1.tgz (Root Library) - react-dev-utils-9.0.1.tgz - sockjs-client-1.3.0.tgz - :x: **url-parse-1.4.7.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> url-parse before 1.5.0 mishandles certain uses of backslash such as http:\/ and interprets the URI as a relative path. <p>Publish Date: 2021-02-22 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27515>CVE-2021-27515</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27515">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27515</a></p> <p>Release Date: 2021-02-22</p> <p>Fix Resolution (url-parse): 1.5.0</p> <p>Direct dependency fix Resolution (react-scripts): 3.1.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-27515 (Medium) detected in url-parse-1.4.7.tgz - ## CVE-2021-27515 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary> <p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p> <p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p> <p>Path to dependency file: /howlr-front/package.json</p> <p>Path to vulnerable library: /node_modules/url-parse/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.0.1.tgz (Root Library) - react-dev-utils-9.0.1.tgz - sockjs-client-1.3.0.tgz - :x: **url-parse-1.4.7.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> url-parse before 1.5.0 mishandles certain uses of backslash such as http:\/ and interprets the URI as a relative path. <p>Publish Date: 2021-02-22 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27515>CVE-2021-27515</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27515">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27515</a></p> <p>Release Date: 2021-02-22</p> <p>Fix Resolution (url-parse): 1.5.0</p> <p>Direct dependency fix Resolution (react-scripts): 3.1.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve medium detected in url parse tgz cve medium severity vulnerability vulnerable library url parse tgz small footprint url parser that works seamlessly across node js and browser environments library home page a href path to dependency file howlr front package json path to vulnerable library node modules url parse package json dependency hierarchy react scripts tgz root library react dev utils tgz sockjs client tgz x url parse tgz vulnerable library vulnerability details url parse before mishandles certain uses of backslash such as http and interprets the uri as a relative path publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution url parse direct dependency fix resolution react scripts step up your open source security game with whitesource
0
120,344
17,644,176,951
IssuesEvent
2021-08-20 01:52:53
SmartBear/soapui
https://api.github.com/repos/SmartBear/soapui
closed
CVE-2014-3623 (Medium) detected in wss4j-1.6.16.jar - autoclosed
security vulnerability
## CVE-2014-3623 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>wss4j-1.6.16.jar</b></p></summary> <p>The Apache WSS4J project provides a Java implementation of the primary security standards for Web Services, namely the OASIS Web Services Security (WS-Security) specifications from the OASIS Web Services Security TC.</p> <p>Library home page: <a href="http://ws.apache.org/wss4j/">http://ws.apache.org/wss4j/</a></p> <p>Path to dependency file: soapui/soapui/pom.xml</p> <p>Path to vulnerable library: canner/.m2/repository/org/apache/ws/security/wss4j/1.6.16/wss4j-1.6.16.jar</p> <p> Dependency Hierarchy: - :x: **wss4j-1.6.16.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/SmartBear/soapui/commit/3fc3395705d03e968d0c41c6466e63c1cd280a4e">3fc3395705d03e968d0c41c6466e63c1cd280a4e</a></p> <p>Found in base branch: <b>next</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache WSS4J before 1.6.17 and 2.x before 2.0.2, as used in Apache CXF 2.7.x before 2.7.13 and 3.0.x before 3.0.2, when using TransportBinding, does not properly enforce the SAML SubjectConfirmation method security semantics, which allows remote attackers to conduct spoofing attacks via unspecified vectors. 
<p>Publish Date: 2014-10-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-3623>CVE-2014-3623</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-3623">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-3623</a></p> <p>Release Date: 2014-10-30</p> <p>Fix Resolution: org.apache.wss4j:wss4j-ws-security-stax:2.0.3,org.apache.wss4j:wss4j-ws-security-dom:2.0.3,org.apache.ws.security:wss4j:2.0.3</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.ws.security","packageName":"wss4j","packageVersion":"1.6.16","packageFilePaths":["/soapui/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.ws.security:wss4j:1.6.16","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.wss4j:wss4j-ws-security-stax:2.0.3,org.apache.wss4j:wss4j-ws-security-dom:2.0.3,org.apache.ws.security:wss4j:2.0.3"}],"baseBranches":["next"],"vulnerabilityIdentifier":"CVE-2014-3623","vulnerabilityDetails":"Apache WSS4J before 1.6.17 and 
2.x before 2.0.2, as used in Apache CXF 2.7.x before 2.7.13 and 3.0.x before 3.0.2, when using TransportBinding, does not properly enforce the SAML SubjectConfirmation method security semantics, which allows remote attackers to conduct spoofing attacks via unspecified vectors.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-3623","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"N/A","AC":"N/A","PR":"N/A","S":"N/A","C":"N/A","UI":"N/A","AV":"N/A","I":"N/A"},"extraData":{}}</REMEDIATE> -->
True
CVE-2014-3623 (Medium) detected in wss4j-1.6.16.jar - autoclosed - ## CVE-2014-3623 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>wss4j-1.6.16.jar</b></p></summary> <p>The Apache WSS4J project provides a Java implementation of the primary security standards for Web Services, namely the OASIS Web Services Security (WS-Security) specifications from the OASIS Web Services Security TC.</p> <p>Library home page: <a href="http://ws.apache.org/wss4j/">http://ws.apache.org/wss4j/</a></p> <p>Path to dependency file: soapui/soapui/pom.xml</p> <p>Path to vulnerable library: canner/.m2/repository/org/apache/ws/security/wss4j/1.6.16/wss4j-1.6.16.jar</p> <p> Dependency Hierarchy: - :x: **wss4j-1.6.16.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/SmartBear/soapui/commit/3fc3395705d03e968d0c41c6466e63c1cd280a4e">3fc3395705d03e968d0c41c6466e63c1cd280a4e</a></p> <p>Found in base branch: <b>next</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache WSS4J before 1.6.17 and 2.x before 2.0.2, as used in Apache CXF 2.7.x before 2.7.13 and 3.0.x before 3.0.2, when using TransportBinding, does not properly enforce the SAML SubjectConfirmation method security semantics, which allows remote attackers to conduct spoofing attacks via unspecified vectors. 
<p>Publish Date: 2014-10-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-3623>CVE-2014-3623</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-3623">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-3623</a></p> <p>Release Date: 2014-10-30</p> <p>Fix Resolution: org.apache.wss4j:wss4j-ws-security-stax:2.0.3,org.apache.wss4j:wss4j-ws-security-dom:2.0.3,org.apache.ws.security:wss4j:2.0.3</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.ws.security","packageName":"wss4j","packageVersion":"1.6.16","packageFilePaths":["/soapui/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.ws.security:wss4j:1.6.16","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.wss4j:wss4j-ws-security-stax:2.0.3,org.apache.wss4j:wss4j-ws-security-dom:2.0.3,org.apache.ws.security:wss4j:2.0.3"}],"baseBranches":["next"],"vulnerabilityIdentifier":"CVE-2014-3623","vulnerabilityDetails":"Apache WSS4J before 1.6.17 and 
2.x before 2.0.2, as used in Apache CXF 2.7.x before 2.7.13 and 3.0.x before 3.0.2, when using TransportBinding, does not properly enforce the SAML SubjectConfirmation method security semantics, which allows remote attackers to conduct spoofing attacks via unspecified vectors.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-3623","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"N/A","AC":"N/A","PR":"N/A","S":"N/A","C":"N/A","UI":"N/A","AV":"N/A","I":"N/A"},"extraData":{}}</REMEDIATE> -->
non_priority
cve medium detected in jar autoclosed cve medium severity vulnerability vulnerable library jar the apache project provides a java implementation of the primary security standards for web services namely the oasis web services security ws security specifications from the oasis web services security tc library home page a href path to dependency file soapui soapui pom xml path to vulnerable library canner repository org apache ws security jar dependency hierarchy x jar vulnerable library found in head commit a href found in base branch next vulnerability details apache before and x before as used in apache cxf x before and x before when using transportbinding does not properly enforce the saml subjectconfirmation method security semantics which allows remote attackers to conduct spoofing attacks via unspecified vectors publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache ws security stax org apache ws security dom org apache ws security check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree org apache ws security isminimumfixversionavailable true minimumfixversion org apache ws security stax org apache ws security dom org apache ws security basebranches vulnerabilityidentifier cve vulnerabilitydetails apache before and x before as used in apache cxf x before and x before when using transportbinding does not properly enforce the saml subjectconfirmation method security semantics which allows remote attackers to conduct spoofing attacks via unspecified vectors vulnerabilityurl
0
76,335
14,597,714,908
IssuesEvent
2020-12-20 21:23:13
joomla/joomla-cms
https://api.github.com/repos/joomla/joomla-cms
closed
[4.0] Notice in user account rendering tabs
No Code Attached Yet bug
### Steps to reproduce the issue Go to backend, edit user account. Notice: Undefined index: Joomla\CMS\HTML\Helpers\Bootstrap::startTabSet in libraries\src\HTML\Helpers\Bootstrap.php on line 537
1.0
[4.0] Notice in user account rendering tabs - ### Steps to reproduce the issue Go to backend, edit user account. Notice: Undefined index: Joomla\CMS\HTML\Helpers\Bootstrap::startTabSet in libraries\src\HTML\Helpers\Bootstrap.php on line 537
non_priority
notice in user account rendering tabs steps to reproduce the issue go to backend edit user accout notice undefined index joomla cms html helpers bootstrap starttabset in libraries src html helpers bootstrap php on line
0