Unnamed: 0 int64 1 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 7 112 | repo_url stringlengths 36 141 | action stringclasses 3 values | title stringlengths 3 438 | labels stringlengths 4 308 | body stringlengths 7 254k | index stringclasses 7 values | text_combine stringlengths 96 254k | label stringclasses 2 values | text stringlengths 96 246k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
663,854 | 22,208,950,824 | IssuesEvent | 2022-06-07 17:16:54 | ooni/probe | https://api.github.com/repos/ooni/probe | closed | mobile: explore sharing code using Kotlin Native | ooni/probe-mobile priority/low | Kotlin Native seems great. We should figure out how practical it is to use Kotlin Native to share code between the Android and iOS app. It's also worth asking the question of whether some CLI code could be in Kotlin Native (seems harder). If we're able to share more code with Kotlin Native, then we reduce the duplication in our codebase. | 1.0 | non_main | 0 |
5,871 | 5,195,844,048 | IssuesEvent | 2017-01-23 10:44:54 | IDotD/Userscript | https://api.github.com/repos/IDotD/Userscript | opened | Massive performance problems of the Tiers tab | Bug Performance | I noticed there is quite a performance problem when using the Tiers tab to search for the tiers of a raid. As soon as something is entered into the bossname field, performance drops to almost nothing.
When the raid is found, for example by entering 'horth' for Horthania, it throws 1000 and 200 error messages of the type:
`[IDotDS] TypeError: idrinth.tier[listKey] is undefined`
The first time one tries to open the tierbox of the raid (in this example, Horthania), it throws 100 errors of the same type as above and the tierbox won't show up.
On subsequent tries, it always shows up properly without error messages.
Entering 'horg' for Horgrak will throw lots of errors like:
```
[IDotDS] TypeError: idrinth.tier.list[listKey].epics[difficulty] is undefined Ardenian:1358:9
[IDotDS] TypeError: idrinth.tier[listKey] is undefined
```
and Horgrak won't show up as a raid boss in the Tiers tab.
When entering a single character, like 'a', the browser will freeze and Firefox will eventually ask whether it should stop the script. | True | non_main | 0 |
1,133 | 4,998,450,274 | IssuesEvent | 2016-12-09 19:53:36 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | docker_container sends volume to docker API erroneously | affects_2.2 bug_report cloud docker waiting_on_maintainer | <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
##### COMPONENT NAME
<!--- Name of the plugin/module/task -->
docker_container
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.2.0 (devel 3874e653c1)
```
##### CONFIGURATION
<!---
Mention any settings you have changed/added/removed in ansible.cfg
(or using the ANSIBLE_* environment variables).
-->
##### OS / ENVIRONMENT
<!---
Mention the OS you are running Ansible from, and the OS you are
managing, or say “N/A” for anything that is not platform-specific.
-->
N/A
##### SUMMARY
<!--- Explain the problem briefly -->
docker_container erroneously sends `volume` in addition to `binds` to docker API during create container call.
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: assert state of my container
docker_container:
name: mc
image: "my_container:1.0"
command: "execute"
state: started
restart_policy: always
exposed_ports:
- 12345
network_mode: host
detach: True
volumes:
- "/a/b:/a/b"
log_driver: "json-file"
log_options:
max-size: "10m"
max-file: "10"
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
Docker daemon logs w/ debug level
```
DEBU[0111] Calling POST /v1.23/containers/create?name=mc
DEBU[0111] form data: {"AttachStderr":false,"AttachStdin":false,"AttachStdout":false,"Cmd":["execute"],"Env":[],"ExposedPorts":{"12345/tcp":{}},"HostConfig":{"Binds":["/a/b:/a/b:rw"],"LogConfig":{"Config":{"max-file":"10","max-size":"10m"},"Type":"json-file"},"Memory":0,"NetworkMode":"host","ReadonlyRootfs":false,"RestartPolicy":{"MaximumRetryCount":null,"Name":"always"}},"Image":"my_container:1.0","NetworkDisabled":false,"OpenStdin":false,"StdinOnce":false,"Tty":false,"Volumes":{}}
```
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes below -->
```
DEBU[0111] Calling POST /v1.23/containers/create?name=mc
DEBU[0111] form data: {"AttachStderr":false,"AttachStdin":false,"AttachStdout":false,"Cmd":["execute"],"Env":[],"ExposedPorts":{"12345/tcp":{}},"HostConfig":{"Binds":["/a/b:/a/b:rw"],"LogConfig":{"Config":{"max-file":"10","max-size":"10m"},"Type":"json-file"},"Memory":0,"NetworkMode":"host","ReadonlyRootfs":false,"RestartPolicy":{"MaximumRetryCount":null,"Name":"always"}},"Image":"my_container:1.0","NetworkDisabled":false,"OpenStdin":false,"StdinOnce":false,"Tty":false,"Volumes":{"/a/b":{}}}
```
When the same task (adapted accordingly) is run using the deprecated docker module, we see the volume is not sent:
```
DEBU[0328] Calling POST /v1.22/containers/create?name=mc
DEBU[0328] form data: {"AttachStderr":false,"AttachStdin":false,"AttachStdout":false,"Cmd":["execute"],"CpuShares":0,"Env":[],"ExposedPorts":{"12345/tcp":{}},"HostConfig":{"Binds":["/a/b:/a/b:rw"],"LogConfig":{"Config":{"max-file":"10","max-size":"10m"},"Type":"json-file"},"Memory":0,"NetworkMode":"host","RestartPolicy":{"Name":"always"}},"Image":"my_container:1.0","Labels":{},"NetworkDisabled":false,"OpenStdin":false,"StdinOnce":false,"Tty":false,"Volumes":{}}
```
Note that pinning the API version to 1.22 on `docker_container` made no difference; it still sends the volume as part of the request. Based on the thread below, it seems that the volume shouldn't be sent in this scenario:
https://github.com/docker/docker/issues/2949#issuecomment-230883544
If my understanding is accurate, can this be looked into? Thanks.
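As an illustration only, the discrepancy between the two logged requests can be isolated by diffing their `Volumes` objects (the payloads below are abbreviated from the debug logs above; all fields except `Volumes` and `HostConfig.Binds` are omitted):

```python
import json

# Abbreviated form-data payloads taken from the debug logs above.
expected = json.loads('{"HostConfig": {"Binds": ["/a/b:/a/b:rw"]}, "Volumes": {}}')
actual = json.loads('{"HostConfig": {"Binds": ["/a/b:/a/b:rw"]}, "Volumes": {"/a/b": {}}}')

# The bind mount is identical in both requests...
assert expected["HostConfig"]["Binds"] == actual["HostConfig"]["Binds"]

# ...but docker_container also populates Volumes with the container path,
# duplicating information already carried by HostConfig.Binds.
extra_volumes = set(actual["Volumes"]) - set(expected["Volumes"])
print(extra_volumes)  # {'/a/b'}
```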
| True | main | 1 |
4,628 | 23,980,900,947 | IssuesEvent | 2022-09-13 14:59:00 | exercism/python | https://api.github.com/repos/exercism/python | closed | [New Concept Exercise] : dict-methods | x:action/create x:knowledge/advanced x:module/concept-exercise x:status/claimed x:size/large claimed 🐾 maintainer action required❕ new exercise ✨ | This issue describes how to implement the `dict-methods` (dictionary methods) concept exercise for the python track.
## Getting started
**Please please please read the docs before starting.** Posting PRs without reading these docs will be a lot more frustrating for you during the review cycle, and exhaust Exercism's maintainers' time. So, before diving into the implementation, please read up on the following documents:
- [Contributing to Exercism](https://exercism.org/docs/building) | [Exercism and GitHub](https://exercism.org/docs/building/github) | [Contributor Pull Request Guide](https://exercism.org/docs/building/github/contributors-pull-request-guide)
- [What are those Weird Task Tags about?](https://exercism.org/docs/building/product/tasks)
- [Building Language Tracks: An Overview](https://exercism.org/docs/building/tracks)
- [What are Concepts?](https://exercism.org/docs/building/tracks/concepts)
- [Concept Exercise Specifications](https://exercism.org/docs/building/tracks/concept-exercises)
- [Concept Specifications](https://exercism.org/docs/building/tracks/concepts)
- [Exercism Formatting and Style Guide](https://exercism.org/docs/building/markdown/style-guide)
- [Exercism Markdown Specification](https://exercism.org/docs/building/markdown/markdown)
- [Reputation](https://exercism.org/docs/using/product/reputation)
## Goal
This concept exercise is meant to teach an understanding/use of useful `dict-methods` (methods from the `dict` - dictionary class) and some additional practice/techniques for working with `dicts`.
## Learning objectives
Cover useful `dict` methods and a few techniques for operating on/manipulating `dicts`.
- `dict` methods :
- `get()`
- `clear()`
- `popitem()`
- `update()`
- *classmethod* [`dict.fromkeys(iterable[, value])`](https://docs.python.org/3/library/stdtypes.html#dict.fromkeys)
- Python 3.9's union operators (`d | other`, `d |= other`) - **note:** may not be included; decision still TBD.
- Working more with the `dict` views `items()`, `keys()`, and `values()` (e.g. by sorting information using `sorted()`, or by swapping `keys` and `values`).
- Knowing that dictionaries can be _nested_ -- _e.g._ a 'dictionary of dictionaries'.
- Considerations when using `update()` or the union operator with dictionaries.
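The methods above can be sketched as follows (the "inventory" names are illustrative only and are not part of the exercise):

```python
# Illustrative sketch of the dict methods listed above.
inventory = {"wood": 4, "iron": 2}

# get() returns a default instead of raising KeyError for a missing key.
wood = inventory.get("wood", 0)   # 4
gold = inventory.get("gold", 0)   # 0

# dict.fromkeys() builds a new dict from an iterable, all keys mapped to one value.
empty_bins = dict.fromkeys(["wood", "iron", "gold"], 0)

# update() merges another mapping in place; existing keys are overwritten.
inventory.update({"iron": 5, "gold": 1})

# Python 3.9+: d | other returns a merged dict (right-hand values win on
# conflicts), and d |= other merges in place.
merged = empty_bins | inventory

# The views keys(), values(), and items() can be sorted, or used to swap
# keys and values (later keys overwrite earlier ones on duplicate values).
ordered = {}
for key in sorted(merged.keys()):
    ordered[key] = merged[key]

swapped = {}
for key, value in merged.items():
    swapped[value] = key

# popitem() removes and returns the last-inserted (key, value) pair.
last_pair = inventory.popitem()   # ("gold", 1)

# Dictionaries can be nested -- a dictionary of dictionaries.
warehouse = {"aisle_1": {"wood": 4}, "aisle_2": {"iron": 5}}

# clear() empties a dict in place.
inventory.clear()
```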
## Out of scope
Please take a look at the `dicts` concept exercise [design.md file](https://github.com/exercism/python/edit/main/exercises/concept/inventory-management/.meta/design.md) for `dict` features taught thus far. While those methods can be used for solutions to this exercise, it isn't necessary to cover them again in detail. Additionally, the following is out of scope:
- Dictionary comprehensions
- Built-in functions as they relate to this data structure (*e.g.* `len()` or `enumerate()`)
- Considerations of Mutability
- `copy()` vs `deepcopy()`
- Memory and performance characteristics.
- Related `collections` module with `Counter()` and `defaultdict()`
## Concepts
- `dicts`
- `dict-methods`
## Prerequisites
These are the concepts/concept exercises the student needs to complete/understand before solving this concept exercise.
- `basics`
- `bools`
- `conditionals`
- `comparisons`
- `dicts`
- `lists`
- `loops`
- `numbers`
- `strings`
- `tuples`
## Resources to refer to
- [Python docs: Tutorial - Dictionaries](https://docs.python.org/3/tutorial/datastructures.html#dictionaries)
- [Python docs: Mapping Type `dict`](https://docs.python.org/3/library/stdtypes.html#mapping-types-dict)
- [Real Python: Dicts](https://realpython.com/python-dicts/)
- [Digital Ocean: Understanding dictionaries in python 3](https://www.digitalocean.com/community/tutorials/understanding-dictionaries-in-python-3)
- [Stack Overflow: exchanging keys with values in a `dict` in Python](https://stackoverflow.com/questions/1031851/how-do-i-exchange-keys-with-values-in-a-dictionary)
- [kite: how to sort a dictionary by key in python](https://www.kite.com/python/answers/how-to-sort-a-dictionary-by-key-in-python)
- [medium: 16 Python Dictionary Tips](https://medium.com/python-in-plain-english/16-intermediate-level-python-dictionary-tips-tricks-and-shortcuts-1376859e1adc) _**note:** this is a good resource for ideas when writing this exercise, but it is a subscription-based service, so not the best for linking to_
* ### Hints
For more information on writing hints see [hints](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-docshintsmd)
* You can refer to one or more of the resources linked above, or analogous resources from a trusted source. We prefer using links within the [Python Docs](https://docs.python.org/3/) as the primary go-to, but other resources listed above are also good. Please try to avoid paid or subscription-based links if possible.
* ### `links.json`
For more information, see [concept links file](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-linksjson)
* The same resources listed in this issue can be used as a starting point for the [ `concepts/links.json`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-linksjson) file, if it doesn't already exist.
* If there are particularly good/interesting information sources for this concept that extend or supplement the concept exercise material & the resources already listed -- please add them to the `links.json` document.
## Concept Description
Please see the following for more details on these files: [concepts](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-linksjson) & [concept exercises](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md)
* ### Concept `about.md`
**Concept file/issue**: There is a PR pending for this concept that includes `about.md`, `introduction.md`, and `links.json` : [PR 2347](https://github.com/exercism/python/pull/2347). If the PR has been merged, the files can be found under `python/concepts/dict-methods/`
For more information, see [Concept `about.md`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-aboutmd)
- This file provides information about this concept for a student who has completed the corresponding concept exercise. It is intended as a reference for continued learning.
* ### Concept `introduction.md`
For more information, see [Concept `introduction.md`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-introductionmd)
* This can also be a summary/paraphrase of the document listed above, and will provide a brief introduction of the concept for a student who has **not yet** completed the concept exercise. It should contain a good summation of the concept, but not go into lots of detail.
* ### Exercise `introduction.md`
For more information, see [Exercise `introduction.md`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-docsintroductionmd)
* This should also summarize/paraphrase the above document, but with enough information and examples for the student to complete the tasks outlined in this concept exercise.
## Test-runner
No changes required to the [Python Test Runner](https://github.com/exercism/python-test-runner) at this time.
## Representer
No changes required to the [Python Representer](https://github.com/exercism/python-representer/) at this time.
## Analyzer
No changes required to the [Python Analyzer](https://github.com/exercism/python-analyzer/) at this time.
## Exercise Metadata - Track
For more information on concept exercises and formatting for the Python track `config.json` , please see [concept exercise metadata](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#metadata). The track `config.json` file can be found in the root of the Python repo.
You can use the below for the exercise **UUID**. You can also generate a new one via [exercism configlet](https://github.com/exercism/configlet), [uuidgenerator.net](https://www.uuidgenerator.net/version4), or any other favorite method. The UUID must be a valid [**V4 UUID**](https://en.wikipedia.org/wiki/Universally_unique_identifier).
* **Exercise UUID** : `1f67781e-07a3-4d00-a49f-59493e7b7f2c`
* **concepts** should be filled in from the Concepts section in this issue
* **prerequisites** should be filled in from the Prerequisites section in this issue
## Exercise Metadata Files Under `.meta/config.json`
For more information on exercise `.meta/` files and formatting, see [concept exercise metadata files](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#metadata-files)
* `.meta/config.json` - see [this link](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-metaconfigjson) for the fields and formatting of this file.
* `.meta/design.md` - see [this link](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-metadesignmd) for the formatting of this file. Please use the **Goal**, **Learning Objectives**,**Concepts**, **Prerequisites** and , **Out of Scope** sections from this issue.
## Implementation Notes
Code in the `.meta/exemplar.py` file should **only use syntax & concepts introduced in this exercise or one of its prerequisite exercises.**
Please **do not use** comprehensions, generator expressions, or other syntax not previously covered. Please also follow [PEP8](https://www.python.org/dev/peps/pep-0008/) guidelines.
In general, tests should be written using `unittest.TestCase` and the test file should be named `<EXERCISE-NAME>_test.py`.
While we do use [PyTest](https://docs.pytest.org/en/stable/) as our test runner and for some implementation tests, please check with a maintainer before using a PyTest test method, fixture, or feature.
Our markdown and JSON files are checked against [prettier](https://prettier.io/) . We recommend [setting prettier up locally](https://prettier.io/docs/en/install.html) and running it prior to submitting your PR to avoid any CI errors.
## Help
If you have any questions while implementing the exercise, please post the questions as comments in this issue, or contact one of the maintainers on our Slack channel. | True | [New Concept Exercise] : dict-methods - This issue describes how to implement the `dict-methods` (dictionary methods) concept exercise for the python track.
## Getting started
**Please please please read the docs before starting.** Posting PRs without reading these docs will be a lot more frustrating for you during the review cycle, and exhaust Exercism's maintainers' time. So, before diving into the implementation, please read up on the following documents:
- [Contributing to Exercism](https://exercism.org/docs/building) | [Exercism and GitHub](https://exercism.org/docs/building/github) | [Contributor Pull Request Guide](https://exercism.org/docs/building/github/contributors-pull-request-guide)
- [What are those Weird Task Tags about?](https://exercism.org/docs/building/product/tasks)
- [Building Language Tracks: An Overview](https://exercism.org/docs/building/tracks)
- [What are Concepts?](https://exercism.org/docs/building/tracks/concepts)
- [Concept Exercise Specifications](https://exercism.org/docs/building/tracks/concept-exercises)
- [Concept Specifications](https://exercism.org/docs/building/tracks/concepts)
- [Exercism Formatting and Style Guide](https://exercism.org/docs/building/markdown/style-guide)
- [Exercism Markdown Specification](https://exercism.org/docs/building/markdown/markdown)
- [Reputation](https://exercism.org/docs/using/product/reputation)
## Goal
This concept exercise is meant to teach an understanding/use of useful `dict-methods` (methods from the `dict` - dictionary class) and some additional practice/techniques for working with `dicts`.
## Learning objectives
Cover useful `dict` methods and a few techniques for operating on/manipulating `dicts`.
- `dict` methods :
- `get()`
- `clear()`
- `popitem()`
- `update()`
- *classmethod*[ `fromkeys`(*iterable*[, *value*])](https://docs.python.org/3/library/stdtypes.html#dict.fromkeys)
- Pythong 3.9 pipe or union operator (`d | other`, `d |= other`) - **note:** may not be included. Decision still TBD.
- Working more with the `dict` views `items()` , `keys()` or `values()`. (e.g, by sorting information using `sorted()` or by swapping `keys` and `values`, etc.)
- Knowing that Dictionaries can be _nested_, _-- e.g._ ' a dictionary of dictionaries'.
- Considerations when `updating()` or using `union` with dictionaries.
## Out of scope
Please take a look at the `dicts` concept exercise [design.md file](https://github.com/exercism/python/edit/main/exercises/concept/inventory-management/.meta/design.md) for `dict` features taught thus far. While those methods can be used for solutions to this exercise, it isn't necessary to cover them again in detail. Additionally, the following is out of scope:
- Dictionary comprehensions
- Built-in functions as they relate to this data structure (*e.g.* `len()`, or `enumerate()`
- Considerations of Mutability
- `copy()` vs `deepcopy()`
- Memory and performance characteristics.
- Related `collections` module with `Counter()` and `defaultdict()`
## Concepts
- `dicts`
- `dict-methods`
## Prerequisites
These are the concepts/concept exercises the student needs to complete/understand before solving this concept exercise.
- `basics`
- `bools`
- `conditionals`
- `comparisons`
- `dicts`
- `lists`
- `loops`
- `numbers`
- `strings`
- `tuples`
## Resources to refer to
- [Python docs: Tutorial - Dictionaries](https://docs.python.org/3/tutorial/datastructures.html#dictionaries)
- [Python docs: Mapping Type `dict`](https://docs.python.org/3/library/stdtypes.html#mapping-types-dict)
- [Real Python: Dicts](https://realpython.com/python-dicts/)
- [Digital Ocean: Understanding dictionaries in python 3](https://www.digitalocean.com/community/tutorials/understanding-dictionaries-in-python-3)
- [Stack Overflow: exchanging keys with values in a `dict` in Python](https://stackoverflow.com/questions/1031851/how-do-i-exchange-keys-with-values-in-a-dictionary)
- [kite: how to sort a dictionary by key in python](https://www.kite.com/python/answers/how-to-sort-a-dictionary-by-key-in-python)
- [medium: 16 Python Dictionary Tips](https://medium.com/python-in-plain-english/16-intermediate-level-python-dictionary-tips-tricks-and-shortcuts-1376859e1adc) _**note:** this is a good resource for ideas and writing this exericse, but is a subscription-based service, so not the best for linking to_
* ### Hints
For more information on writing hints see [hints](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-docshintsmd)
* You can refer to one or more of the resources linked above, or analogous resources from a trusted source. We prefer using links within the [Python Docs](https://docs.python.org/3/) as the primary go-to, but other resources listed above are also good. Please try to avoid paid or subscription-based links if possible.
* ### `links.json`
For more information, see [concept links file](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-linksjson)
* The same resources listed in this issue can be used as a starting point for the [ `concepts/links.json`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-linksjson) file, if it doesn't already exist.
* If there are particularly good/interesting information sources for this concept that extend or supplement the concept exercise material & the resources already listed -- please add them to the `links.json` document.
## Concept Description
Please see the following for more details on these files: [concepts](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-linksjson) & [concept exercises](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md)
* ### Concept `about.md`
**Concept file/issue**: There is a PR pending for this concept that includes `about.md`, `introduction.md`, and `links.json` : [PR 2374](https://github.com/exercism/python/pull/2347). If the PR has been merged, the files can be found under `python/concepts/dict-methods/`
For more information, see [Concept `about.md`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-aboutmd)
- This file provides information about this concept for a student who has completed the corresponding concept exercise. It is intended as a reference for continued learning.
* ### Concept `introduction.md`
For more information, see [Concept `introduction.md`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concepts.md#file-introductionmd)
* This can also be a summary/paraphrase of the document listed above, and will provide a brief introduction of the concept for a student who has **not yet** completed the concept exercise. It should contain a good summation of the concept, but not go into lots of detail.
* ### Exercise `introduction.md`
For more information, see [Exercise `introduction.md`](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-docsintroductionmd)
* This should also summarize/paraphrase the above document, but with enough information and examples for the student to complete the tasks outlined in this concept exercise.
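For quick reference while drafting these documents, the methods this concept targets behave as follows (standard-library behavior; the `|` operator needs Python 3.9+):

```python
inventory = {"apples": 3, "pears": 1}

# get() avoids a KeyError by returning a default for missing keys
assert inventory.get("plums", 0) == 0

# update() merges another mapping in place
inventory.update({"pears": 5, "plums": 2})
assert inventory == {"apples": 3, "pears": 5, "plums": 2}

# pop() removes a key and returns its value
assert inventory.pop("apples") == 3

# popitem() removes and returns the most recently inserted pair (3.7+)
assert inventory.popitem() == ("plums", 2)

# The union operator builds a new dict without mutating either operand (3.9+)
merged = {"a": 1} | {"b": 2}
assert merged == {"a": 1, "b": 2}
```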
## Test-runner
No changes required to the [Python Test Runner](https://github.com/exercism/python-test-runner) at this time.
## Representer
No changes required to the [Python Representer](https://github.com/exercism/python-representer/) at this time.
## Analyzer
No changes required to the [Python Analyzer](https://github.com/exercism/python-analyzer/) at this time.
## Exercise Metadata - Track
For more information on concept exercises and formatting for the Python track `config.json`, please see [concept exercise metadata](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#metadata). The track `config.json` file can be found in the root of the Python repo.
You can use the below for the exercise **UUID**. You can also generate a new one via [exercism configlet](https://github.com/exercism/configlet), [uuidgenerator.net](https://www.uuidgenerator.net/version4), or any other favorite method. The UUID must be a valid [**V4 UUID**](https://en.wikipedia.org/wiki/Universally_unique_identifier).
* **Exercise UUID** : `1f67781e-07a3-4d00-a49f-59493e7b7f2c`
* **concepts** should be filled in from the Concepts section in this issue
* **prerequisites** should be filled in from the Prerequisites section in this issue
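If configlet or the generator site is not to hand, Python's standard library will also produce a valid V4 UUID:

```python
import uuid

# uuid4() generates a random, spec-compliant version 4 UUID
exercise_uuid = uuid.uuid4()
print(exercise_uuid)  # random each run
assert exercise_uuid.version == 4
assert len(str(exercise_uuid)) == 36  # 32 hex digits + 4 hyphens
```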
## Exercise Metadata Files Under `.meta/config.json`
For more information on exercise `.meta/` files and formatting, see [concept exercise metadata files](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#metadata-files)
* `.meta/config.json` - see [this link](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-metaconfigjson) for the fields and formatting of this file.
* `.meta/design.md` - see [this link](https://github.com/exercism/docs/blob/main/anatomy/tracks/concept-exercises.md#file-metadesignmd) for the formatting of this file. Please use the **Goal**, **Learning Objectives**, **Concepts**, **Prerequisites**, and **Out of Scope** sections from this issue.
## Implementation Notes
Code in the `.meta/exemplar.py` file should **only use syntax & concepts introduced in this exercise or one of its prerequisite exercises.**
Please **do not use** comprehensions, generator expressions, or other syntax not previously covered. Please also follow [PEP8](https://www.python.org/dev/peps/pep-0008/) guidelines.
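For example, where a solution might reach for a dict comprehension, the exemplar would spell out the loop instead (a hypothetical snippet, not from the exercise):

```python
# A comprehension like {k: v * 2 for k, v in data.items()} is off-limits
# in the exemplar; the equivalent explicit loop is allowed:
data = {"a": 1, "b": 2}
doubled = {}
for key, value in data.items():
    doubled[key] = value * 2
print(doubled)  # {'a': 2, 'b': 4}
```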
In general, tests should be written using `unittest.TestCase` and the test file should be named `<EXERCISE-NAME>_test.py`.
While we do use [PyTest](https://docs.pytest.org/en/stable/) as our test runner and for some implementation tests, please check with a maintainer before using a PyTest test method, fixture, or feature.
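Shape-wise, a minimal test file looks like this (the function and test names are hypothetical; only the `unittest.TestCase` structure is prescribed):

```python
import unittest

# Illustrative stand-in for the solution module; a real test file would
# instead do:  from <exercise_name> import <function>
def sort_by_key(mapping):
    return dict(sorted(mapping.items()))

class DictMethodsTest(unittest.TestCase):
    def test_sorts_by_key(self):
        self.assertEqual(sort_by_key({"b": 2, "a": 1}), {"a": 1, "b": 2})

# Run with:  python -m unittest <EXERCISE-NAME>_test.py
```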
Our markdown and JSON files are checked against [prettier](https://prettier.io/). We recommend [setting prettier up locally](https://prettier.io/docs/en/install.html) and running it prior to submitting your PR to avoid any CI errors.
## Help
If you have any questions while implementing the exercise, please post the questions as comments in this issue, or contact one of the maintainers on our Slack channel.
3,167 | 12,226,711,661 | IssuesEvent | 2020-05-03 12:12:57 | gfleetwood/asteres | https://api.github.com/repos/gfleetwood/asteres | opened | uber/hypothesis-gufunc (199058779) | Python maintain | https://github.com/uber/hypothesis-gufunc
Extension to hypothesis for testing numpy general universal functions
134,613 | 10,924,444,287 | IssuesEvent | 2019-11-22 10:09:01 | servo/servo | https://api.github.com/repos/servo/servo | closed | Make it so test-tidy with clang-format run on the CI | A-testing | If I understand correctly, test-tidy only run on Linux. We need to figure out how to run the same clang-format version on both linux and windows, so we can activate CPP formatting tests on the CI.
371,658 | 25,959,433,221 | IssuesEvent | 2022-12-18 17:48:24 | BenRogersWPG/VSCode-Transparent-Minimap | https://api.github.com/repos/BenRogersWPG/VSCode-Transparent-Minimap | closed | Update Badge Source on Visual Studio Marketplace | documentation | The badge provider I am using has stopped working, so use new badges.
3,472 | 13,312,468,071 | IssuesEvent | 2020-08-26 09:46:43 | digitalpardoe/isyncit | https://api.github.com/repos/digitalpardoe/isyncit | closed | Show Connected Devices | Improvement No Longer Maintained | Add a menu list that shows all devices that are currently connected via bluetooth (more appropriate replacement for the Apple bluetooth menu bar item).
139,610 | 11,274,790,848 | IssuesEvent | 2020-01-14 19:20:33 | eclipse/openj9 | https://api.github.com/repos/eclipse/openj9 | closed | openjdk8_j9_extended.functional_x86-64_windows Test the sub-option createLayer Segmentation error at j9shr29.dll | test failure | Failure link
------------
https://ci.eclipse.org/openj9/job/Test_openjdk8_j9_extended.functional_x86-64_windows_Nightly/228/tapResults/
testSCCMLTests3_1
Optional info
-------------
Failure output (captured from console output)
---------------------------------------------
```
Testing: Test 181: Test the sub-option createLayer
Test start time: 2019/12/17 23:50:07 Central Standard Time
Running command: "C:/Users/jenkins/workspace/Test_openjdk8_j9_extended.functional_x86-64_windows_Nightly/openjdkbinary/j2sdk-image\\bin\\java" -Xcompressedrefs -Xcompressedrefs -Xjit -Xgcpolicy:gencon -Xshareclasses:name=ShareClassesCMLTests,createLayer -version
Time spent starting: 0 milliseconds
Time spent executing: 359 milliseconds
Test result: FAILED
[ERR] openjdk version "1.8.0_242-internal"
[ERR] OpenJDK Runtime Environment (build 1.8.0_242-internal-jenkins_2019_12_17_21_17-b00)
[ERR] Eclipse OpenJ9 VM (build master-f1717341f, JRE 1.8.0 Windows Server 2012 R2 amd64-64-Bit Compressed References 20191217_241 (JIT enabled, AOT enabled)
[ERR] OpenJ9 - f1717341f
[ERR] OMR - 9989d0c8e
[ERR] JCL - b2a1abf9261 based on jdk8u242-b04)
[ERR] Unhandled exception
[ERR] Type=Segmentation error vmState=0x00000000
[ERR] Windows_ExceptionCode=c0000005 J9Generic_Signal=00000004 ExceptionAddress=00007FFCCBC6B96B ContextFlags=0010005f
[ERR] Handler1=00007FFCCC21CAA0 Handler2=00007FFCCC15C910 InaccessibleReadAddress=000000000E7A5E18
[ERR] RDI=0000000001469A00 RSI=00007FFCCC357A90 RAX=0000000001469A00 RBX=000000000E7A5DF0
[ERR] RCX=0000000000000000 RDX=00007FFCCBCC7B60 R8=0000000000036902 R9=0000000000000000
[ERR] R10=0000000000000060 R11=0000000000000002 R12=000000000EAF2050 R13=0000000000000000
[ERR] R14=0000000000000000 R15=000000000E84E4E0
[ERR] RIP=00007FFCCBC6B96B RSP=0000000000E7EB70 RBP=0000000001469A00 GS=002B
[ERR] FS=0053 ES=002B DS=002B
[ERR] XMM0 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM1 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM2 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM3 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM4 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM5 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM6 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM7 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM8 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM9 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM10 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM11 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM12 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM13 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM14 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] XMM15 0000000000000000 (f: 0.000000, d: 0.000000e+000)
[ERR] Module=C:\Users\jenkins\workspace\Test_openjdk8_j9_extended.functional_x86-64_windows_Nightly\openjdkbinary\j2sdk-image\jre\bin\compressedrefs\j9shr29.dll
[ERR] Module_base_address=00007FFCCBC40000 Offset_in_DLL=000000000002b96b
[ERR] Target=2_90_20191217_241 (Windows Server 2012 R2 6.3 build 9600)
[ERR] CPU=amd64 (8 logical CPUs) (0x1ffb9c000 RAM)
[ERR] ----------- Stack Backtrace -----------
[ERR] J9VMDllMain+0x2a96b (0x00007FFCCBC6B96B [j9shr29+0x2b96b])
[ERR] J9VMDllMain+0x16cfd (0x00007FFCCBC57CFD [j9shr29+0x17cfd])
[ERR] J9VMDllMain+0x7a39 (0x00007FFCCBC48A39 [j9shr29+0x8a39])
[ERR] J9VMDllMain+0xb52 (0x00007FFCCBC41B52 [j9shr29+0x1b52])
[ERR] J9_CreateJavaVM+0x7e66 (0x00007FFCCC22FD26 [j9vm29+0x8fd26])
[ERR] J9_CreateJavaVM+0xdd80 (0x00007FFCCC235C40 [j9vm29+0x95c40])
[ERR] J9_CreateJavaVM+0x688 (0x00007FFCCC228548 [j9vm29+0x88548])
[ERR] j9port_init_library+0x2b34f (0x00007FFCCC15DD1F [j9prt29+0x2dd1f])
[ERR] J9_CreateJavaVM+0x9f5 (0x00007FFCCC2288B5 [j9vm29+0x888b5])
[ERR] JVM_WaitForReferencePendingList+0x137f (0x00007FFCCC32B38F [jvm+0xb38f])
[ERR] (0x00007FF7FD67275E [java+0x275e])
[ERR] (0x00007FF7FD67D007 [java+0xd007])
[ERR] (0x00007FF7FD67D09B [java+0xd09b])
[ERR] BaseThreadInitThunk+0x22 (0x00007FFCDD6413D2 [KERNEL32+0x13d2])
[ERR] RtlUserThreadStart+0x34 (0x00007FFCDE8854F4 [ntdll+0x154f4])
[ERR] ---------------------------------------
[ERR] JVMDUMP039I Processing dump event "gpf", detail "" at 2019/12/17 23:50:07 - please wait.
[ERR] JVMDUMP013I Processed dump event "gpf", detail "".
>> Success condition was found: [Output match: (java|openjdk) version]
>> Failure condition was not found: [Output match: Failed to start up the shared cache]
>> Failure condition was found: [Output match: Unhandled Exception]
>> Failure condition was not found: [Output match: Exception:]
>> Failure condition was not found: [Output match: corrupt]
>> Failure condition was found: [Output match: Processing dump event]
```
fyi @hangshao0
213,573 | 24,008,836,570 | IssuesEvent | 2022-09-14 16:54:50 | sast-automation-dev/NodeGoat-23 | https://api.github.com/repos/sast-automation-dev/NodeGoat-23 | opened | forever-2.0.0.tgz: 9 vulnerabilities (highest severity is: 9.8) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>forever-2.0.0.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mixin-deep/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2019-10747](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10747) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | detected in multiple dependencies | Transitive | 3.0.0 | ✅ |
| [CVE-2019-10746](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10746) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | mixin-deep-1.3.1.tgz | Transitive | 3.0.0 | ✅ |
| [CVE-2021-37712](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.8.tgz | Transitive | 3.0.0 | ❌ |
| [CVE-2019-20149](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | kind-of-6.0.2.tgz | Transitive | 3.0.0 | ✅ |
| [WS-2018-0148](https://hackerone.com/reports/321701) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | detected in multiple dependencies | Transitive | N/A | ❌ |
| [CVE-2020-7788](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | ini-1.3.5.tgz | Transitive | 3.0.0 | ✅ |
| [CVE-2020-7774](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | y18n-3.2.1.tgz | Transitive | 3.0.0 | ✅ |
| [CVE-2020-7598](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.6 | detected in multiple dependencies | Transitive | 3.0.0 | ❌ |
| [WS-2021-0154](https://github.com/gulpjs/glob-parent/commit/f9231168b0041fea3f8f954b3cceb56269fc6366) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | glob-parent-3.1.0.tgz | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-10747</summary>
### Vulnerable Libraries - <b>set-value-2.0.0.tgz</b>, <b>set-value-0.4.3.tgz</b></p>
<p>
### <b>set-value-2.0.0.tgz</b></p>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz">https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- cache-base-1.0.1.tgz
- :x: **set-value-2.0.0.tgz** (Vulnerable Library)
### <b>set-value-0.4.3.tgz</b></p>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz">https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/union-value/node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- cache-base-1.0.1.tgz
- union-value-1.0.0.tgz
- :x: **set-value-0.4.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
set-value is vulnerable to Prototype Pollution in versions lower than 3.0.1. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using any of the `constructor`, `prototype` and `__proto__` payloads.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10747>CVE-2019-10747</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2019-10-29</p>
<p>Fix Resolution (set-value): 2.0.1</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p><p>Fix Resolution (set-value): 2.0.1</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-10746</summary>
### Vulnerable Library - <b>mixin-deep-1.3.1.tgz</b></p>
<p>Deeply mix the properties of objects into the first object. Like merge-deep, but doesn't clone.</p>
<p>Library home page: <a href="https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz">https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mixin-deep/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- :x: **mixin-deep-1.3.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
mixin-deep is vulnerable to Prototype Pollution in versions before 1.3.2 and version 2.0.0. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10746>CVE-2019-10746</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2019-08-23</p>
<p>Fix Resolution (mixin-deep): 1.3.2</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37712</summary>
### Vulnerable Library - <b>tar-4.4.8.tgz</b></p>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- :x: **tar-4.4.8.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p>
</p>
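The Unicode-normalization collision at the heart of this advisory can be shown in a few lines. This is a sketch of the string-level collision only; node-tar's directory cache logic is more involved.

```javascript
// Two distinct JavaScript strings that normalize to the same sequence: a
// cache keyed on the raw string treats them as two entries, while the
// filesystem treats them as one name.
const composed = 'caf\u00e9';    // "café" with a precomposed é
const decomposed = 'cafe\u0301'; // "café" as e + combining acute accent

const distinctAsKeys = composed !== decomposed;              // true
const sameNormalized =
  composed.normalize('NFC') === decomposed.normalize('NFC'); // true
```

This is why a crafted archive can register a directory under one spelling and then swap in a symlink under the other without invalidating the cached path check.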
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.18</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-20149</summary>
### Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/micromatch/node_modules/kind-of/package.json,/node_modules/base/node_modules/kind-of/package.json,/node_modules/nanomatch/node_modules/kind-of/package.json,/node_modules/define-property/node_modules/kind-of/package.json,/node_modules/extglob/node_modules/kind-of/package.json,/node_modules/snapdragon-node/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-node-2.1.1.tgz
- define-property-1.0.0.tgz
- is-descriptor-1.0.2.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
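The `ctorName` pattern the CVE targets can be re-created in miniature (simplified from the idea in kind-of's `index.js`, not a verbatim copy): trusting `val.constructor.name` lets a plain object impersonate any type.

```javascript
// Simplified re-creation of the vulnerable type check: the constructor
// reference comes from the (attacker-controlled) value itself.
function ctorName(val) {
  return val.constructor ? val.constructor.name : null;
}

const honest = ctorName({});                                    // 'Object'
const spoofed = ctorName({ constructor: { name: 'Symbol' } });  // 'Symbol'
```

Any code branching on the detected type ("is this a Symbol? a Date?") can be steered by the crafted payload.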
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p>
<p>Release Date: 2020-08-24</p>
<p>Fix Resolution (kind-of): 6.0.3</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2018-0148</summary>
### Vulnerable Libraries - <b>utile-0.2.1.tgz</b>, <b>utile-0.3.0.tgz</b></p>
<p>
### <b>utile-0.2.1.tgz</b></p>
<p>A drop-in replacement for `util` with some additional advantageous functions</p>
<p>Library home page: <a href="https://registry.npmjs.org/utile/-/utile-0.2.1.tgz">https://registry.npmjs.org/utile/-/utile-0.2.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/broadway/node_modules/utile/package.json,/node_modules/prompt/node_modules/utile/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- flatiron-0.4.3.tgz
- prompt-0.2.14.tgz
- :x: **utile-0.2.1.tgz** (Vulnerable Library)
### <b>utile-0.3.0.tgz</b></p>
<p>A drop-in replacement for `util` with some additional advantageous functions</p>
<p>Library home page: <a href="https://registry.npmjs.org/utile/-/utile-0.3.0.tgz">https://registry.npmjs.org/utile/-/utile-0.3.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/utile/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- :x: **utile-0.3.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The `utile` npm module, version 0.3.0, allows an attacker to extract sensitive data from uninitialized memory or to cause a DoS by passing in a large number, in setups where typed user input can be passed (e.g. from JSON).
<p>Publish Date: 2018-07-16
<p>URL: <a href=https://hackerone.com/reports/321701>WS-2018-0148</a></p>
</p>
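The underlying bug class is legacy code forwarding a user-supplied number into `new Buffer(n)`, which returned uninitialized memory (and a huge `n` stalls the process). A sketch of the contrast with the modern explicit APIs:

```javascript
// Buffer.allocUnsafe(n) behaves like the legacy new Buffer(n): the bytes
// are whatever was previously on the heap. Buffer.alloc(n) zero-fills.
const risky = Buffer.allocUnsafe(16); // may contain old heap bytes
const safe = Buffer.alloc(16);        // guaranteed zero-filled

const safeIsZeroed = safe.every((byte) => byte === 0); // true
```

Code that must accept a size from typed user input should validate the bound and use the zero-filling form.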
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/WS-2018-0148">https://nvd.nist.gov/vuln/detail/WS-2018-0148</a></p>
<p>Release Date: 2018-01-16</p>
<p>Fix Resolution: JetBrains.Rider.Frontend5 - 212.0.20210826.92917,212.0.20211008.220753</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7788</summary>
### Vulnerable Library - <b>ini-1.3.5.tgz</b></p>
<p>An ini encoder/decoder for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/ini/-/ini-1.3.5.tgz">https://registry.npmjs.org/ini/-/ini-1.3.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/ini/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- nconf-0.10.0.tgz
- :x: **ini-1.3.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context.
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788>CVE-2020-7788</a></p>
</p>
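The pollution path can be sketched with a deliberately naive INI parser (illustrative only, not ini's real code): a `[__proto__]` section header becomes a bracket lookup that resolves to `Object.prototype`, so subsequent keys pollute every object.

```javascript
// Naive INI parser showing the bug class behind CVE-2020-7788.
function naiveIniParse(text) {
  const out = {};
  let section = out;
  for (const rawLine of text.split('\n')) {
    const line = rawLine.trim();
    const header = /^\[(.+)\]$/.exec(line);
    if (header) {
      if (!out[header[1]]) out[header[1]] = {};
      section = out[header[1]]; // '__proto__' -> Object.prototype
      continue;
    }
    const eq = line.indexOf('=');
    if (eq > 0) section[line.slice(0, eq).trim()] = line.slice(eq + 1).trim();
  }
  return out;
}

naiveIniParse('[__proto__]\ninjected = yes');
const leakedIni = {}.injected;     // 'yes'
delete Object.prototype.injected;  // clean up after the demo
```

The 1.3.6 fix filters such section names instead of using them directly as object keys.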
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788</a></p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution (ini): 1.3.6</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7774</summary>
### Vulnerable Library - <b>y18n-3.2.1.tgz</b></p>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz">https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- nconf-0.10.0.tgz
- yargs-3.32.0.tgz
- :x: **y18n-3.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package y18n before 3.2.2, 4.0.1 and 5.0.5. PoC by po6ix: `const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true`
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution (y18n): 3.2.2</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-7598</summary>
### Vulnerable Libraries - <b>minimist-1.2.0.tgz</b>, <b>minimist-0.0.8.tgz</b>, <b>minimist-0.0.10.tgz</b></p>
<p>
### <b>minimist-1.2.0.tgz</b></p>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- rc-1.2.8.tgz
- :x: **minimist-1.2.0.tgz** (Vulnerable Library)
### <b>minimist-0.0.8.tgz</b></p>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- mkdirp-0.5.1.tgz
- :x: **minimist-0.0.8.tgz** (Vulnerable Library)
### <b>minimist-0.0.10.tgz</b></p>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.10.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.10.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- optimist-0.6.1.tgz
- :x: **minimist-0.0.10.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload.
<p>Publish Date: 2020-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p>
</p>
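The argv variant of the same trick can be sketched with a naive dotted-path setter (illustrative only, not minimist's real code): parsers that turn `--a.b value` into a nested write walk user-controlled path segments, so `--__proto__.fromArgv true` lands on `Object.prototype`.

```javascript
// Naive dotted-path assignment of the kind CLI parsers perform.
function setPath(obj, dottedPath, value) {
  const keys = dottedPath.split('.');
  let cur = obj;
  for (const key of keys.slice(0, -1)) {
    if (cur[key] == null) cur[key] = {};
    cur = cur[key]; // '__proto__' resolves to Object.prototype
  }
  cur[keys[keys.length - 1]] = value;
}

const argv = {};
setPath(argv, '__proto__.fromArgv', true); // as if parsed from the CLI
const leakedArgv = {}.fromArgv;            // true
delete Object.prototype.fromArgv;          // clean up after the demo
```

The patched versions block `__proto__` (and `constructor`/`prototype`) as path segments.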
<p></p>
### CVSS 3 Score Details (<b>5.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2020-03-11</p>
<p>Fix Resolution (minimist): 1.2.3</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> WS-2021-0154</summary>
### Vulnerable Library - <b>glob-parent-3.1.0.tgz</b></p>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- :x: **glob-parent-3.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A Regular Expression Denial of Service (ReDoS) vulnerability was found in glob-parent before 5.1.2.
<p>Publish Date: 2021-01-27
<p>URL: <a href=https://github.com/gulpjs/glob-parent/commit/f9231168b0041fea3f8f954b3cceb56269fc6366>WS-2021-0154</a></p>
</p>
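The ReDoS class can be demonstrated with the textbook stand-in pattern (the actual vulnerable expression lived in glob-parent's enclosure-stripping regex, which is not reproduced here): nested quantifiers force exponential backtracking on an input that almost matches.

```javascript
// Classic catastrophic-backtracking pattern: (a+)+ anchored at the end.
const evil = /^(a+)+$/;

// 20 'a's followed by a non-matching character: already on the order of
// 2^20 backtracking paths; each extra 'a' roughly doubles the work, so a
// slightly longer input turns this into a multi-second hang.
const matched = evil.test('a'.repeat(20) + '!'); // false, but slowly
```

A single attacker-supplied string hitting such a pattern is enough to pin a Node event loop, which is why this scores on availability impact only.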
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-01-27</p>
<p>Fix Resolution: glob-parent - 5.1.2</p>
</p>
<p></p>
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | True | forever-2.0.0.tgz: 9 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>forever-2.0.0.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mixin-deep/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2019-10747](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10747) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | detected in multiple dependencies | Transitive | 3.0.0 | ✅ |
| [CVE-2019-10746](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10746) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | mixin-deep-1.3.1.tgz | Transitive | 3.0.0 | ✅ |
| [CVE-2021-37712](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.8.tgz | Transitive | 3.0.0 | ❌ |
| [CVE-2019-20149](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | kind-of-6.0.2.tgz | Transitive | 3.0.0 | ✅ |
| [WS-2018-0148](https://hackerone.com/reports/321701) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | detected in multiple dependencies | Transitive | N/A | ❌ |
| [CVE-2020-7788](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | ini-1.3.5.tgz | Transitive | 3.0.0 | ✅ |
| [CVE-2020-7774](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | y18n-3.2.1.tgz | Transitive | 3.0.0 | ✅ |
| [CVE-2020-7598](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.6 | detected in multiple dependencies | Transitive | 3.0.0 | ❌ |
| [WS-2021-0154](https://github.com/gulpjs/glob-parent/commit/f9231168b0041fea3f8f954b3cceb56269fc6366) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | glob-parent-3.1.0.tgz | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-10747</summary>
### Vulnerable Libraries - <b>set-value-2.0.0.tgz</b>, <b>set-value-0.4.3.tgz</b></p>
<p>
### <b>set-value-2.0.0.tgz</b></p>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz">https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- cache-base-1.0.1.tgz
- :x: **set-value-2.0.0.tgz** (Vulnerable Library)
### <b>set-value-0.4.3.tgz</b></p>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz">https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/union-value/node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- cache-base-1.0.1.tgz
- union-value-1.0.0.tgz
- :x: **set-value-0.4.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
set-value is vulnerable to Prototype Pollution in versions lower than 3.0.1. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using any of the constructor, prototype and _proto_ payloads.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10747>CVE-2019-10747</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2019-10-29</p>
<p>Fix Resolution (set-value): 2.0.1</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p><p>Fix Resolution (set-value): 2.0.1</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-10746</summary>
### Vulnerable Library - <b>mixin-deep-1.3.1.tgz</b></p>
<p>Deeply mix the properties of objects into the first object. Like merge-deep, but doesn't clone.</p>
<p>Library home page: <a href="https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz">https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mixin-deep/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- :x: **mixin-deep-1.3.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
mixin-deep is vulnerable to Prototype Pollution in versions before 1.3.2 and version 2.0.0. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10746>CVE-2019-10746</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2019-08-23</p>
<p>Fix Resolution (mixin-deep): 1.3.2</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37712</summary>
### Vulnerable Library - <b>tar-4.4.8.tgz</b></p>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- :x: **tar-4.4.8.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.18</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-20149</summary>
### Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/micromatch/node_modules/kind-of/package.json,/node_modules/base/node_modules/kind-of/package.json,/node_modules/nanomatch/node_modules/kind-of/package.json,/node_modules/define-property/node_modules/kind-of/package.json,/node_modules/extglob/node_modules/kind-of/package.json,/node_modules/snapdragon-node/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- braces-2.3.2.tgz
- snapdragon-node-2.1.1.tgz
- define-property-1.0.0.tgz
- is-descriptor-1.0.2.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
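The flawed check can be illustrated in a few lines of plain Node.js. This is a sketch of the pattern that 6.0.3 patched (the `ctorName`/`detectType` helpers below are illustrative, not kind-of's actual source):

```javascript
// Flawed constructor-name lookup: trusts `val.constructor` even when it
// is attacker-supplied data rather than a real constructor function.
function ctorName(val) {
  return val.constructor ? val.constructor.name : null;
}

function detectType(val) {
  if (val === null) return 'null';
  if (typeof val === 'object' && ctorName(val) === 'Symbol') return 'symbol';
  return typeof val;
}

console.log(detectType({})); // 'object' — honest result

// Attacker-controlled JSON shadows `constructor` and spoofs the result:
const payload = JSON.parse('{"constructor": {"name": "Symbol"}}');
console.log(detectType(payload)); // 'symbol' — type detection manipulated
```

The fix in 6.0.3 verifies that `val.constructor` is actually a function before reading its `name`.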
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p>
<p>Release Date: 2020-08-24</p>
<p>Fix Resolution (kind-of): 6.0.3</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2018-0148</summary>
### Vulnerable Libraries - <b>utile-0.2.1.tgz</b>, <b>utile-0.3.0.tgz</b></p>
<p>
### <b>utile-0.2.1.tgz</b></p>
<p>A drop-in replacement for `util` with some additional advantageous functions</p>
<p>Library home page: <a href="https://registry.npmjs.org/utile/-/utile-0.2.1.tgz">https://registry.npmjs.org/utile/-/utile-0.2.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/broadway/node_modules/utile/package.json,/node_modules/prompt/node_modules/utile/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- flatiron-0.4.3.tgz
- prompt-0.2.14.tgz
- :x: **utile-0.2.1.tgz** (Vulnerable Library)
### <b>utile-0.3.0.tgz</b></p>
<p>A drop-in replacement for `util` with some additional advantageous functions</p>
<p>Library home page: <a href="https://registry.npmjs.org/utile/-/utile-0.3.0.tgz">https://registry.npmjs.org/utile/-/utile-0.3.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/utile/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- :x: **utile-0.3.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The `utile` npm module, version 0.3.0, allows extraction of sensitive data from uninitialized memory, or a DoS by passing in a large number, in setups where typed user input can be passed (e.g. from JSON).
<p>Publish Date: 2018-07-16
<p>URL: <a href=https://hackerone.com/reports/321701>WS-2018-0148</a></p>
</p>
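The hazard class behind this report can be sketched with Node's own `Buffer` API: the legacy `new Buffer(n)` constructor (equivalent to today's `Buffer.allocUnsafe`) returns non-zeroed memory, so typed input that arrives as a number instead of a string can expose stale process memory or, if huge, exhaust it. The `handleUserInput` helper below is hypothetical, not utile's code:

```javascript
// Vulnerable shape: a number where a string was expected becomes an
// uninitialized allocation sized by the caller.
function handleUserInput(value) {
  if (typeof value === 'number') {
    return Buffer.allocUnsafe(value); // contents are whatever was in memory
  }
  return Buffer.from(String(value));
}

console.log(handleUserInput(4).length); // 4 bytes, contents undefined

// Safe alternative: zero-filled allocation.
const safe = Buffer.alloc(8);
console.log(safe.every((b) => b === 0)); // true
```

Validating input types before allocation, or using `Buffer.alloc`, removes both the memory-disclosure and the DoS vector.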
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/WS-2018-0148">https://nvd.nist.gov/vuln/detail/WS-2018-0148</a></p>
<p>Release Date: 2018-01-16</p>
<p>Fix Resolution: JetBrains.Rider.Frontend5 - 212.0.20210826.92917,212.0.20211008.220753</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7788</summary>
### Vulnerable Library - <b>ini-1.3.5.tgz</b></p>
<p>An ini encoder/decoder for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/ini/-/ini-1.3.5.tgz">https://registry.npmjs.org/ini/-/ini-1.3.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/ini/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- nconf-0.10.0.tgz
- :x: **ini-1.3.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context.
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788>CVE-2020-7788</a></p>
</p>
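The mechanism can be reproduced without the package itself. The naive parser below assigns a `[__proto__]` section name with plain bracket notation, which is the class of bug fixed in ini 1.3.6 (`naiveParseIni` is a sketch, not ini's actual implementation):

```javascript
// Naive INI parsing: section names from the file are used directly as
// object keys, so "[__proto__]" lands on Object.prototype.
function naiveParseIni(text) {
  const out = {};
  let section = out;
  for (const line of text.split('\n')) {
    const m = /^\[(.+)\]$/.exec(line.trim());
    if (m) {
      section = out[m[1]] = out[m[1]] || {}; // '__proto__' resolves to the prototype
    } else if (line.includes('=')) {
      const [k, v] = line.split('=');
      section[k.trim()] = v.trim();
    }
  }
  return out;
}

naiveParseIni('[__proto__]\npolluted = yes');
console.log(({}).polluted); // 'yes' — every object now sees the injected key
```

Rejecting `__proto__`, `constructor`, and `prototype` as section or key names blocks this pattern.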
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788</a></p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution (ini): 1.3.6</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7774</summary>
### Vulnerable Library - <b>y18n-3.2.1.tgz</b></p>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz">https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- nconf-0.10.0.tgz
- yargs-3.32.0.tgz
- :x: **y18n-3.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package y18n before 3.2.2, 4.0.1 and 5.0.5. PoC by po6ix: `const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true`
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
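Since the PoC needs y18n itself, the underlying pattern can be shown with a standalone sketch: a cache keyed by a user-supplied locale string and written with bracket notation (illustrative only, not y18n's source):

```javascript
// Locale-keyed cache written without filtering dangerous key names.
const cache = {};
let locale = 'en';

function setLocale(l) { locale = l; }

function updateLocale(obj) {
  if (!cache[locale]) cache[locale] = {};
  Object.assign(cache[locale], obj); // cache['__proto__'] IS Object.prototype
}

setLocale('__proto__');          // attacker-chosen "locale"
updateLocale({ polluted: true });
console.log(({}).polluted);      // true — Object.prototype was written
```

The fixed versions guard against `__proto__` being used as the locale key.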
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution (y18n): 3.2.2</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-7598</summary>
### Vulnerable Libraries - <b>minimist-1.2.0.tgz</b>, <b>minimist-0.0.8.tgz</b>, <b>minimist-0.0.10.tgz</b></p>
<p>
### <b>minimist-1.2.0.tgz</b></p>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- rc-1.2.8.tgz
- :x: **minimist-1.2.0.tgz** (Vulnerable Library)
### <b>minimist-0.0.8.tgz</b></p>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- mkdirp-0.5.1.tgz
- :x: **minimist-0.0.8.tgz** (Vulnerable Library)
### <b>minimist-0.0.10.tgz</b></p>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.10.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.10.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- optimist-0.6.1.tgz
- :x: **minimist-0.0.10.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload.
<p>Publish Date: 2020-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p>
</p>
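The minimist flaw follows the same shape: dotted argument paths are written into a result object key by key, without filtering `__proto__`. A standalone sketch (the `naiveParseArgs` helper is hypothetical, not minimist's source):

```javascript
// Naive argv parsing: "--a.b=c" walks the path with bracket notation.
function naiveParseArgs(argv) {
  const out = {};
  for (const arg of argv) {
    const m = /^--([^=]+)=(.*)$/.exec(arg);
    if (!m) continue;
    const keys = m[1].split('.');
    let cur = out;
    while (keys.length > 1) {
      const k = keys.shift();
      cur = cur[k] = cur[k] || {}; // '__proto__' resolves to the prototype
    }
    cur[keys[0]] = m[2];
  }
  return out;
}

naiveParseArgs(['--__proto__.injected=1']);
console.log(({}).injected); // '1' — prototype polluted via CLI arguments
```

minimist 1.2.2+ skips `__proto__` (and similar keys) when building the result object.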
<p></p>
### CVSS 3 Score Details (<b>5.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2020-03-11</p>
<p>Fix Resolution (minimist): 1.2.3</p>
<p>Direct dependency fix Resolution (forever): 3.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> WS-2021-0154</summary>
### Vulnerable Library - <b>glob-parent-3.1.0.tgz</b></p>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- forever-2.0.0.tgz (Root Library)
- forever-monitor-2.0.0.tgz
- chokidar-2.1.8.tgz
- :x: **glob-parent-3.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/NodeGoat-23/commit/cf145ad364ecdbde0b5dd2717e7ffcf9434ce755">cf145ad364ecdbde0b5dd2717e7ffcf9434ce755</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Regular Expression Denial of Service (ReDoS) vulnerability was found in glob-parent before 5.1.2.
<p>Publish Date: 2021-01-27
<p>URL: <a href=https://github.com/gulpjs/glob-parent/commit/f9231168b0041fea3f8f954b3cceb56269fc6366>WS-2021-0154</a></p>
</p>
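The general ReDoS mechanism — not glob-parent's actual expression, which is more involved — is nested quantifiers that force exponential backtracking on almost-matching input:

```javascript
// Classic catastrophic-backtracking shape: (a+)+ can partition a run of
// 'a's in exponentially many ways before the final '$' fails.
const evil = /^(a+)+$/;

console.log(evil.test('aaaa')); // true  — matches instantly
console.log(evil.test('aaa!')); // false — but only after backtracking

// Cost roughly doubles with each extra 'a' before the failing character:
const input = 'a'.repeat(18) + '!';
const t0 = Date.now();
evil.test(input);
console.log(`rejected ${input.length} chars in ${Date.now() - t0} ms`);
```

glob-parent 5.1.2 replaced the vulnerable enclosure-matching regex so that pathological glob strings no longer trigger this blow-up.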
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-01-27</p>
<p>Fix Resolution: glob-parent - 5.1.2</p>
</p>
<p></p>
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
# CVE-2022-46175 (High) detected in json5-0.5.1.tgz, json5-0.4.0.tgz
<p>Repository: <a href="https://api.github.com/repos/n-devs/testTungTonScript">n-devs/testTungTonScript</a> | Action: opened | Date: 2022-12-26 | Labels: security vulnerability</p>

## CVE-2022-46175 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>json5-0.5.1.tgz</b>, <b>json5-0.4.0.tgz</b></p></summary>
<p>
<details><summary><b>json5-0.5.1.tgz</b></p></summary>
<p>JSON for the ES5 era.</p>
<p>Library home page: <a href="https://registry.npmjs.org/json5/-/json5-0.5.1.tgz">https://registry.npmjs.org/json5/-/json5-0.5.1.tgz</a></p>
<p>Path to dependency file: /testTungTonScript/package.json</p>
<p>Path to vulnerable library: /node_modules/json5/package.json</p>
<p>
Dependency Hierarchy:
- jest-expo-26.0.0.tgz (Root Library)
- :x: **json5-0.5.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>json5-0.4.0.tgz</b></p></summary>
<p>JSON for the ES5 era.</p>
<p>Library home page: <a href="https://registry.npmjs.org/json5/-/json5-0.4.0.tgz">https://registry.npmjs.org/json5/-/json5-0.4.0.tgz</a></p>
<p>Path to dependency file: /testTungTonScript/package.json</p>
<p>Path to vulnerable library: /node_modules/metro/node_modules/json5/package.json</p>
<p>
Dependency Hierarchy:
- react-native-0.54.0.tgz (Root Library)
- metro-0.28.0.tgz
- :x: **json5-0.4.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/n-psk/testTungTonScript/commits/c211255b7caad4da1c0539472bf4ea67c6d2d7f3">c211255b7caad4da1c0539472bf4ea67c6d2d7f3</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
JSON5 is an extension to the popular JSON file format that aims to be easier to write and maintain by hand (e.g. for config files). The `parse` method of the JSON5 library before and including version `2.2.1` does not restrict parsing of keys named `__proto__`, allowing specially crafted strings to pollute the prototype of the resulting object. This vulnerability pollutes the prototype of the object returned by `JSON5.parse` and not the global Object prototype, which is the commonly understood definition of Prototype Pollution. However, polluting the prototype of a single object can have significant security impact for an application if the object is later used in trusted operations. This vulnerability could allow an attacker to set arbitrary and unexpected keys on the object returned from `JSON5.parse`. The actual impact will depend on how applications utilize the returned object and how they filter unwanted keys, but could include denial of service, cross-site scripting, elevation of privilege, and in extreme cases, remote code execution. `JSON5.parse` should restrict parsing of `__proto__` keys when parsing JSON strings to objects. As a point of reference, the `JSON.parse` method included in JavaScript ignores `__proto__` keys. Simply changing `JSON5.parse` to `JSON.parse` in the examples above mitigates this vulnerability. This vulnerability is patched in json5 version 2.2.2 and later.
<p>Publish Date: 2022-12-24
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-46175>CVE-2022-46175</a></p>
</p>
</details>
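The advisory's mitigation point can be shown directly in plain Node.js: `JSON.parse` creates `"__proto__"` as an ordinary own property instead of writing through the prototype chain, which is exactly why swapping `JSON5.parse` for `JSON.parse` neutralizes the payload:

```javascript
// JSON.parse defines "__proto__" as an own data property; it never
// touches the prototype of the returned object.
const obj = JSON.parse('{"__proto__": {"isAdmin": true}}');

console.log(obj.isAdmin);                                     // undefined
console.log(Object.getOwnPropertyNames(obj));                 // [ '__proto__' ]
console.log(Object.getPrototypeOf(obj) === Object.prototype); // true
```

json5 2.2.2 brings `JSON5.parse` in line with this behavior for `__proto__` keys.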
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-46175">https://www.cve.org/CVERecord?id=CVE-2022-46175</a></p>
<p>Release Date: 2022-12-24</p>
<p>Fix Resolution: json5 - 2.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-46175">https://www.cve.org/CVERecord?id=CVE-2022-46175</a></p>
<p>Release Date: 2022-12-24</p>
<p>Fix Resolution: json5 - 2.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | cve high detected in tgz tgz cve high severity vulnerability vulnerable libraries tgz tgz tgz json for the era library home page a href path to dependency file testtungtonscript package json path to vulnerable library node modules package json dependency hierarchy jest expo tgz root library x tgz vulnerable library tgz json for the era library home page a href path to dependency file testtungtonscript package json path to vulnerable library node modules metro node modules package json dependency hierarchy react native tgz root library metro tgz x tgz vulnerable library found in head commit a href vulnerability details is an extension to the popular json file format that aims to be easier to write and maintain by hand e g for config files the parse method of the library before and including version does not restrict parsing of keys named proto allowing specially crafted strings to pollute the prototype of the resulting object this vulnerability pollutes the prototype of the object returned by parse and not the global object prototype which is the commonly understood definition of prototype pollution however polluting the prototype of a single object can have significant security impact for an application if the object is later used in trusted operations this vulnerability could allow an attacker to set arbitrary and unexpected keys on the object returned from parse the actual impact will depend on how applications utilize the returned object and how they filter unwanted keys but could include denial of service cross site scripting elevation of privilege and in extreme cases remote code execution parse should restrict parsing of proto keys when parsing json strings to objects as a point of reference the json parse method included in javascript ignores proto keys simply changing parse to json parse in the examples above mitigates this 
vulnerability this vulnerability is patched in version and later publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact low availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
39,888 | 5,255,354,396 | IssuesEvent | 2017-02-02 15:23:43 | LeastAuthority/leastauthority.com | https://api.github.com/repos/LeastAuthority/leastauthority.com | closed | reenable PGP after fixing it to be less variable, error prone, unrecorded | automated test needed enhancement operations reliability signup | I'm automating the process I go through after being notified of a PGP signup.
| 1.0 | reenable PGP after fixing it to be less variable, error prone, unrecorded - I'm automating the process I go through after being notified of a PGP signup.
| non_main | reenable pgp after fixing it to be less variable error prone unrecorded i m automating the process i go through after being notified of a pgp signup | 0 |
5,688 | 29,927,342,530 | IssuesEvent | 2023-06-22 06:56:21 | onebeyond/maintainers | https://api.github.com/repos/onebeyond/maintainers | closed | OpenSSF Scorecard implementation | maintainers-agenda | ### Intro
I reviewed the scores for some key projects ([rascal](https://deps.dev/npm/rascal/16.2.0), [Systemic](https://deps.dev/npm/systemic), [handy-postgres](https://deps.dev/npm/handy-postgres), etc.) and I have identified some clear initiatives or strategies that we can follow to improve the results. Our average score is around 5-5.5 out of 10.
### Opportunities in repo settings
**Code Review**
> Determines if the project requires code review before pull requests
**Branch protection**
> Determines if the default and release branches are protected with GitHub's branch protection settings.
> Info: 'force pushes' disabled on branch 'master'
> Info: 'allow deletion' disabled on branch 'master'
> Warn: no status checks found to merge onto branch 'master'
> Warn: number of required reviewers is only 1 on branch 'master'
> Warn: codeowner review is not required on branch 'master'
### Oportunities in pipelines
**Tokens-permissions**
> Determines if the project's workflows follow the principle of least privilege.
> non read-only tokens detected in GitHub workflows
**Pinned-Dependencies**
> Determines if the project has declared and pinned the dependencies of its build process.
**fuzzing**
> Determines if the project uses fuzzing.
### Other
**License**
Most projects now have a valid license that will patch the scorecard in the next deployments, but I noticed that we have some dependencies with unknown licenses.
**Security-policy**
> Determines if the project has published a security policy.
**SAST**
> Determines if the project uses static code analysis.
We have the possibility to use [CodeQL](https://codeql.github.com/) for free
**Dependency-Update-Tool**
> Determines if the project uses a dependency update tool.
We can set up Dependabot properly to avoid annoying auto-pull requests, but prompt us about relevant security releases only.
**CII-Best-Practices**
> Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.
### Relevant Documentation:
- [Official Documentation](https://securityscorecards.dev/)
- [You should use the OpenSSF Scorecard](https://dev.to/ulisesgascon/you-should-use-the-openssf-scorecard-4eh4)
## Actionable items
- Add a global security Policy in the organization metadata repository
- Add code review mandatory in PRs at global org and/or in each repo. []()
- Add [branch protection rules](https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection) at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/getting-started/securing-your-organization)
- Add secret scanning at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/secret-scanning/about-secret-scanning)
- Add code scanning with CodeQL at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/about-code-scanning)
- Add dependabot with a good non-intrusive settings at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/dependabot/dependabot-alerts/about-dependabot-alerts)
- Create a pipeline to ensure that the projects are following best practices in terms of dependencies (avoid unkwnon, etc..) by using [license-checker](https://github.com/onebeyond/license-checker) in each repository
- Update each repository pipelines to use [pinned versions](https://github.com/ossf/scorecard/blob/main/docs/checks.md#pinned-dependencies) and [read-only tokens](https://github.com/ossf/scorecard/blob/main/docs/checks.md#token-permissions).
- Pin version in pkg for each project to ensure inmutability.
- Add OpenSSF (formerly CII) Best Practices Badge to each repo
| True | OpenSSF Scorecard implementation - ### Intro
I reviewed the scores for some key projects ([rascal](https://deps.dev/npm/rascal/16.2.0), [Systemic](https://deps.dev/npm/systemic), [handy-postgres](https://deps.dev/npm/handy-postgres), etc.) and I have identified some clear initiatives or strategies that we can follow to improve the results. Our average score is around 5-5.5 out of 10.
### Opportunities in repo settings
**Code Review**
> Determines if the project requires code review before pull requests
**Branch protection**
> Determines if the default and release branches are protected with GitHub's branch protection settings.
> Info: 'force pushes' disabled on branch 'master'
> Info: 'allow deletion' disabled on branch 'master'
> Warn: no status checks found to merge onto branch 'master'
> Warn: number of required reviewers is only 1 on branch 'master'
> Warn: codeowner review is not required on branch 'master'
### Oportunities in pipelines
**Tokens-permissions**
> Determines if the project's workflows follow the principle of least privilege.
> non read-only tokens detected in GitHub workflows
**Pinned-Dependencies**
> Determines if the project has declared and pinned the dependencies of its build process.
**fuzzing**
> Determines if the project uses fuzzing.
### Other
**License**
Most projects now have a valid license that will patch the scorecard in the next deployments, but I noticed that we have some dependencies with unknown licenses.
**Security-policy**
> Determines if the project has published a security policy.
**SAST**
> Determines if the project uses static code analysis.
We have the possibility to use [CodeQL](https://codeql.github.com/) for free
**Dependency-Update-Tool**
> Determines if the project uses a dependency update tool.
We can set up Dependabot properly to avoid annoying auto-pull requests, but prompt us about relevant security releases only.
**CII-Best-Practices**
> Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.
### Relevant Documentation:
- [Official Documentation](https://securityscorecards.dev/)
- [You should use the OpenSSF Scorecard](https://dev.to/ulisesgascon/you-should-use-the-openssf-scorecard-4eh4)
## Actionable items
- Add a global security Policy in the organization metadata repository
- Add code review mandatory in PRs at global org and/or in each repo. []()
- Add [branch protection rules](https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection) at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/getting-started/securing-your-organization)
- Add secret scanning at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/secret-scanning/about-secret-scanning)
- Add code scanning with CodeQL at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/about-code-scanning)
- Add dependabot with a good non-intrusive settings at global org and/or in each repo. [Related documentation](https://docs.github.com/en/code-security/dependabot/dependabot-alerts/about-dependabot-alerts)
- Create a pipeline to ensure that the projects are following best practices in terms of dependencies (avoid unkwnon, etc..) by using [license-checker](https://github.com/onebeyond/license-checker) in each repository
- Update each repository pipelines to use [pinned versions](https://github.com/ossf/scorecard/blob/main/docs/checks.md#pinned-dependencies) and [read-only tokens](https://github.com/ossf/scorecard/blob/main/docs/checks.md#token-permissions).
- Pin version in pkg for each project to ensure inmutability.
- Add OpenSSF (formerly CII) Best Practices Badge to each repo
| main | openssf scorecard implementation intro i reviewed the scores for some key projects etc and i have identified some clear initiatives or strategies that we can follow to improve the results our average score is around out of opportunities in repo settings code review determines if the project requires code review before pull requests branch protection determines if the default and release branches are protected with github s branch protection settings info force pushes disabled on branch master info allow deletion disabled on branch master warn no status checks found to merge onto branch master warn number of required reviewers is only on branch master warn codeowner review is not required on branch master oportunities in pipelines tokens permissions determines if the project s workflows follow the principle of least privilege non read only tokens detected in github workflows pinned dependencies determines if the project has declared and pinned the dependencies of its build process fuzzing determines if the project uses fuzzing other license most projects now have a valid license that will patch the scorecard in the next deployments but i noticed that we have some dependencies with unknown licenses security policy determines if the project has published a security policy sast determines if the project uses static code analysis we have the possibility to use for free dependency update tool determines if the project uses a dependency update tool we can set up dependabot properly to avoid annoying auto pull requests but prompt us about relevant security releases only cii best practices determines if the project has an openssf formerly cii best practices badge relevant documentation actionable items add a global security policy in the organization metadata repository add code review mandatory in prs at global org and or in each repo add at global org and or in each repo add secret scanning at global org and or in each repo add code scanning with codeql at global 
org and or in each repo add dependabot with a good non intrusive settings at global org and or in each repo create a pipeline to ensure that the projects are following best practices in terms of dependencies avoid unkwnon etc by using in each repository update each repository pipelines to use and pin version in pkg for each project to ensure inmutability add openssf formerly cii best practices badge to each repo | 1 |
71,460 | 18,751,867,850 | IssuesEvent | 2021-11-05 03:48:47 | mavlink/mavros | https://api.github.com/repos/mavlink/mavros | closed | Issue with ros1_bridge with mavros | bug build error messages | ### Issue details
I was originally trying to use mavros on Ubuntu 18.04 using ros1 melodic and ros2 eloquent. This had failed result in an error regarding static assertion failed regarding ros1 and ros2 message sizes when trying to build ros1_bridge from source. Originally i had thought was an issue with message compatibility from ros-melodic-mavros-msgs and mavros on master branch.
I went ahead to test out debian packages `ros-foxy-mavros-msgs` and `ros-noetic-mavros-extras` and trying to build them, still have the same error(details below)
Background:
I have other ros2 nodes, trying to get them to communicate to mavros on ros1 via ros1_bridge.
### MAVROS version and platform
Previous attempted configuration that had worked
ros-melodic-mavros-msgs: 1.8.0
ros-eloquent-mavros-msgs: no such package was avaliable, used 2.0.3 from master.
ros-noetic-mavros-msgs: 1.9.0
ros-foxy-mavros-msgs: 2.0.3
ROS: foxy and noetic
Ubuntu: 20.04
### Node logs
Logs from building error
```
--- stderr: ros1_bridge
/srv/bridge/build/ros1_bridge/generated/mavros_msgs__msg__OverrideRCIn__factories.cpp: In static member function ‘static void ros1_bridge::Factory<ROS1_T, ROS2_T>::convert_1_to_2(const ROS1_T&, ROS2_T&) [with ROS1_T = mavros_msgs::OverrideRCIn_<std::allocator<void> >; ROS2_T = mavros_msgs::msg::OverrideRCIn_<std::allocator<void> >]’:
/srv/bridge/build/ros1_bridge/generated/mavros_msgs__msg__OverrideRCIn__factories.cpp:61:32: error: static assertion failed: destination array not large enough for source array
61 | (ros2_msg.channels.size()) >= (ros1_msg.channels.size()),
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
make[2]: *** [CMakeFiles/ros1_bridge.dir/build.make:3803: CMakeFiles/ros1_bridge.dir/generated/mavros_msgs__msg__OverrideRCIn__factories.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [CMakeFiles/Makefile2:260: CMakeFiles/ros1_bridge.dir/all] Error 2
make: *** [Makefile:141: all] Error 2
---
Failed <<< ros1_bridge [2min 40s, exited with code 2]
``` | 1.0 | Issue with ros1_bridge with mavros - ### Issue details
I was originally trying to use mavros on Ubuntu 18.04 using ros1 melodic and ros2 eloquent. This had failed result in an error regarding static assertion failed regarding ros1 and ros2 message sizes when trying to build ros1_bridge from source. Originally i had thought was an issue with message compatibility from ros-melodic-mavros-msgs and mavros on master branch.
I went ahead to test out debian packages `ros-foxy-mavros-msgs` and `ros-noetic-mavros-extras` and trying to build them, still have the same error(details below)
Background:
I have other ros2 nodes, trying to get them to communicate to mavros on ros1 via ros1_bridge.
### MAVROS version and platform
Previous attempted configuration that had worked
ros-melodic-mavros-msgs: 1.8.0
ros-eloquent-mavros-msgs: no such package was avaliable, used 2.0.3 from master.
ros-noetic-mavros-msgs: 1.9.0
ros-foxy-mavros-msgs: 2.0.3
ROS: foxy and noetic
Ubuntu: 20.04
### Node logs
Logs from building error
```
--- stderr: ros1_bridge
/srv/bridge/build/ros1_bridge/generated/mavros_msgs__msg__OverrideRCIn__factories.cpp: In static member function ‘static void ros1_bridge::Factory<ROS1_T, ROS2_T>::convert_1_to_2(const ROS1_T&, ROS2_T&) [with ROS1_T = mavros_msgs::OverrideRCIn_<std::allocator<void> >; ROS2_T = mavros_msgs::msg::OverrideRCIn_<std::allocator<void> >]’:
/srv/bridge/build/ros1_bridge/generated/mavros_msgs__msg__OverrideRCIn__factories.cpp:61:32: error: static assertion failed: destination array not large enough for source array
61 | (ros2_msg.channels.size()) >= (ros1_msg.channels.size()),
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
make[2]: *** [CMakeFiles/ros1_bridge.dir/build.make:3803: CMakeFiles/ros1_bridge.dir/generated/mavros_msgs__msg__OverrideRCIn__factories.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [CMakeFiles/Makefile2:260: CMakeFiles/ros1_bridge.dir/all] Error 2
make: *** [Makefile:141: all] Error 2
---
Failed <<< ros1_bridge [2min 40s, exited with code 2]
``` | non_main | issue with bridge with mavros issue details i was originally trying to use mavros on ubuntu using melodic and eloquent this had failed result in an error regarding static assertion failed regarding and message sizes when trying to build bridge from source originally i had thought was an issue with message compatibility from ros melodic mavros msgs and mavros on master branch i went ahead to test out debian packages ros foxy mavros msgs and ros noetic mavros extras and trying to build them still have the same error details below background i have other nodes trying to get them to communicate to mavros on via bridge mavros version and platform previous attempted configuration that had worked ros melodic mavros msgs ros eloquent mavros msgs no such package was avaliable used from master ros noetic mavros msgs ros foxy mavros msgs ros foxy and noetic ubuntu node logs logs from building error stderr bridge srv bridge build bridge generated mavros msgs msg overridercin factories cpp in static member function ‘static void bridge factory convert to const t t ’ srv bridge build bridge generated mavros msgs msg overridercin factories cpp error static assertion failed destination array not large enough for source array msg channels size msg channels size make error make waiting for unfinished jobs make error make error failed bridge | 0 |
683,266 | 23,374,945,609 | IssuesEvent | 2022-08-11 01:10:41 | codeforboston/advocacy-maps | https://api.github.com/repos/codeforboston/advocacy-maps | closed | Scraper's not scrapin': certificate error in calls to MA API | bug top priority | https://console.cloud.google.com/errors/detail/CO2EjIeot5Go3gE?project=digital-testimony-dev
Error: unable to verify the first certificate
at .TLSSocket.onConnectSecure ( node:_tls_wrap:1532 )
at .TLSSocket.emit ( node:events:527 )
at .TLSSocket.emit ( node:domain:537 )
at .TLSSocket._finishInit ( node:_tls_wrap:946 )
at TLSWrap.ssl.onhandshakedone ( node:_tls_wrap:727 )
at .TLSWrap.callbackTrampoline ( node:internal/async_hooks:130 )
| 1.0 | Scraper's not scrapin': certificate error in calls to MA API - https://console.cloud.google.com/errors/detail/CO2EjIeot5Go3gE?project=digital-testimony-dev
Error: unable to verify the first certificate
at .TLSSocket.onConnectSecure ( node:_tls_wrap:1532 )
at .TLSSocket.emit ( node:events:527 )
at .TLSSocket.emit ( node:domain:537 )
at .TLSSocket._finishInit ( node:_tls_wrap:946 )
at TLSWrap.ssl.onhandshakedone ( node:_tls_wrap:727 )
at .TLSWrap.callbackTrampoline ( node:internal/async_hooks:130 )
| non_main | scraper s not scrapin certificate error in calls to ma api error unable to verify the first certificate at tlssocket onconnectsecure node tls wrap at tlssocket emit node events at tlssocket emit node domain at tlssocket finishinit node tls wrap at tlswrap ssl onhandshakedone node tls wrap at tlswrap callbacktrampoline node internal async hooks | 0 |
1,722 | 6,574,505,065 | IssuesEvent | 2017-09-11 13:08:23 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | docker_service not working as expected on 2.2.0 | affects_2.2 bug_report cloud docker waiting_on_maintainer | Hi,
Sorry if I'm making some sort of mistake, I can't find any reference to this issue and don't konw how to fix it. I'm using the same configuration that works with ansible 2.1.0, but it fails with ansible 2.2.0
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
docker_service
##### ANSIBLE VERSION
```
ansible --version
ansible 2.2.0.0
config file =
configured module search path = Default w/o overrides
```
##### CONFIGURATION
no specific config
##### OS / ENVIRONMENT
running from multiple OS (linux and mac) but managing an ubuntu machine (16.04 LTS xenial)
##### SUMMARY
On executing docker_service, I get an error running the docker services that seems to be related to Dockerfile, although I have specified not to build the containers. Not sure if I need to explicitly add another parameter or not.
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
<!--- Paste example playbooks or commands between quotes below -->
my playbook:
```
- debug: msg="restarting services"
- docker_service:
project_src: /apps/liveheats
state: present
build: no
files:
- docker-compose.yml
- docker-compose.production.yml
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
With 2.1.0.0 it would start the docker services correctly
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes below -->
```
fatal: [X.X.X.X]: FAILED! => {
"changed": false,
"failed": true,
"invocation": {
"module_args": {
"api_version": null,
"build": false,
"cacert_path": null,
"cert_path": null,
"debug": false,
"definition": null,
"dependencies": true,
"docker_host": null,
"files": [
"docker-compose.yml",
"docker-compose.production.yml"
],
"filter_logger": false,
"hostname_check": false,
"key_path": null,
"nocache": false,
"project_name": null,
"project_src": "/apps/liveheats",
"pull": false,
"recreate": "smart",
"remove_images": null,
"remove_orphans": false,
"remove_volumes": false,
"restarted": false,
"scale": null,
"services": null,
"ssl_version": null,
"state": "present",
"stopped": false,
"timeout": 10,
"tls": null,
"tls_hostname": null,
"tls_verify": null
},
"module_name": "docker_service"
},
"msg": "Error starting project - 500 Server Error: Internal Server Error (\"Cannot locate specified Dockerfile: Dockerfile\")"
}
```
| True | docker_service not working as expected on 2.2.0 - Hi,
Sorry if I'm making some sort of mistake, I can't find any reference to this issue and don't konw how to fix it. I'm using the same configuration that works with ansible 2.1.0, but it fails with ansible 2.2.0
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
docker_service
##### ANSIBLE VERSION
```
ansible --version
ansible 2.2.0.0
config file =
configured module search path = Default w/o overrides
```
##### CONFIGURATION
no specific config
##### OS / ENVIRONMENT
running from multiple OS (linux and mac) but managing an ubuntu machine (16.04 LTS xenial)
##### SUMMARY
On executing docker_service, I get an error running the docker services that seems to be related to Dockerfile, although I have specified not to build the containers. Not sure if I need to explicitly add another parameter or not.
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
<!--- Paste example playbooks or commands between quotes below -->
my playbook:
```
- debug: msg="restarting services"
- docker_service:
project_src: /apps/liveheats
state: present
build: no
files:
- docker-compose.yml
- docker-compose.production.yml
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
With 2.1.0.0 it would start the docker services correctly
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes below -->
```
fatal: [X.X.X.X]: FAILED! => {
"changed": false,
"failed": true,
"invocation": {
"module_args": {
"api_version": null,
"build": false,
"cacert_path": null,
"cert_path": null,
"debug": false,
"definition": null,
"dependencies": true,
"docker_host": null,
"files": [
"docker-compose.yml",
"docker-compose.production.yml"
],
"filter_logger": false,
"hostname_check": false,
"key_path": null,
"nocache": false,
"project_name": null,
"project_src": "/apps/liveheats",
"pull": false,
"recreate": "smart",
"remove_images": null,
"remove_orphans": false,
"remove_volumes": false,
"restarted": false,
"scale": null,
"services": null,
"ssl_version": null,
"state": "present",
"stopped": false,
"timeout": 10,
"tls": null,
"tls_hostname": null,
"tls_verify": null
},
"module_name": "docker_service"
},
"msg": "Error starting project - 500 Server Error: Internal Server Error (\"Cannot locate specified Dockerfile: Dockerfile\")"
}
```
| main | docker service not working as expected on hi sorry if i m making some sort of mistake i can t find any reference to this issue and don t konw how to fix it i m using the same configuration that works with ansible but it fails with ansible issue type bug report component name docker service ansible version ansible version ansible config file configured module search path default w o overrides configuration no specific config os environment running from multiple os linux and mac but managing an ubuntu machine lts xenial summary on executing docker service i get an error running the docker services that seems to be related to dockerfile although i have specified not to build the containers not sure if i need to explicitly add another parameter or not steps to reproduce for bugs show exactly how to reproduce the problem for new features show how the feature would be used my playbook debug msg restarting services docker service project src apps liveheats state present build no files docker compose yml docker compose production yml expected results with it would start the docker services correctly actual results fatal failed changed false failed true invocation module args api version null build false cacert path null cert path null debug false definition null dependencies true docker host null files docker compose yml docker compose production yml filter logger false hostname check false key path null nocache false project name null project src apps liveheats pull false recreate smart remove images null remove orphans false remove volumes false restarted false scale null services null ssl version null state present stopped false timeout tls null tls hostname null tls verify null module name docker service msg error starting project server error internal server error cannot locate specified dockerfile dockerfile | 1 |
2,855 | 10,257,018,715 | IssuesEvent | 2019-08-21 19:03:48 | arcticicestudio/styleguide-javascript | https://api.github.com/repos/arcticicestudio/styleguide-javascript | closed | From CircleCI to GitHub Actions | context-workflow scope-dx scope-maintainability scope-quality scope-security scope-stability type-feature | <p align="center"><img src="https://user-images.githubusercontent.com/7836623/63409801-31695780-c3f2-11e9-9742-8212f4919557.png" /></p>
### Project State
The current project setup uses [CircleCI][cci] with [API version 2.x][cci-d] as CI/CD service. This works great, but also comes with the disadvantage of being decoupled from the repository.
<p align="center">
<figure>
<div align="center"><img src="https://user-images.githubusercontent.com/7836623/63436449-00f0e000-c429-11e9-95e4-5fffe64e859b.jpg" width="80%" /></div>
<figcaption><div align="center">The <em>GitHub Actions</em> CI/CD UI</div></figcaption>
</figure>
</p>
During _GitHub Universe 2018_, the awesome new [GitHub Actions][gh-f-actions] feature was [introduced and launched as closed beta][gh-b-actions]. Luckily _Arctic Ice Studio_ was given access in order to test all the great possibilities. During the [GitHub Actions stream „Now with built-in CI/CD!“ (live from GitHub HQ)][yt-gh-actions-cicd] the _Actions_ update was announced and previewed showing the expansion to use _GitHub Actions_ as [CI/CD service described as „fast CI/CD for any OS, any language, and any cloud“][gh-b-actions-cicd].
<p align="center">
<figure>
<div align="center"><img src="https://user-images.githubusercontent.com/7836623/63409864-4c3bcc00-c3f2-11e9-99f0-24964e4a8cc2.gif" width="80%" /></div>
<figcaption><div align="center">Live logs showing real-time feedback</div></figcaption>
</figure>
</p>
**See the [official GitHub Actions documentation][gh-h-actions] for details about setups, features, the configuration API and many more!**
### Project Integration
The switch from _CircleCI_ to _GitHub Actions_ brings many advantages like a „close-to-the-source“ development pipeline. Having the code and automated pipelines/workflows in one place is worth a lot. This also comes along with the perfect integrations into GitHub itself like status reports on PRs and many more possibilities like the [customization and triggering of workflows through webhooks][gh-h-actions-events] for almost every event that can occur in a repository/issue/PR etc.
To integrate _GitHub Actions_ the current [_CircleCI_ build configuration][gh-b-circleci] will be adapted and adjusted. The official [starter-workflows][gh-starter-workflow] can be used as inspiration as well as showcase projects like the [Yarn _Berry_ (Yarn v2)][gh-yarn-berry-gh_act] also presented during the announcement livestream.
Next to the `starter-workflows` repository the [official _GitHub Actions_ documentation][gh-h-actions] will be the main source of information to set up the project workflows.
<p align="center">
<figure>
<div align="center"><img src="https://user-images.githubusercontent.com/7836623/63436450-01897680-c429-11e9-8031-4a744a816fe0.png" width="80%" /></div>
<figcaption><div align="center"><em>GitHub Actions</em> starter workflows based on the repository languages</div></figcaption>
</figure>
</p>
Since _GitHub Actions_ are still in closed/limited public beta, there is no support for SVG badges through shields.io. Anyway, there are (currently undocumented) official badges provided by the GitHub API that'll be used until _Actions_ go GA and shields.io implements support for it: `https://github.com/{owner}/{repo}/workflows/{workflow_name}/badge.svg`
[cci-d]: https://circleci.com/docs
[cci]: https://circleci.com
[gh-b-actions-cicd]: https://github.blog/2019-08-08-github-actions-now-supports-ci-cd
[gh-b-actions]: https://github.blog/2018-10-17-action-demos
[gh-b-circleci]: https://github.com/arcticicestudio/styleguide-javascript/blob/ac611f7e342e8475767b03d95fa174aea65b39e5/.circleci/config.yml
[gh-f-actions]: https://github.com/features/actions
[gh-h-actions-events]: https://help.github.com/en/articles/events-that-trigger-workflows#webhook-events
[gh-h-actions]: https://help.github.com/en/categories/automating-your-workflow-with-github-actions
[gh-starter-workflow]: https://github.com/actions/starter-workflows
[gh-yarn-berry-gh_act]: https://github.com/yarnpkg/berry/tree/master/.github/workflows
[yt-gh-actions-cicd]: https://www.youtube.com/watch?v=E1OunoCyuhY
| True | From CircleCI to GitHub Actions - <p align="center"><img src="https://user-images.githubusercontent.com/7836623/63409801-31695780-c3f2-11e9-9742-8212f4919557.png" /></p>
### Project State
The current project setup uses [CircleCI][cci] with [API version 2.x][cci-d] as CI/CD service. This works great, but also comes with the disadvantage of being decoupled from the repository.
<p align="center">
<figure>
<div align="center"><img src="https://user-images.githubusercontent.com/7836623/63436449-00f0e000-c429-11e9-95e4-5fffe64e859b.jpg" width="80%" /></div>
<figcaption><div align="center">The <em>GitHub Actions</em> CI/CD UI</div></figcaption>
</figure>
</p>
During _GitHub Universe 2018_, the awesome new [GitHub Actions][gh-f-actions] feature was [introduced and launched as closed beta][gh-b-actions]. Luckily _Arctic Ice Studio_ was given access in order to test all the great possibilities. During the [GitHub Actions stream „Now with built-in CI/CD!“ (live from GitHub HQ)][yt-gh-actions-cicd] the _Actions_ update was announced and previewed showing the expansion to use _GitHub Actions_ as [CI/CD service described as „fast CI/CD for any OS, any language, and any cloud“][gh-b-actions-cicd].
<p align="center">
<figure>
<div align="center"><img src="https://user-images.githubusercontent.com/7836623/63409864-4c3bcc00-c3f2-11e9-99f0-24964e4a8cc2.gif" width="80%" /></div>
<figcaption><div align="center">Live logs showing real-time feedback</div></figcaption>
</figure>
</p>
**See the [official GitHub Actions documentation][gh-h-actions] for details about setups, features, the configuration API and many more!**
### Project Integration
The switch from _CircleCI_ to _GitHub Actions_ brings many advantages like a „close-to-the-source“ development pipeline. Having the code and automated pipelines/workflows in one place is worth a lot. This also comes along with the perfect integrations into GitHub itself like status reports on PRs and many more possibilities like the [customization and triggering of workflows through webhooks][gh-h-actions-events] for almost every event that can occur in a repository/issue/PR etc.
To integrate _GitHub Actions_ the current [_CircleCI_ build configuration][gh-b-circleci] will be adapted and adjusted. The official [starter-workflows][gh-starter-workflow] can be used as inspiration as well as showcase projects like the [Yarn _Berry_ (Yarn v2)][gh-yarn-berry-gh_act] also presented during the announcement livestream.
Next to the `starter-workflows` repository the [official _GitHub Actions_ documentation][gh-h-actions] will be the main source of information to set up the project workflows.
<p align="center">
<figure>
<div align="center"><img src="https://user-images.githubusercontent.com/7836623/63436450-01897680-c429-11e9-8031-4a744a816fe0.png" width="80%" /></div>
<figcaption><div align="center"><em>GitHub Actions</em> starter workflows based on the repository languages</div></figcaption>
</figure>
</p>
Since _GitHub Actions_ are still in closed/limited public beta, there is no support for SVG badges through shields.io. Anyway, there are (currently undocumented) official badges provided by the GitHub API that'll be used until _Actions_ go GA and shields.io implements support for it: `https://github.com/{owner}/{repo}/workflows/{workflow_name}/badge.svg`
[cci-d]: https://circleci.com/docs
[cci]: https://circleci.com
[gh-b-actions-cicd]: https://github.blog/2019-08-08-github-actions-now-supports-ci-cd
[gh-b-actions]: https://github.blog/2018-10-17-action-demos
[gh-b-circleci]: https://github.com/arcticicestudio/styleguide-javascript/blob/ac611f7e342e8475767b03d95fa174aea65b39e5/.circleci/config.yml
[gh-f-actions]: https://github.com/features/actions
[gh-h-actions-events]: https://help.github.com/en/articles/events-that-trigger-workflows#webhook-events
[gh-h-actions]: https://help.github.com/en/categories/automating-your-workflow-with-github-actions
[gh-starter-workflow]: https://github.com/actions/starter-workflows
[gh-yarn-berry-gh_act]: https://github.com/yarnpkg/berry/tree/master/.github/workflows
[yt-gh-actions-cicd]: https://www.youtube.com/watch?v=E1OunoCyuhY
| main | from circleci to github actions project state the current project setup uses with as ci cd service this works great but also comes with the disadvantage of being decoupled from the repository the github actions ci cd ui during github universe the awesome new feature was luckily arctic ice studio was given access in order to test all the great possibilities during the the actions update was announced and previewed showing the expansion to use github actions as live logs showing real time feedback see the for details about setups features the configuration api and many more project integration the switch from circleci to github actions brings many advantages like a „close to the source“ development pipeline having the code and automated pipelines workflows in one place is worth a lot this also comes along with the perfect integrations into github itself like status reports on prs and many more possibilities like the for almost every event that can occur in a repository issue pr etc to integrate github actions the current will be adapted and adjusted the official can be used as inspiration as well as showcase projects like the also presented during the announcement livestream next to the starter workflows repository the will be the main source of information to set up the project workflows github actions starter workflows based on the epository languages since github actions are still in closed limited public beta there is no support for svg badges through shields io anyway there are currently undocumented official badges provided by the github api that ll be used until actions go ga and shields io implemented support for it | 1 |
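The migration described in the record above amounts to committing a workflow file under `.github/workflows/`. A minimal sketch of such a starter workflow for a Node.js project follows; the workflow name, versions, and steps are illustrative assumptions, not this project's actual CI config:

```yaml
# .github/workflows/ci.yml -- illustrative starter workflow, not the real config
name: ci
on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1
        with:
          node-version: '12'
      - run: npm ci
      - run: npm test
```

With this workflow name, the badge URL pattern quoted in the record would resolve to `https://github.com/{owner}/{repo}/workflows/ci/badge.svg`.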
22,457 | 10,758,056,680 | IssuesEvent | 2019-10-31 14:22:21 | elastic/kibana | https://api.github.com/repos/elastic/kibana | opened | User authenticated with Token authentication provider should not be able to change password | Feature:Security/Authentication Team:Security blocked | Currently the Change Password API in Elasticsearch allows a user to change their password even if they are authenticated with the Token authentication provider, but this will change with https://github.com/elastic/elasticsearch/issues/48752 and we should make the necessary changes in Kibana as well.
**Blocked by: https://github.com/elastic/elasticsearch/issues/48752** | True | User authenticated with Token authentication provider should not be able to change password - Currently the Change Password API in Elasticsearch allows a user to change their password even if they are authenticated with the Token authentication provider, but this will change with https://github.com/elastic/elasticsearch/issues/48752 and we should make the necessary changes in Kibana as well.
**Blocked by: https://github.com/elastic/elasticsearch/issues/48752** | non_main | user authenticated with token authentication provider should not be able to change password currently change password api in elasticsearch allows user to change password even if they are authenticated with token authentication provider but this will change with and we should make necessary changes in kibana as well blocked by | 0 |
161,957 | 20,164,341,126 | IssuesEvent | 2022-02-10 01:43:46 | kapseliboi/hybrixd | https://api.github.com/repos/kapseliboi/hybrixd | opened | CVE-2022-0144 (High) detected in shelljs-0.8.4.tgz | security vulnerability | ## CVE-2022-0144 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>shelljs-0.8.4.tgz</b></p></summary>
<p>Portable Unix shell commands for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/shelljs/-/shelljs-0.8.4.tgz">https://registry.npmjs.org/shelljs/-/shelljs-0.8.4.tgz</a></p>
<p>Path to dependency file: /modules/transport/torrent/peer-network-fork/package.json</p>
<p>Path to vulnerable library: /modules/transport/torrent/peer-network-fork/node_modules/shelljs/package.json</p>
<p>
Dependency Hierarchy:
- documentation-13.2.5.tgz (Root Library)
- module-deps-sortable-5.0.3.tgz
- standard-version-9.3.0.tgz
- conventional-changelog-3.1.24.tgz
- conventional-changelog-core-4.2.2.tgz
- :x: **shelljs-0.8.4.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
shelljs is vulnerable to Improper Privilege Management
<p>Publish Date: 2022-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0144>CVE-2022-0144</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/shelljs/shelljs/commit/d919d22dd6de385edaa9d90313075a77f74b338c">https://github.com/shelljs/shelljs/commit/d919d22dd6de385edaa9d90313075a77f74b338c</a></p>
<p>Release Date: 2022-01-11</p>
<p>Fix Resolution: shelljs - 0.8.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-0144 (High) detected in shelljs-0.8.4.tgz - ## CVE-2022-0144 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>shelljs-0.8.4.tgz</b></p></summary>
<p>Portable Unix shell commands for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/shelljs/-/shelljs-0.8.4.tgz">https://registry.npmjs.org/shelljs/-/shelljs-0.8.4.tgz</a></p>
<p>Path to dependency file: /modules/transport/torrent/peer-network-fork/package.json</p>
<p>Path to vulnerable library: /modules/transport/torrent/peer-network-fork/node_modules/shelljs/package.json</p>
<p>
Dependency Hierarchy:
- documentation-13.2.5.tgz (Root Library)
- module-deps-sortable-5.0.3.tgz
- standard-version-9.3.0.tgz
- conventional-changelog-3.1.24.tgz
- conventional-changelog-core-4.2.2.tgz
- :x: **shelljs-0.8.4.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
shelljs is vulnerable to Improper Privilege Management
<p>Publish Date: 2022-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0144>CVE-2022-0144</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/shelljs/shelljs/commit/d919d22dd6de385edaa9d90313075a77f74b338c">https://github.com/shelljs/shelljs/commit/d919d22dd6de385edaa9d90313075a77f74b338c</a></p>
<p>Release Date: 2022-01-11</p>
<p>Fix Resolution: shelljs - 0.8.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | cve high detected in shelljs tgz cve high severity vulnerability vulnerable library shelljs tgz portable unix shell commands for node js library home page a href path to dependency file modules transport torrent peer network fork package json path to vulnerable library modules transport torrent peer network fork node modules shelljs package json dependency hierarchy documentation tgz root library module deps sortable tgz standard version tgz conventional changelog tgz conventional changelog core tgz x shelljs tgz vulnerable library found in base branch master vulnerability details shelljs is vulnerable to improper privilege management publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution shelljs step up your open source security game with whitesource | 0 |
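Because `shelljs` enters the dependency tree above as a deep transitive dependency, it cannot be bumped directly from the root manifest. One common remediation, assuming an npm 8.3+ client (Yarn classic uses a `resolutions` field with the same shape), is to force the patched version from the root `package.json`. This is a hypothetical fragment, not this repository's actual manifest:

```json
{
  "overrides": {
    "shelljs": "0.8.5"
  }
}
```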
272,280 | 29,795,001,518 | IssuesEvent | 2023-06-16 01:03:17 | billmcchesney1/singleton | https://api.github.com/repos/billmcchesney1/singleton | closed | CVE-2021-37714 (High) detected in jsoup-1.8.1.jar - autoclosed | Mend: dependency security vulnerability | ## CVE-2021-37714 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsoup-1.8.1.jar</b></p></summary>
<p>jsoup HTML parser</p>
<p>Library home page: <a href="http://jsoup.org/">http://jsoup.org/</a></p>
<p>Path to dependency file: /g11n-ws/vip-manager-i18n/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.jsoup/jsoup/1.8.1/1eb6690b0c629d7000c77a72df5e822fb974f521/jsoup-1.8.1.jar</p>
<p>
Dependency Hierarchy:
- swagger2markup-1.3.3.jar (Root Library)
- markup-document-builder-1.1.2.jar
- markdown_to_asciidoc-1.0.jar
- :x: **jsoup-1.8.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/singleton/commit/2bb883ae0b199f3e432621e91d0f801cfc406a89">2bb883ae0b199f3e432621e91d0f801cfc406a89</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
jsoup is a Java library for working with HTML. Those using jsoup versions prior to 1.14.2 to parse untrusted HTML or XML may be vulnerable to DOS attacks. If the parser is run on user supplied input, an attacker may supply content that causes the parser to get stuck (loop indefinitely until cancelled), to complete more slowly than usual, or to throw an unexpected exception. This effect may support a denial of service attack. The issue is patched in version 1.14.2. There are a few available workarounds. Users may rate limit input parsing, limit the size of inputs based on system resources, and/or implement thread watchdogs to cap and timeout parse runtimes.
<p>Publish Date: 2021-08-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37714>CVE-2021-37714</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://jsoup.org/news/release-1.14.2">https://jsoup.org/news/release-1.14.2</a></p>
<p>Release Date: 2021-08-18</p>
<p>Fix Resolution: org.jsoup:jsoup:1.14.2</p>
</p>
</details>
<p></p>
| True | CVE-2021-37714 (High) detected in jsoup-1.8.1.jar - autoclosed - ## CVE-2021-37714 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsoup-1.8.1.jar</b></p></summary>
<p>jsoup HTML parser</p>
<p>Library home page: <a href="http://jsoup.org/">http://jsoup.org/</a></p>
<p>Path to dependency file: /g11n-ws/vip-manager-i18n/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.jsoup/jsoup/1.8.1/1eb6690b0c629d7000c77a72df5e822fb974f521/jsoup-1.8.1.jar</p>
<p>
Dependency Hierarchy:
- swagger2markup-1.3.3.jar (Root Library)
- markup-document-builder-1.1.2.jar
- markdown_to_asciidoc-1.0.jar
- :x: **jsoup-1.8.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/singleton/commit/2bb883ae0b199f3e432621e91d0f801cfc406a89">2bb883ae0b199f3e432621e91d0f801cfc406a89</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
jsoup is a Java library for working with HTML. Those using jsoup versions prior to 1.14.2 to parse untrusted HTML or XML may be vulnerable to DOS attacks. If the parser is run on user supplied input, an attacker may supply content that causes the parser to get stuck (loop indefinitely until cancelled), to complete more slowly than usual, or to throw an unexpected exception. This effect may support a denial of service attack. The issue is patched in version 1.14.2. There are a few available workarounds. Users may rate limit input parsing, limit the size of inputs based on system resources, and/or implement thread watchdogs to cap and timeout parse runtimes.
<p>Publish Date: 2021-08-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37714>CVE-2021-37714</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://jsoup.org/news/release-1.14.2">https://jsoup.org/news/release-1.14.2</a></p>
<p>Release Date: 2021-08-18</p>
<p>Fix Resolution: org.jsoup:jsoup:1.14.2</p>
</p>
</details>
<p></p>
| non_main | cve high detected in jsoup jar autoclosed cve high severity vulnerability vulnerable library jsoup jar jsoup html parser library home page a href path to dependency file ws vip manager build gradle path to vulnerable library home wss scanner gradle caches modules files org jsoup jsoup jsoup jar dependency hierarchy jar root library markup document builder jar markdown to asciidoc jar x jsoup jar vulnerable library found in head commit a href found in base branch master vulnerability details jsoup is a java library for working with html those using jsoup versions prior to to parse untrusted html or xml may be vulnerable to dos attacks if the parser is run on user supplied input an attacker may supply content that causes the parser to get stuck loop indefinitely until cancelled to complete more slowly than usual or to throw an unexpected exception this effect may support a denial of service attack the issue is patched in version there are a few available workarounds users may rate limit input parsing limit the size of inputs based on system resources and or implement thread watchdogs to cap and timeout parse runtimes publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org jsoup jsoup | 0 |
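The jsoup record above lists thread watchdogs that cap and time out parse runtimes as a workaround. The idea is language-agnostic; below is a minimal Python sketch of it, with dummy functions standing in for the real (Java) parser:

```python
import concurrent.futures
import time

def parse_with_timeout(parse_fn, payload, timeout_s):
    """Run an untrusted parse in a worker thread and cap its runtime.

    Caveat: a timed-out worker thread is abandoned, not killed; a
    production watchdog would run the parser in a separate process
    that can actually be terminated.
    """
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(parse_fn, payload)
    try:
        return future.result(timeout=timeout_s)
    except concurrent.futures.TimeoutError:
        raise TimeoutError("parse exceeded %.1fs budget" % timeout_s)
    finally:
        pool.shutdown(wait=False)  # do not block the caller on a stuck worker

def fast_parse(doc):
    # stands in for a well-behaved parse
    return doc.upper()

def stuck_parse(doc):
    # stands in for a parser looping on crafted input
    time.sleep(1.0)
    return doc
```

Rate limiting and input-size caps, the other workarounds the advisory mentions, would be applied before this call ever runs.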
177,093 | 14,615,498,792 | IssuesEvent | 2020-12-22 11:38:42 | markmuetz/remake | https://api.github.com/repos/markmuetz/remake | opened | Documentation | documentation | Basic documentation on how to use remake. Upload to doc hosting site:
- installation
- quickstart
- running
- CLI
- python API | 1.0 | Documentation - Basic documentation on how to use remake. Upload to doc hosting site:
- installation
- quickstart
- running
- CLI
- python API | non_main | documentation basic documentation on how to use remake upload to doc hosting site installation quickstart running cli python api | 0 |
4,720 | 24,342,585,935 | IssuesEvent | 2022-10-01 22:25:46 | beekama/NutritionApp | https://api.github.com/repos/beekama/NutritionApp | closed | dividerItemDecorator duplicated Code | invalid maintainability | `dividerItemDecorator` seems to have a lot of similar definitions. Extract. | True | dividerItemDecorator duplicated Code - `dividerItemDecorator` seems to have a lot of similar definitions. Extract. | main | divideritemdecorator duplicated code divideritemdecorator seems to have a lot of similar definitions extract | 1 |
163,333 | 12,718,436,715 | IssuesEvent | 2020-06-24 07:31:42 | MachoThemes/modula-lite | https://api.github.com/repos/MachoThemes/modula-lite | closed | Social Sharing | bug need testing | https://github.com/MachoThemes/modula-lite/blob/master/assets/js/jquery-modula.js#L570
Here the text will always be document.title because we don't have settings for social texts.
Also, the caption element is commented out.
Check all platforms for this. | 1.0 | Social Sharing - https://github.com/MachoThemes/modula-lite/blob/master/assets/js/jquery-modula.js#L570
Here the text will always be document.title because we don't have settings for social texts.
Also, the caption element is commented out.
Check all platforms for this. | non_main | social sharing here the text will always be document title because we don t have settings for social texts also the caption element it s commented check all platforms for this | 0 |
5,132 | 26,151,672,846 | IssuesEvent | 2022-12-30 14:37:16 | camunda/zeebe | https://api.github.com/repos/camunda/zeebe | closed | Analysis fails in stable/1.0 and 1.1 | kind/toil area/maintainability | **Description**
In these two stable branches the analysis fails because `analyse-java.sh` still references the `develop` branch.
Not sure if this is supposed to be fixed, or if those branches get retired.
| True | Analysis fails in stable/1.0 and 1.1 - **Description**
In these two stable branches the analysis fails because `analyse-java.sh` still references the `develop` branch.
Not sure if this is supposed to be fixed, or if those branches get retired.
| main | analysis fails in stable and description in these two stable branches the analysis fails because analyse java sh still references develop branch not sure if this is supposed to be fixed or if those branches get reitred | 1 |
248,732 | 18,858,111,726 | IssuesEvent | 2021-11-12 09:23:52 | giterator/pe | https://api.github.com/repos/giterator/pe | opened | User Guide: Example explanation of `list` is not consistent with the screenshot | severity.Low type.DocumentationBug | It is stated that `Jerry, Lewis, Smith` are those that have made orders. However, the image includes the names `Adam, Jerry, Lee, Lewis`.
The reader may be confused as to why `Adam` and `Lee` are included.

<!--session: 1636706275629-38028469-9081-4b49-90d1-6da129f7c6a3-->
<!--Version: Web v3.4.1--> | 1.0 | User Guide: Example explanation of `list` is not consistent with the screenshot - It is stated that `Jerry, Lewis, Smith` are the those that have made orders. However, the image includes the names `Adam, Jerry, Lee, Lewis`.
The reader may be confused as to why `Adam` and `Lee` are included.

<!--session: 1636706275629-38028469-9081-4b49-90d1-6da129f7c6a3-->
<!--Version: Web v3.4.1--> | non_main | user guide example explanation of list is not consistent with the screenshot it is stated that jerry lewis smith are the those that have made orders however the image includes the names adam jerry lee lewis the reader may be confused as to why adam and lee are included | 0 |
39,453 | 8,649,540,632 | IssuesEvent | 2018-11-26 19:43:22 | PennyDreadfulMTG/perf-reports | https://api.github.com/repos/PennyDreadfulMTG/perf-reports | closed | 500 error at /authenticate/callback/ | MissingCodeError decksite wontfix | (missing_code) Missing code parameter in response.
Reported on decksite by 270302348956401664
```
--------------------------------------------------------------------------------
Request Method: GET
Path: /authenticate/callback/?state=xaAEN01O1hlrRXhviax1xCNpa6fM7Y
Cookies: {'session': '.eJx9UF1Pg0AQ_C_7TMwBx-cbVmuNacSS1Kgx5IQFDvkod4eFNv3vnk2sbz7uzOzM7hwh5zLrRZ7yHEKwPGITy6Z-4LiUmK5LwYB_mC4tR95ovmCNRAN6NqrKSqViCvVS1D86fN7zuWVtMi9jy9vix56vWbBZtge46FX_iR2ER2BZhlL-zvDQPcnV1yIZlFiV1FzUnRUl8Ywv03Qbb_Q-TjsuUKZMQWg6tk88agfOlRn41HPoH8-1m0uoT4gBAgsNVZcQ6Ud-sr2_KZ7d9WuV06E-1DEZ9mTIY36nQ3Q_O_3Nmy4CO8WLWWPntyW8G3C2SdX8I4FrZAIFnDTMRIn6qm5smtM37EJxsg.DoV-Gw.l2iTJEtB8FTmpCuPajtSAlIkWWg', '_gid': 'GA1.2.965532229.1537366240', '_gat_gtag_UA_109131120_1': '1', 'hide_intro': 'True', '_ga': 'GA1.2.1726144769.1537088738', '__cfduid': 'd09ef5fa9d7a91ccf0f8658dd79b0708c1537088735'}
Endpoint: authenticate_callback
View Args: {}
Person: 270302348956401664
Referrer: https://discordapp.com/oauth2/authorize?response_type=code&client_id=338056190779195392&redirect_uri=https%3A%2F%2Fpennydreadfulmagic.com%2Fauthenticate%2Fcallback%2F&scope=identify+guilds&state=AoO5iywiymamSyFP27VebwiMa9RFmz
Request Data: {}
Host: pennydreadfulmagic.com
Accept-Encoding: gzip
Cf-Ipcountry: DE
X-Forwarded-For: 85.195.101.212, 141.101.107.20
Cf-Ray: 45d67e6d3f2c6b5b-LHR
X-Forwarded-Proto: https
Cf-Visitor: {"scheme":"https"}
Referer: https://discordapp.com/oauth2/authorize?response_type=code&client_id=338056190779195392&redirect_uri=https%3A%2F%2Fpennydreadfulmagic.com%2Fauthenticate%2Fcallback%2F&scope=identify+guilds&state=AoO5iywiymamSyFP27VebwiMa9RFmz
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134
Accept-Language: de-DE
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Upgrade-Insecure-Requests: 1
Cookie: session=.eJx9UF1Pg0AQ_C_7TMwBx-cbVmuNacSS1Kgx5IQFDvkod4eFNv3vnk2sbz7uzOzM7hwh5zLrRZ7yHEKwPGITy6Z-4LiUmK5LwYB_mC4tR95ovmCNRAN6NqrKSqViCvVS1D86fN7zuWVtMi9jy9vix56vWbBZtge46FX_iR2ER2BZhlL-zvDQPcnV1yIZlFiV1FzUnRUl8Ywv03Qbb_Q-TjsuUKZMQWg6tk88agfOlRn41HPoH8-1m0uoT4gBAgsNVZcQ6Ud-sr2_KZ7d9WuV06E-1DEZ9mTIY36nQ3Q_O_3Nmy4CO8WLWWPntyW8G3C2SdX8I4FrZAIFnDTMRIn6qm5smtM37EJxsg.DoV-Gw.l2iTJEtB8FTmpCuPajtSAlIkWWg; _gid=GA1.2.965532229.1537366240; _gat_gtag_UA_109131120_1=1; hide_intro=True; _ga=GA1.2.1726144769.1537088738; __cfduid=d09ef5fa9d7a91ccf0f8658dd79b0708c1537088735
Cf-Connecting-Ip: 85.195.101.212
X-Forwarded-Host: pennydreadfulmagic.com
X-Forwarded-Server: pennydreadfulmagic.com
Connection: Keep-Alive
```
--------------------------------------------------------------------------------
MissingCodeError
(missing_code) Missing code parameter in response.
Stack Trace:
```
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2309, in __call__
return self.wsgi_app(environ, start_response)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2295, in wsgi_app
response = self.handle_exception(e)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2292, in wsgi_app
response = self.full_dispatch_request()
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1815, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1718, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/discord/.local/lib/python3.6/site-packages/flask/_compat.py", line 35, in reraise
raise value
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1813, in full_dispatch_request
rv = self.dispatch_request()
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1799, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "./shared_web/flask_app.py", line 97, in authenticate_callback
oauth.setup_session(request.url)
File "./shared_web/oauth.py", line 26, in setup_session
authorization_response=url)
File "/home/discord/.local/lib/python3.6/site-packages/requests_oauthlib/oauth2_session.py", line 187, in fetch_token
state=self._state)
File "/home/discord/.local/lib/python3.6/site-packages/oauthlib/oauth2/rfc6749/clients/web_application.py", line 174, in parse_request_uri_response
response = parse_authorization_code_response(uri, state=state)
File "/home/discord/.local/lib/python3.6/site-packages/oauthlib/oauth2/rfc6749/parameters.py", line 227, in parse_authorization_code_response
raise MissingCodeError("Missing code parameter in response.")
```
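The traceback above ends in `MissingCodeError`: `fetch_token` is handed a redirect URL that carries only a `state` parameter and no `code` (typically when the user denies authorization or replays a stale callback). A minimal sketch of a guard such a callback could apply before calling `setup_session`; the `has_auth_code` helper is a hypothetical name, not part of the codebase shown in the trace:

```python
from urllib.parse import urlparse, parse_qs

def has_auth_code(callback_url: str) -> bool:
    """Return True only if the OAuth redirect URL carries a ``code`` parameter.

    requests-oauthlib's fetch_token() raises MissingCodeError when the
    parameter is absent, so a callback view can use this guard to redirect
    the user back to login instead of returning a 500.
    """
    return "code" in parse_qs(urlparse(callback_url).query)

# The failing request from the report above carried only ``state``:
print(has_auth_code("https://pennydreadfulmagic.com/authenticate/callback/?state=xaAEN01O1hlrRXhviax1xCNpa6fM7Y"))  # False
print(has_auth_code("https://example.com/authenticate/callback/?code=abc123&state=xyz"))  # True
```

With such a check the handler can fall back to a redirect whenever the provider sends the user back without a code.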
| 1.0 | 500 error at /authenticate/callback/ - (missing_code) Missing code parameter in response. | non_main | 0 |
47,376 | 13,056,152,167 | IssuesEvent | 2020-07-30 03:49:07 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | closed | GCD generation warning/error (Trac #451) | I3Db Migrated from Trac defect | Hi Georges,
I'm getting the following warning from the I3Db project when I try to
generate GCD files:
=========
/net/user/i3filter/IC86_OfflineProcessing/icerec/IC2012-L2_V12-09-00_dev/I3Db/private/I3Db/I3DbDetectorStatusService.cxx:247:
WARN : I3DbDetectorStatus:GetDetectorStatus-GetRunSummary:Change=01;Run=00120900;DomConfId=00000841;TriggerConfigId=00001211;
=========
This error only results from using the latest release of the project
V12-08-00. I was trying to generate a GCD file for an IC86 run 118772
taken on 2011-10-10 and the warning suggests it is picking a
DetectorStatus config for a wrong run number. I looked into the
created GCD file and found a "StartTime" of sometime in 2012 in the D
frame. The 120900 reported in the warning message is a run taken
yesterday so it looks like it is reporting the latest run in the DB.
The I3Db release V12-06-00 used for IC86_1 production does not have
this problem. There seems to be a mismatch between the latest release
and the DB entries. Chris W. has also seen similar problems but I
think related to the Geometry frame, he is copied in this mail and
I'll let him explain the situation properly. The issues may be linked
and this may help with your effort at a fix.
Thanks for looking into this.
Migrated from https://code.icecube.wisc.edu/ticket/451
```json
{
"status": "closed",
"changetime": "2012-11-26T22:43:56",
"description": "Hi Georges,\nI'm getting the following warning from the I3Db project when I try to\ngenerate GCD files:\n\n=========\n/net/user/i3filter/IC86_OfflineProcessing/icerec/IC2012-L2_V12-09-00_dev/I3Db/private/I3Db/I3DbDetectorStatusService.cxx:247:\nWARN : I3DbDetectorStatus:GetDetectorStatus-GetRunSummary:Change=01;Run=00120900;DomConfId=00000841;TriggerConfigId=00001211;\n=========\n\nThis error only results from using the latest release of the project\nV12-08-00. I was trying to generate a GCD file for an IC86 run 118772\ntaken on 2011-10-10 and the warning suggests it is picking a\nDetectorStatus config for a wrong run number. I looked into the\ncreated GCD file and found a \"StartTime\" of sometime in 2012 in the D\nframe. The 120900 reported in the warning message is a run taken\nyesterday so it looks like it is reporting the latest run in the DB.\n\nThe I3Db release V12-06-00 used for IC86_1 production does not have\nthis problem. There seems to be a mismatch between the latest release\nand the DB entries. Chris W. has also seen similar problems but I\nthink related to the Geometry frame, he is copied in this mail and\nI'll let him explain the situation properly. The issues may be linked\nand this may help with your effort at a fix.\n\nThanks for looking into this.",
"reporter": "icecube",
"cc": "",
"resolution": "invalid",
"_ts": "1353969836000000",
"component": "I3Db",
"summary": "GCD generation warning/error",
"priority": "normal",
"keywords": "",
"time": "2012-11-26T22:04:20",
"milestone": "",
"owner": "ofadiran",
"type": "defect"
}
```
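The symptom above — run 118772 requested, but the config for the newest run 120900 returned — is consistent with a lookup that ignores the run's validity range. A toy illustration of the intended selection (pick the latest entry whose first valid run does not exceed the requested run); the `status_for_run` function and its data are hypothetical, not I3Db code:

```python
# Toy model of the suspected lookup bug: detector-status configs become
# valid at a given run; the service should return the entry covering the
# requested run, not simply the newest entry in the database.
def status_for_run(entries, run):
    """entries: (first_valid_run, config) pairs sorted by first_valid_run."""
    chosen = None
    for first_valid_run, config in entries:
        if first_valid_run <= run:
            chosen = config
        else:
            break
    return chosen

entries = [(100000, "cfg-2011"), (119000, "cfg-2012"), (120900, "cfg-latest")]
print(status_for_run(entries, 118772))  # cfg-2011 -- not cfg-latest
```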
| 1.0 | GCD generation warning/error (Trac #451) | non_main | 0 |
5,750 | 30,471,762,918 | IssuesEvent | 2023-07-17 14:03:44 | centerofci/mathesar | https://api.github.com/repos/centerofci/mathesar | opened | Gracefully recover from failed type inference during import | type: enhancement work: frontend status: ready restricted: maintainers | ## Current behavior
1. Sometimes column type inference fails during import. This can happen due to a variety of reasons described in #2346.
I'm able to reliably reproduce an inference failure by attempting to import this CSV:
[small-long.csv](https://github.com/centerofci/mathesar/files/12070026/small-long.csv)
1. When inference fails, Mathesar presents the user with this screen:

At this point, the user's data **has already been imported**, but it just lives in a table with all Text columns.
From here the user has the following options:
- "Delete import", or "Cancel" which wipes their imported data
- "Retry" (which will eventually lead back to this screen)
- Navigate to the Schema page and then click on the table, at which point Mathesar will attempt to perform inference again (similar to "Retry")
The user does **not** have an option to skip the inference and keep the imported table as plain text.
## Desired behavior
- We should give the user an option to skip the column type inference and keep the imported table with all columns as the "Text" type.
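The requested fallback — attempt inference, and on any failure keep every column as plain text — can be sketched as follows; the simplistic `infer_type` helper is an illustrative assumption, not Mathesar's actual inference logic:

```python
import csv, io

def infer_type(values):
    # Deliberately simple inference: integer, then float, else text.
    for cast, name in ((int, "integer"), (float, "float")):
        try:
            for v in values:
                cast(v)
            return name
        except ValueError:
            continue
    return "text"

def infer_columns(csv_text):
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    try:
        return {h: infer_type([row[i] for row in data])
                for i, h in enumerate(header)}
    except Exception:
        # Inference failed for any reason: keep the import, all columns text.
        return {h: "text" for h in header}

print(infer_columns("a,b\n1,x\n2,y"))  # {'a': 'integer', 'b': 'text'}
print(infer_columns("a,b\n1\n2,y"))    # ragged row -> {'a': 'text', 'b': 'text'}
```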
| True | Gracefully recover from failed type inference during import | main | 1 |
2,150 | 7,459,737,469 | IssuesEvent | 2018-03-30 16:36:43 | WhitestormJS/whs.js | https://api.github.com/repos/WhitestormJS/whs.js | closed | Update to three.js r90 | MAINTAINANCE | Update the three.js dependency; it now sits at r90.0.
A number of fixes and improvements were made since r87.
###### Version:
- [x] v2.x.x
- [ ] v1.x.x
###### Issue type:
- [ ] Bug
- [x] Proposal/Enhancement
- [ ] Question
- [ ] Discussion
------
<details>
<summary> <b>Tested on: </b> </summary>
###### Desktop
- [ ] Chrome
- [ ] Chrome Canary
- [ ] Chrome dev-channel
- [ ] Firefox
- [ ] Opera
- [ ] Microsoft IE
- [ ] Microsoft Edge
###### Android
- [ ] Chrome
- [ ] Firefox
- [ ] Opera
###### IOS
- [ ] Chrome
- [ ] Firefox
- [ ] Opera
</details>
| True | Update to three.js r90 | main | 1 |
69,473 | 14,988,725,638 | IssuesEvent | 2021-01-29 01:58:19 | MValle21/oathkeeper | https://api.github.com/repos/MValle21/oathkeeper | opened | CVE-2019-0205 (High) detected in github.com/uber/jaeger-client-go/thrift-fe3fa553c313b32f58cc684a59a4d48f03e07df9, github.com/uber/jaeger-client-go-fe3fa553c313b32f58cc684a59a4d48f03e07df9 | security vulnerability | ## CVE-2019-0205 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>github.com/uber/jaeger-client-go/thrift-fe3fa553c313b32f58cc684a59a4d48f03e07df9</b>, <b>github.com/uber/jaeger-client-go-fe3fa553c313b32f58cc684a59a4d48f03e07df9</b></p></summary>
<p>
<details><summary><b>github.com/uber/jaeger-client-go/thrift-fe3fa553c313b32f58cc684a59a4d48f03e07df9</b></p></summary>
<p>Jaeger Bindings for Go OpenTracing API.</p>
<p>
Dependency Hierarchy:
- github.com/ory/oathkeeper/pipeline-eb53de71bfdc0bc10448220fda211321b054d811 (Root Library)
- github.com/ory/oathkeeper/driver/configuration-eb53de71bfdc0bc10448220fda211321b054d811
- github.com/ory/x/tracing-d066f77955a58f5d62414e23d6cfe3a68b6cb57a
- github.com/uber/jaeger-client-go/zipkin-fe3fa553c313b32f58cc684a59a4d48f03e07df9
- github.com/uber/jaeger-client-go-fe3fa553c313b32f58cc684a59a4d48f03e07df9
- github.com/uber/jaeger-client-go/thrift-gen/agent-fe3fa553c313b32f58cc684a59a4d48f03e07df9
- :x: **github.com/uber/jaeger-client-go/thrift-fe3fa553c313b32f58cc684a59a4d48f03e07df9** (Vulnerable Library)
</details>
<details><summary><b>github.com/uber/jaeger-client-go-fe3fa553c313b32f58cc684a59a4d48f03e07df9</b></p></summary>
<p>Jaeger Bindings for Go OpenTracing API.</p>
<p>
Dependency Hierarchy:
- github.com/ory/oathkeeper/pipeline-eb53de71bfdc0bc10448220fda211321b054d811 (Root Library)
- github.com/ory/oathkeeper/driver/configuration-eb53de71bfdc0bc10448220fda211321b054d811
- github.com/ory/x/tracing-d066f77955a58f5d62414e23d6cfe3a68b6cb57a
- github.com/uber/jaeger-client-go/zipkin-fe3fa553c313b32f58cc684a59a4d48f03e07df9
- :x: **github.com/uber/jaeger-client-go-fe3fa553c313b32f58cc684a59a4d48f03e07df9** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/MValle21/oathkeeper/commit/43c00a05bdb772edb5194a57f42ee834b37f3774">43c00a05bdb772edb5194a57f42ee834b37f3774</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Thrift all versions up to and including 0.12.0, a server or client may run into an endless loop when feed with specific input data. Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.
<p>Publish Date: 2019-10-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-0205>CVE-2019-0205</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205</a></p>
<p>Release Date: 2019-10-29</p>
<p>Fix Resolution: org.apache.thrift:libthrift:0.13.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"GO","packageName":"github.com/uber/jaeger-client-go/thrift","packageVersion":"fe3fa553c313b32f58cc684a59a4d48f03e07df9","isTransitiveDependency":true,"dependencyTree":"github.com/ory/oathkeeper/pipeline:eb53de71bfdc0bc10448220fda211321b054d811;github.com/ory/oathkeeper/driver/configuration:eb53de71bfdc0bc10448220fda211321b054d811;github.com/ory/x/tracing:d066f77955a58f5d62414e23d6cfe3a68b6cb57a;github.com/uber/jaeger-client-go/zipkin:fe3fa553c313b32f58cc684a59a4d48f03e07df9;github.com/uber/jaeger-client-go:fe3fa553c313b32f58cc684a59a4d48f03e07df9;github.com/uber/jaeger-client-go/thrift-gen/agent:fe3fa553c313b32f58cc684a59a4d48f03e07df9;github.com/uber/jaeger-client-go/thrift:fe3fa553c313b32f58cc684a59a4d48f03e07df9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.thrift:libthrift:0.13.0"},{"packageType":"GO","packageName":"github.com/uber/jaeger-client-go","packageVersion":"fe3fa553c313b32f58cc684a59a4d48f03e07df9","isTransitiveDependency":true,"dependencyTree":"github.com/ory/oathkeeper/pipeline:eb53de71bfdc0bc10448220fda211321b054d811;github.com/ory/oathkeeper/driver/configuration:eb53de71bfdc0bc10448220fda211321b054d811;github.com/ory/x/tracing:d066f77955a58f5d62414e23d6cfe3a68b6cb57a;github.com/uber/jaeger-client-go/zipkin:fe3fa553c313b32f58cc684a59a4d48f03e07df9;github.com/uber/jaeger-client-go:fe3fa553c313b32f58cc684a59a4d48f03e07df9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.thrift:libthrift:0.13.0"}],"vulnerabilityIdentifier":"CVE-2019-0205","vulnerabilityDetails":"In Apache Thrift all versions up to and including 0.12.0, a server or client may run into an endless loop when feed with specific input data. 
Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-0205","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2019-0205 (High) detected in github.com/uber/jaeger-client-go/thrift-fe3fa553c313b32f58cc684a59a4d48f03e07df9, github.com/uber/jaeger-client-go-fe3fa553c313b32f58cc684a59a4d48f03e07df9 | non_main |
high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache thrift libthrift isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in apache thrift all versions up to and including a server or client may run into an endless loop when feed with specific input data because the issue had already been partially fixed in version depending on the installed version it affects only certain language bindings vulnerabilityurl | 0 |
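Since the vulnerable Thrift code in the record above is vendored inside `jaeger-client-go` (pinned to a raw commit hash), the Maven coordinate named in the fix resolution does not apply directly to a Go build; one remediation path is to move the pin to a tagged release whose bundled Thrift code contains the CVE-2019-0205 loop fix. An illustrative `go.mod` fragment — the version shown is a placeholder and should be confirmed against the advisory:

```
// go.mod — illustrative only; pick the release confirmed by the advisory
require github.com/uber/jaeger-client-go v2.30.0+incompatible // placeholder version
```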
189,280 | 14,497,292,870 | IssuesEvent | 2020-12-11 14:02:37 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | itsivareddy/terrafrom-Oci: oci/waas_address_list_test.go; 16 LoC | fresh small test |
Found a possible issue in [itsivareddy/terrafrom-Oci](https://www.github.com/itsivareddy/terrafrom-Oci) at [oci/waas_address_list_test.go](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_address_list_test.go#L282-L297)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> reference to addressListId is reassigned at line 286
[Click here to see the code in its original context.](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_address_list_test.go#L282-L297)
<details>
<summary>Click here to show the 16 line(s) of Go which triggered the analyzer.</summary>
```go
for _, addressListId := range addressListIds {
	if ok := SweeperDefaultResourceId[addressListId]; !ok {
		deleteAddressListRequest := oci_waas.DeleteAddressListRequest{}
		deleteAddressListRequest.AddressListId = &addressListId
		deleteAddressListRequest.RequestMetadata.RetryPolicy = getRetryPolicy(true, "waas")
		_, error := waasClient.DeleteAddressList(context.Background(), deleteAddressListRequest)
		if error != nil {
			fmt.Printf("Error deleting AddressList %s %s, It is possible that the resource is already deleted. Please verify manually \n", addressListId, error)
			continue
		}
		waitTillCondition(testAccProvider, &addressListId, addressListSweepWaitCondition, time.Duration(3*time.Minute),
			addressListSweepResponseFetchOperation, "waas", true)
	}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 075608a9e201ee0e32484da68d5ba5370dfde1be
| 1.0 | itsivareddy/terrafrom-Oci: oci/waas_address_list_test.go; 16 LoC -
Found a possible issue in [itsivareddy/terrafrom-Oci](https://www.github.com/itsivareddy/terrafrom-Oci) at [oci/waas_address_list_test.go](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_address_list_test.go#L282-L297)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> reference to addressListId is reassigned at line 286
[Click here to see the code in its original context.](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/waas_address_list_test.go#L282-L297)
<details>
<summary>Click here to show the 16 line(s) of Go which triggered the analyzer.</summary>
```go
for _, addressListId := range addressListIds {
	if ok := SweeperDefaultResourceId[addressListId]; !ok {
		deleteAddressListRequest := oci_waas.DeleteAddressListRequest{}
		deleteAddressListRequest.AddressListId = &addressListId
		deleteAddressListRequest.RequestMetadata.RetryPolicy = getRetryPolicy(true, "waas")
		_, error := waasClient.DeleteAddressList(context.Background(), deleteAddressListRequest)
		if error != nil {
			fmt.Printf("Error deleting AddressList %s %s, It is possible that the resource is already deleted. Please verify manually \n", addressListId, error)
			continue
		}
		waitTillCondition(testAccProvider, &addressListId, addressListSweepWaitCondition, time.Duration(3*time.Minute),
			addressListSweepResponseFetchOperation, "waas", true)
	}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 075608a9e201ee0e32484da68d5ba5370dfde1be
| non_main | itsivareddy terrafrom oci oci waas address list test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message reference to addresslistid is reassigned at line click here to show the line s of go which triggered the analyzer go for addresslistid range addresslistids if ok sweeperdefaultresourceid ok deleteaddresslistrequest oci waas deleteaddresslistrequest deleteaddresslistrequest addresslistid addresslistid deleteaddresslistrequest requestmetadata retrypolicy getretrypolicy true waas error waasclient deleteaddresslist context background deleteaddresslistrequest if error nil fmt printf error deleting addresslist s s it is possible that the resource is already deleted please verify manually n addresslistid error continue waittillcondition testaccprovider addresslistid addresslistsweepwaitcondition time duration time minute addresslistsweepresponsefetchoperation waas true leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
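The finding above is the classic Go range-variable aliasing pattern: `&addressListId` points at the single loop variable, which is overwritten on every iteration. A minimal, self-contained sketch of the usual fix (shadow the variable with a per-iteration copy; names here are hypothetical, and note that Go 1.22 changed loop-variable scoping so this class of bug no longer occurs on newer toolchains):

```go
package main

import "fmt"

// collect stores one pointer per element. Without the shadowing copy,
// under Go versions before 1.22 every stored pointer would alias the
// same loop variable and end up pointing at the last element.
func collect(ids []string) []*string {
	out := make([]*string, 0, len(ids))
	for _, id := range ids {
		id := id // per-iteration copy; &id is now unique to this iteration
		out = append(out, &id)
	}
	return out
}

func main() {
	for _, p := range collect([]string{"a", "b", "c"}) {
		fmt.Print(*p, " ")
	}
	fmt.Println()
}
```

In the snippet above the same idea would apply to `&addressListId` before it is stored or passed to `waitTillCondition`.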
180,077 | 6,643,184,068 | IssuesEvent | 2017-09-27 10:16:27 | ntop/ntopng | https://api.github.com/repos/ntop/ntopng | closed | Aggregated interface names do not honour the expected format | Low-priority bug | <img width="296" alt="screen shot 2017-09-21 at 15 48 02" src="https://user-images.githubusercontent.com/4493366/30699181-4f462f7e-9ee4-11e7-8204-f58e0c77b56f.png">
| 1.0 | Aggregated interface names do not honour the expected format - <img width="296" alt="screen shot 2017-09-21 at 15 48 02" src="https://user-images.githubusercontent.com/4493366/30699181-4f462f7e-9ee4-11e7-8204-f58e0c77b56f.png">
| non_main | aggregated interface names do not honour the expected format img width alt screen shot at src | 0 |
121,976 | 12,138,415,393 | IssuesEvent | 2020-04-23 17:12:21 | Revolutionary-Games/Thrive | https://api.github.com/repos/Revolutionary-Games/Thrive | closed | Improve setup instructions | documentation | I have some random points to improve the setup instructions with:
- [ ] make sure you have new mono with msbuild included. Fedora has mono in the official repo but it is way too old to work
- [ ] add bigger note about needing MONO VERSION not STANDARD VERSION.
Otherwise you get errors like:
```
ERROR: No loader found for resource: res://scripts/gui/MainMenu.cs.
```
- [ ] on windows make sure autocrlf is on
- [ ] git book resource: https://git-scm.com/book/en/v2
- [ ] git tutorial videos:
https://www.youtube.com/watch?v=SWYqp7iY_Tc
https://www.youtube.com/watch?v=HVsySz-h9r4
- [ ] jsonlint is now used, remove note from doc/setup_instructions.md
- [ ] if npm install -g doesn't work on linux, add `sudo` | 1.0 | Improve setup instructions - I have some random points to improve the setup instructions with:
- [ ] make sure you have new mono with msbuild included. Fedora has mono in the official repo but it is way too old to work
- [ ] add bigger note about needing MONO VERSION not STANDARD VERSION.
Otherwise you get errors like:
```
ERROR: No loader found for resource: res://scripts/gui/MainMenu.cs.
```
- [ ] on windows make sure autocrlf is on
- [ ] git book resource: https://git-scm.com/book/en/v2
- [ ] git tutorial videos:
https://www.youtube.com/watch?v=SWYqp7iY_Tc
https://www.youtube.com/watch?v=HVsySz-h9r4
- [ ] jsonlint is now used, remove note from doc/setup_instructions.md
- [ ] if npm install -g doesn't work on linux, add `sudo` | non_main | improve setup instructions i have some random points to improve the setup instructions with make sure you have new mono with msbuild included fedora has mono in the official repo but it is way too old to work add bigger note about needing mono version not standard version otherwise you get errors like error no loader found for resource res scripts gui mainmenu cs on windows make sure autocrlf is on git book resource git tutorial videos jsonlint is now used remove note from doc setup instructions md if npm install g doesn t work on linux add sudo | 0 |
1,563 | 6,572,257,707 | IssuesEvent | 2017-09-11 00:42:13 | ansible/ansible-modules-extras | https://api.github.com/repos/ansible/ansible-modules-extras | closed | Deployment kind support on Kubernetes module | affects_2.1 feature_idea waiting_on_maintainer | <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Feature Idea
##### COMPONENT NAME
<!--- Name of the plugin/module/task -->
clustering/kubernetes
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.1.0.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### OS / ENVIRONMENT
<!---
Mention the OS you are running Ansible from, and the OS you are
managing, or say “N/A” for anything that is not platform-specific.
-->
CentOS Linux release 7.2.1511 (Core)
##### SUMMARY
<!--- Explain the problem briefly -->
It would be great if the Ansible kubernetes module could support the "Deployment" kind.
Issue reference: https://github.com/ansible/ansible-modules-extras/issues/2477.
Thanks in advance.
| True | Deployment kind support on Kubernetes module - <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Feature Idea
##### COMPONENT NAME
<!--- Name of the plugin/module/task -->
clustering/kubernetes
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.1.0.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### OS / ENVIRONMENT
<!---
Mention the OS you are running Ansible from, and the OS you are
managing, or say “N/A” for anything that is not platform-specific.
-->
CentOS Linux release 7.2.1511 (Core)
##### SUMMARY
<!--- Explain the problem briefly -->
It would be great if the Ansible kubernetes module could support the "Deployment" kind.
Issue reference: https://github.com/ansible/ansible-modules-extras/issues/2477.
Thanks in advance.
| main | deployment kind support on kubernetes module issue type feature idea component name clustering kubernetes ansible version ansible config file etc ansible ansible cfg configured module search path default w o overrides os environment mention the os you are running ansible from and the os you are managing or say “n a” for anything that is not platform specific centos linux release core summary it would be great if ansible kubernetes module could support deployment kind issue reference thanks in advance | 1 |
125,791 | 16,832,003,042 | IssuesEvent | 2021-06-18 06:51:21 | WordPress/pattern-directory | https://api.github.com/repos/WordPress/pattern-directory | opened | Pattern Submission: Major quote with parallax background | [Status] Needs Design Feedback [Type] Pattern Submission | <!-- Use this area to share an overview of this pattern and why you feel it should be included. -->
Title says it all.
---
## Pattern Title
<!-- Choose a simple, descriptive title for your pattern. -->
Major quote with parallax background
## Pattern Categories
<!-- e.g. Buttons, Columns, Gallery, Header, Text. -->
Text
## Screenshots
<!-- Please include a screenshot of your pattern design. -->

## Image Credits
<!-- All images used in your patterns must be CC0 or Public Domain. -->
- https://www.rawpixel.com/image/3285639/free-photo-image-architecture-cityscape-people-and-landscape
- https://www.rawpixel.com/image/2545031/free-illustration-image-statue-man-face
## Block Markup
<!--
Optional. If you have already created this pattern using blocks, paste the block markup here. You can also paste it into a new GitHub Gist, and include just the link here: https://gist.github.com
-->
https://gist.github.com/webmandesign/e1faca072d6e1fb123828b938385f55e | 1.0 | Pattern Submission: Major quote with parallax background - <!-- Use this area to share an overview of this pattern and why you feel it should be included. -->
Title says it all.
---
## Pattern Title
<!-- Choose a simple, descriptive title for your pattern. -->
Major quote with parallax background
## Pattern Categories
<!-- e.g. Buttons, Columns, Gallery, Header, Text. -->
Text
## Screenshots
<!-- Please include a screenshot of your pattern design. -->

## Image Credits
<!-- All images used in your patterns must be CC0 or Public Domain. -->
- https://www.rawpixel.com/image/3285639/free-photo-image-architecture-cityscape-people-and-landscape
- https://www.rawpixel.com/image/2545031/free-illustration-image-statue-man-face
## Block Markup
<!--
Optional. If you have already created this pattern using blocks, paste the block markup here. You can also paste it into a new GitHub Gist, and include just the link here: https://gist.github.com
-->
https://gist.github.com/webmandesign/e1faca072d6e1fb123828b938385f55e | non_main | pattern submission major quote with parallax background title says it all pattern title major quote with parallax background pattern categories text screenshots image credits block markup optional if you have already created this pattern using blocks paste the block markup here you can also paste it into a new github gist and include just the link here | 0 |
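For reference, the "parallax background" part of such a pattern is expressed in block markup through the core Cover block's `hasParallax` attribute; a simplified fragment (illustrative only — not the exact serialized markup from the linked Gist, and the image URL is a placeholder) looks like:

```
<!-- wp:cover {"url":"https://example.com/statue.jpg","hasParallax":true,"dimRatio":50} -->
<div class="wp-block-cover has-parallax">
	<!-- wp:quote -->
	<blockquote class="wp-block-quote"><p>Major quote text goes here.</p></blockquote>
	<!-- /wp:quote -->
</div>
<!-- /wp:cover -->
```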
22,917 | 11,776,942,237 | IssuesEvent | 2020-03-16 14:04:44 | IBM/FHIR | https://api.github.com/repos/IBM/FHIR | closed | Update Observation is slow (under certain circumstances) | bug performance schema-change | Create performance looks pretty good to me, but the update performance is looking ugly in our test env (millions of observations).
Update is going to take longer because, with the current design, we do lots of extra work:
1. read before update to distinguish creates from updates and check versions
2. delete all existing search parameters for this logical resource before inserting the new ones
However, my initial analysis points to a possible degradation due to the support I added for Composite search parameter types (which cross multiple tables). Needs further analysis. | True | Update Observation is slow (under certain circumstances) - Create performance looks pretty good to me, but the update performance is looking ugly in our test env (millions of observations).
Update is going to take longer because, with the current design, we do lots of extra work:
1. read before update to distinguish creates from updates and check versions
2. delete all existing search parameters for this logical resource before inserting the new ones
However, my initial analysis points to a possible degradation due to the support I added for Composite search parameter types (which cross multiple tables). Needs further analysis. | non_main | update observation is slow under certain circumstances create performance looks pretty good to me but the update performance is looking ugly in our test env millions of observations update is going to take longer because with the current design we do lots of extra work read before update to distinguish creates from updates and check versions delete all existing search parameters for this logical resource before inserting the new ones however my initial analysis points to a possible degradation due to the support i added for composite search parameter types which cross multiple tables needs further analysis | 0 |
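The read-before-update step described in point 1 can be sketched as a tiny version check (a hypothetical helper written in Go for illustration — the IBM FHIR server itself is Java): the stored version is read first so that creates can be distinguished from updates and stale versions rejected.

```go
package main

import (
	"errors"
	"fmt"
)

// checkUpdate is a hypothetical illustration of the version check an
// update path performs after the read-before-update: incoming version 1
// against an empty store is a create, current+1 is a valid update, and
// anything else is a version conflict.
func checkUpdate(currentVersion, incomingVersion int) (created bool, err error) {
	switch {
	case currentVersion == 0 && incomingVersion == 1:
		return true, nil // no stored resource yet: treat as create
	case incomingVersion == currentVersion+1:
		return false, nil // normal update
	default:
		return false, errors.New("version conflict")
	}
}

func main() {
	created, _ := checkUpdate(0, 1)
	fmt.Println(created) // create path
	_, err := checkUpdate(3, 3)
	fmt.Println(err != nil) // stale version rejected
}
```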
3,341 | 12,957,632,786 | IssuesEvent | 2020-07-20 10:01:33 | precice/precice | https://api.github.com/repos/precice/precice | opened | Test Action Timings | maintainability | **Please describe the problem you are trying to solve.**
We cannot test if, when and how often actions have been triggered by preCICE.
**Describe the solution you propose.**
Add a `CountingAction`, which contains a `static std::map<Timing, int>`.
Every time `performAction()` is called, it increments the counter of its timing.
This map can be accessed by `CountingAction::count(Timing::XXX)` and `CountingAction::resetCounter()`.
We can configure this action in integration tests and check the counters after each invocation of interface functions.
Related to #823 and #711 | True | Test Action Timings - **Please describe the problem you are trying to solve.**
We cannot test if, when and how often actions have been triggered by preCICE.
**Describe the solution you propose.**
Add a `CountingAction`, which contains a `static std::map<Timing, int>`.
Every time `performAction()` is called, it increments the counter of its timing.
This map can be accessed by `CountingAction::count(Timing::XXX)` and `CountingAction::resetCounter()`.
We can configure this action in integration tests and check the counters after each invocation of interface functions.
Related to #823 and #711 | main | test action timings please describe the problem you are trying to solve we cannot test if when and how often actions have been triggered by precice describe the solution you propose add a countingaction which contains a static std map everytime performaction is called it increments the counter of its timing this map can be accessed by countingaction count timing xxx and countingaction resetcounter we can configure this action in integration tests and check the counters after each invocation of interface functions related to and | 1 |
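preCICE itself is C++, but the `CountingAction` bookkeeping proposed above is easy to sketch in Go (the only language used elsewhere in this file); the `Timing` constants, counter map, and accessors below are hypothetical stand-ins for the C++ `static std::map<Timing, int>` design, not actual preCICE code.

```go
package main

import "fmt"

// Timing mirrors the action timings an adapter can configure.
type Timing int

const (
	WriteMappingPost Timing = iota
	ReadMappingPost
)

// counters plays the role of the proposed static std::map<Timing, int>.
var counters = map[Timing]int{}

// performAction increments the counter for its timing each time it fires.
func performAction(t Timing) { counters[t]++ }

// count and resetCounters correspond to CountingAction::count(Timing::XXX)
// and CountingAction::resetCounter() in the proposal.
func count(t Timing) int { return counters[t] }
func resetCounters()     { counters = map[Timing]int{} }

func main() {
	performAction(WriteMappingPost)
	performAction(WriteMappingPost)
	performAction(ReadMappingPost)
	fmt.Println(count(WriteMappingPost), count(ReadMappingPost))
	resetCounters()
	fmt.Println(count(WriteMappingPost))
}
```

An integration test would call the interface functions and then assert on the counters, resetting between invocations.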
504,483 | 14,619,662,369 | IssuesEvent | 2020-12-22 18:15:44 | kubernetes/minikube | https://api.github.com/repos/kubernetes/minikube | closed | ISO: Enable CONFIG_XFS_QUOTA | area/guest-vm kind/feature priority/backlog | Please enable CONFIG_XFS_QUOTA in linux kernel options for the VirtualBox ISO image. This leads to problems with the kadalu storage provider. See: https://github.com/kadalu/kadalu/issues/351
<!--- Please include the "minikube start" command you used in your reproduction steps --->
**Steps to reproduce the issue:**
1. Boot minikube
2. try to mount an xfs filesystem with quota support.
<!--- TIP: Add the "--alsologtostderr" flag to the command-line for more logs --->
**Full output of failed command:**
mount: /bricks/storage-pool-1/data: wrong fs type, bad option, bad superblock on /dev/loop0, missing codepage or helper program, or other error.
**Full output of `minikube start` command used, if not already included:**
😄 minikube v1.15.1 on Darwin 11.0.1
✨ Using the virtualbox driver based on existing profile
👍 Starting control plane node minikube in cluster minikube
🔄 Restarting existing virtualbox VM for "minikube" ...
🐳 Preparing Kubernetes v1.19.4 on Docker 19.03.13...
🔎 Verifying Kubernetes components...
🌟 Enabled addons: storage-provisioner, default-storageclass
🏄 Done! kubectl is now configured to use "minikube" cluster and "default" namespace by default
 | 1.0 | ISO: Enable CONFIG_XFS_QUOTA - Please enable CONFIG_XFS_QUOTA in linux kernel options for the VirtualBox ISO image. This leads to problems with the kadalu storage provider. See: https://github.com/kadalu/kadalu/issues/351
<!--- Please include the "minikube start" command you used in your reproduction steps --->
**Steps to reproduce the issue:**
1. Boot minikube
2. try to mount an xfs filesystem with quota support.
<!--- TIP: Add the "--alsologtostderr" flag to the command-line for more logs --->
**Full output of failed command:**
mount: /bricks/storage-pool-1/data: wrong fs type, bad option, bad superblock on /dev/loop0, missing codepage or helper program, or other error.
**Full output of `minikube start` command used, if not already included:**
😄 minikube v1.15.1 on Darwin 11.0.1
✨ Using the virtualbox driver based on existing profile
👍 Starting control plane node minikube in cluster minikube
🔄 Restarting existing virtualbox VM for "minikube" ...
🐳 Preparing Kubernetes v1.19.4 on Docker 19.03.13...
🔎 Verifying Kubernetes components...
🌟 Enabled addons: storage-provisioner, default-storageclass
🏄 Done! kubectl is now configured to use "minikube" cluster and "default" namespace by default
| non_main | iso enable config xfs quota please enable config xfs quota in linux kernel options for the virtualbox iso image this leeds to problems with the kadalu storage provider see steps to reproduce the issue boot minikube try to mount a xfs filesystem with quota support full output of failed command mount bricks storage pool data wrong fs type bad option bad superblock on dev missing codepage or helper program or other error full output of minikube start command used if not already included 😄 minikube auf darwin ✨ using the virtualbox driver based on existing profile 👍 starting control plane node minikube in cluster minikube 🔄 restarting existing virtualbox vm for minikube 🐳 vorbereiten von kubernetes auf docker 🔎 verifying kubernetes components 🌟 enabled addons storage provisioner default storageclass 🏄 done kubectl is now configured to use minikube cluster and default namespace by default | 0 |
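The mount failure quoted above is the expected symptom when the guest kernel lacks XFS quota support; the request boils down to adding something like the following fragment to the ISO's kernel configuration (illustrative — the exact defconfig file lives in the minikube ISO build tree):

```
CONFIG_XFS_FS=y
CONFIG_XFS_QUOTA=y
```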
2,392 | 8,499,472,205 | IssuesEvent | 2018-10-29 17:14:54 | RalfKoban/MiKo-Analyzers | https://api.github.com/repos/RalfKoban/MiKo-Analyzers | closed | EventArgs should not use delegates | Area: analyzer Area: maintainability feature next | The `EventArgs` should not use delegates such as `Action` or `Func`.
The reason is that the callee has to know how the action behaves. If there is a failure inside the action, then it's really hard to track down the problem if the exception is thrown in a completely unrelated area. | True | EventArgs should not use delegates - The `EventArgs` should not use delegates such as `Action` or `Func`.
The reason is that the callee has to know how the action behaves. If there is a failure inside the action, then it's really hard to track down the problem if the exception is thrown in a completely unrelated area. | main | eventargs should not use delegates the eventargs should not use delegates such as action or func the reason is that the callee has to know how the action behaves if there is a failure inside the action then it s really hard to track down the problem if the exception is thrown in a completely unrelated area | 1 |
369,528 | 25,855,309,296 | IssuesEvent | 2022-12-13 13:20:48 | BiBiServ/bibigrid | https://api.github.com/repos/BiBiServ/bibigrid | closed | Documentation of light rest 4j API Access | documentation | Creating a simple guide, "How to use and start the API Server ..."
Postman examples should be updated
| 1.0 | Documentation of light rest 4j API Access - Creating a simple guide, "How to use and start the API Server ..."
Postman examples should be updated
| non_main | documentation of light rest api access creating a simple guide how to use and start the api server postman examples should be updated | 0 |
198,178 | 22,617,952,256 | IssuesEvent | 2022-06-30 01:26:23 | KOSASIH/hmkit-Android | https://api.github.com/repos/KOSASIH/hmkit-Android | opened | kotlin-stdlib-1.4.10.jar: 1 vulnerabilities (highest severity is: 5.3) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kotlin-stdlib-1.4.10.jar</b></p></summary>
<p>Kotlin Standard Library for JVM</p>
<p>Path to dependency file: /tmp/ws-scm/hmkit-android/hmkit-android/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar,/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2022-24329](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24329) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | kotlin-stdlib-1.4.10.jar | Direct | 1.6.0-M1 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-24329</summary>
### Vulnerable Library - <b>kotlin-stdlib-1.4.10.jar</b></p>
<p>Kotlin Standard Library for JVM</p>
<p>Path to dependency file: /tmp/ws-scm/hmkit-android/hmkit-android/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar,/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar</p>
<p>
Dependency Hierarchy:
- :x: **kotlin-stdlib-1.4.10.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In JetBrains Kotlin before 1.6.0, it was not possible to lock dependencies for Multiplatform Gradle Projects.
<p>Publish Date: 2022-02-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24329>CVE-2022-24329</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-2qp4-g3q3-f92w">https://github.com/advisories/GHSA-2qp4-g3q3-f92w</a></p>
<p>Release Date: 2022-02-25</p>
<p>Fix Resolution: 1.6.0-M1</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details> | True | kotlin-stdlib-1.4.10.jar: 1 vulnerabilities (highest severity is: 5.3) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kotlin-stdlib-1.4.10.jar</b></p></summary>
<p>Kotlin Standard Library for JVM</p>
<p>Path to dependency file: /tmp/ws-scm/hmkit-android/hmkit-android/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar,/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2022-24329](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24329) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | kotlin-stdlib-1.4.10.jar | Direct | 1.6.0-M1 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-24329</summary>
### Vulnerable Library - <b>kotlin-stdlib-1.4.10.jar</b></p>
<p>Kotlin Standard Library for JVM</p>
<p>Path to dependency file: /tmp/ws-scm/hmkit-android/hmkit-android/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar,/caches/modules-2/files-2.1/org.jetbrains.kotlin/kotlin-stdlib/1.4.10/ea29e063d2bbe695be13e9d044dcfb0c7add398e/kotlin-stdlib-1.4.10.jar</p>
<p>
Dependency Hierarchy:
- :x: **kotlin-stdlib-1.4.10.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In JetBrains Kotlin before 1.6.0, it was not possible to lock dependencies for Multiplatform Gradle Projects.
<p>Publish Date: 2022-02-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24329>CVE-2022-24329</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-2qp4-g3q3-f92w">https://github.com/advisories/GHSA-2qp4-g3q3-f92w</a></p>
<p>Release Date: 2022-02-25</p>
<p>Fix Resolution: 1.6.0-M1</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details> | non_main | kotlin stdlib jar vulnerabilities highest severity is vulnerable library kotlin stdlib jar kotlin standard library for jvm path to dependency file tmp ws scm hmkit android hmkit android build gradle path to vulnerable library home wss scanner gradle caches modules files org jetbrains kotlin kotlin stdlib kotlin stdlib jar caches modules files org jetbrains kotlin kotlin stdlib kotlin stdlib jar vulnerabilities cve severity cvss dependency type fixed in remediation available medium kotlin stdlib jar direct details cve vulnerable library kotlin stdlib jar kotlin standard library for jvm path to dependency file tmp ws scm hmkit android hmkit android build gradle path to vulnerable library home wss scanner gradle caches modules files org jetbrains kotlin kotlin stdlib kotlin stdlib jar caches modules files org jetbrains kotlin kotlin stdlib kotlin stdlib jar dependency hierarchy x kotlin stdlib jar vulnerable library found in base branch master vulnerability details in jetbrains kotlin before it was not possible to lock dependencies for multiplatform gradle projects publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
193,175 | 15,369,617,218 | IssuesEvent | 2021-03-02 07:38:25 | dusk-network/plonk | https://api.github.com/repos/dusk-network/plonk | closed | Finish Zerocaf paper | Epic area:cryptography area:documentation area:research team:R&D | The Zerocaf paper is now centered on Fq arithmetic inside Fq circuits. This is to be finished and put into Review prior to an IACR.
This Epic will have to include iterations of each of the sections and then their individual review. | 1.0 | Finish Zerocaf paper - The Zerocaf paper is now centered on Fq arithmetic inside Fq circuits. This is to be finished and put into Review prior to an IACR.
This Epic will have to include iterations of each of the sections and then their individual review. | non_main | finish zerocaf paper the zerocaf paper is now centric around fq arithmetic inside of fq circuits this is to be finished and put into review prior to an iacr this epic will have to include iterations of each of the sections and then their individual review | 0 |
489 | 3,775,461,663 | IssuesEvent | 2016-03-17 13:40:09 | backdrop-ops/contrib | https://api.github.com/repos/backdrop-ops/contrib | closed | MathJax for BackdropCMS | Maintainer application | I would like to share with the Backdrop community a module named [MathJax](https://github.com/rhoscalas/MathJax-BackdropCMS). Thanks! | True | MathJax for BackdropCMS - I would like to share with the Backdrop community a module named [MathJax](https://github.com/rhoscalas/MathJax-BackdropCMS). Thanks! | main | mathjax for backdropcms i would you like to share with backdrop community a module named thanks | 1
1,242 | 5,301,009,862 | IssuesEvent | 2017-02-10 07:59:07 | Kristinita/KeypressLog | https://api.github.com/repos/Kristinita/KeypressLog | closed | [Question] Force logging | need_maintainer question | ### 1. Problem
If
+ KeypressLog is running,
+ the user removes the `logs` folder.
Logs are no longer recorded.
It would be nice if KeypressLog continued to work after removing the `logs` folder.
### 2. Justification
I think the majority of users don't need old logs. For example, I wrote a `bat` file to move old logs to the Recycle Bin (using the `recycle` command of [**CmdUtils**](http://www.maddogsw.com/cmdutils/)):
```shell
start recycle -f "D:\KeypressLog\logs"
```
I created a task in my Task Scheduler to run this command every day. After the command runs, KeypressLog no longer writes logs.
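A minimal sketch of the requested "force logging" behaviour — recreate the `logs` folder before every write, so logging survives the folder being deleted (Python sketch; the path and file names are illustrative, not KeypressLog's actual code):

```python
import os

# Illustrative sketch (not KeypressLog's actual code): recreate the logs
# folder before every write, so logging survives the folder being deleted.
LOG_DIR = r"D:\KeypressLog\logs"  # path taken from the report above


def append_log(line, log_dir=LOG_DIR):
    """Append one line to the key log, recreating the folder if needed."""
    os.makedirs(log_dir, exist_ok=True)  # no-op if the folder already exists
    with open(os.path.join(log_dir, "keys.log"), "a", encoding="utf-8") as f:
        f.write(line + "\n")
```

With this pattern, a daily cleanup task can keep deleting the folder and logging resumes on the next keypress.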
Thanks. | True | [Question] Force logging - ### 1. Problem
If
+ KeypressLog is running,
+ the user removes the `logs` folder.
Logs are no longer recorded.
It would be nice if KeypressLog continued to work after removing the `logs` folder.
### 2. Justification
I think the majority of users don't need old logs. For example, I wrote a `bat` file to move old logs to the Recycle Bin (using the `recycle` command of [**CmdUtils**](http://www.maddogsw.com/cmdutils/)):
```shell
start recycle -f "D:\KeypressLog\logs"
```
I created a task in my Task Scheduler to run this command every day. After the command runs, KeypressLog no longer writes logs.
Thanks. | main | force logging problem if keypresslog run user remove logs folder logs is no longer recorded it would be nice if keypresslog continued to work after removing logs folder justification i think the majority of users don t need old logs for example i write bat file for move old logs to recycle bin use recycle command of shell start recycle f d keypresslog logs i create task in my task scheduler for run this command every day after command run keypresslog don t write logs thanks | 1 |
119,024 | 10,023,095,240 | IssuesEvent | 2019-07-16 18:19:53 | grpc/grpc | https://api.github.com/repos/grpc/grpc | closed | use of undeclared identifier 'TCP_USER_TIMEOUT' | area/test kind/bug lang/c++ priority/P1 |
### What version of gRPC and what language are you using?
latest
### What operating system (Linux, Windows,...) and version?
mac
### What runtime / compiler are you using (e.g. python version or version of gcc)
Apple LLVM version 9.0.0 (clang-900.0.39.2)
### What did you do?
tools/run_tests/run_tests.py -l c++ -x report.xml -c gcov
### What did you expect to see?
tests passed
### What did you see instead?
```
[CXX] Compiling test/cpp/interop/grpclb_fallback_test.cc
mkdir -p `dirname /Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.o`
g++ -Ithird_party/googletest/googletest/include -Ithird_party/googletest/googlemock/include -Ithird_party/boringssl/include -Ithird_party/cares -Ithird_party/cares/cares -Ithird_party/address_sorting/include -g -Wall -Wextra -Werror -Wno-long-long -Wno-unused-parameter -DOSATOMIC_USE_INLINED=1 -Wno-deprecated-declarations -Ithird_party/nanopb -DPB_FIELD_32BIT -O0 -fprofile-arcs -ftest-coverage -Wno-return-type -fPIC -I. -Iinclude -I/Users/w00482011/grpc-fork2/grpc/gens -I/usr/local/include -D_DEBUG -DDEBUG -DGPR_GCOV -DINSTALL_PREFIX=\"/usr/local\" -DGRPC_TEST_SLOWDOWN_MACHINE_FACTOR=1.000000 -Ithird_party/zlib -Qunused-arguments -pthread -I/usr/local/include -std=c++11 -stdlib=libc++ -Wnon-virtual-dtor -MMD -MF /Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.dep -c -o /Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.o test/cpp/interop/grpclb_fallback_test.cc
test/cpp/interop/grpclb_fallback_test.cc:123:40: error: use of undeclared identifier 'TCP_USER_TIMEOUT'
if (0 != setsockopt(fd, IPPROTO_TCP, TCP_USER_TIMEOUT, &timeout,
^
test/cpp/interop/grpclb_fallback_test.cc:130:40: error: use of undeclared identifier 'TCP_USER_TIMEOUT'
if (0 != getsockopt(fd, IPPROTO_TCP, TCP_USER_TIMEOUT, &newval, &len) ||
^
2 errors generated.
make: *** [/Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.o] Error 1
make: *** Waiting for unfinished jobs....
```
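`TCP_USER_TIMEOUT` (RFC 5482) is a Linux-specific TCP socket option; macOS system headers do not define it, which matches the errors above. The portable pattern is to guard the constant — sketched here in Python rather than as the actual gRPC C++ fix (names are illustrative):

```python
import socket

# TCP_USER_TIMEOUT (RFC 5482) is exposed by Linux; macOS headers lack it,
# hence the "use of undeclared identifier" errors in the log above.
TCP_USER_TIMEOUT = getattr(socket, "TCP_USER_TIMEOUT", None)


def set_user_timeout(sock, millis):
    """Apply TCP_USER_TIMEOUT where the platform has it; no-op elsewhere."""
    if TCP_USER_TIMEOUT is None:
        return False  # platform (e.g. macOS) does not support the option
    sock.setsockopt(socket.IPPROTO_TCP, TCP_USER_TIMEOUT, millis)
    return True
```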
### Anything else we should know about your project / environment?
tools/run_tests/run_tests.py -l c -x report.xml -c gcov
is ok | 1.0 | use of undeclared identifier 'TCP_USER_TIMEOUT' -
### What version of gRPC and what language are you using?
latest
### What operating system (Linux, Windows,...) and version?
mac
### What runtime / compiler are you using (e.g. python version or version of gcc)
Apple LLVM version 9.0.0 (clang-900.0.39.2)
### What did you do?
tools/run_tests/run_tests.py -l c++ -x report.xml -c gcov
### What did you expect to see?
tests passed
### What did you see instead?
```
[CXX] Compiling test/cpp/interop/grpclb_fallback_test.cc
mkdir -p `dirname /Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.o`
g++ -Ithird_party/googletest/googletest/include -Ithird_party/googletest/googlemock/include -Ithird_party/boringssl/include -Ithird_party/cares -Ithird_party/cares/cares -Ithird_party/address_sorting/include -g -Wall -Wextra -Werror -Wno-long-long -Wno-unused-parameter -DOSATOMIC_USE_INLINED=1 -Wno-deprecated-declarations -Ithird_party/nanopb -DPB_FIELD_32BIT -O0 -fprofile-arcs -ftest-coverage -Wno-return-type -fPIC -I. -Iinclude -I/Users/w00482011/grpc-fork2/grpc/gens -I/usr/local/include -D_DEBUG -DDEBUG -DGPR_GCOV -DINSTALL_PREFIX=\"/usr/local\" -DGRPC_TEST_SLOWDOWN_MACHINE_FACTOR=1.000000 -Ithird_party/zlib -Qunused-arguments -pthread -I/usr/local/include -std=c++11 -stdlib=libc++ -Wnon-virtual-dtor -MMD -MF /Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.dep -c -o /Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.o test/cpp/interop/grpclb_fallback_test.cc
test/cpp/interop/grpclb_fallback_test.cc:123:40: error: use of undeclared identifier 'TCP_USER_TIMEOUT'
if (0 != setsockopt(fd, IPPROTO_TCP, TCP_USER_TIMEOUT, &timeout,
^
test/cpp/interop/grpclb_fallback_test.cc:130:40: error: use of undeclared identifier 'TCP_USER_TIMEOUT'
if (0 != getsockopt(fd, IPPROTO_TCP, TCP_USER_TIMEOUT, &newval, &len) ||
^
2 errors generated.
make: *** [/Users/w00482011/grpc-fork2/grpc/objs/gcov/test/cpp/interop/grpclb_fallback_test.o] Error 1
make: *** Waiting for unfinished jobs....
```
### Anything else we should know about your project / environment?
tools/run_tests/run_tests.py -l c -x report.xml -c gcov
is ok | non_main | use of undeclared identifier tcp user timeout what version of grpc and what language are you using latest what operating system linux windows and version mac what runtime compiler are you using e g python version or version of gcc apple llvm version clang what did you do tools run tests run tests py l c x report xml c gcov what did you expect to see tests passed what did you see instead compiling test cpp interop grpclb fallback test cc mkdir p dirname users grpc grpc objs gcov test cpp interop grpclb fallback test o g ithird party googletest googletest include ithird party googletest googlemock include ithird party boringssl include ithird party cares ithird party cares cares ithird party address sorting include g wall wextra werror wno long long wno unused parameter dosatomic use inlined wno deprecated declarations ithird party nanopb dpb field fprofile arcs ftest coverage wno return type fpic i iinclude i users grpc grpc gens i usr local include d debug ddebug dgpr gcov dinstall prefix usr local dgrpc test slowdown machine factor ithird party zlib qunused arguments pthread i usr local include std c stdlib libc wnon virtual dtor mmd mf users grpc grpc objs gcov test cpp interop grpclb fallback test dep c o users grpc grpc objs gcov test cpp interop grpclb fallback test o test cpp interop grpclb fallback test cc test cpp interop grpclb fallback test cc error use of undeclared identifier tcp user timeout if setsockopt fd ipproto tcp tcp user timeout timeout test cpp interop grpclb fallback test cc error use of undeclared identifier tcp user timeout if getsockopt fd ipproto tcp tcp user timeout newval len errors generated make error make waiting for unfinished jobs anything else we should know about your project environment tools run tests run tests py l c x report xml c gcov is ok | 0 |
130,861 | 5,134,550,892 | IssuesEvent | 2017-01-11 09:24:59 | zetkin/organize.zetk.in | https://api.github.com/repos/zetkin/organize.zetk.in | closed | Mini-calendar cropped | bug priority | The new filter drawer feature which expands/contracts the header of root panes to hide or show the filters adds a fixed height to pane headers. This is incompatible with having the `ActionMiniCalendar` in the pane headers of `AllActionsPane`, `ActionDistributionPane` and `CampaignPlaybackPane`. | 1.0 | Mini-calendar cropped - The new filter drawer feature which expands/contracts the header of root panes to hide or show the filters adds a fixed height to pane headers. This is incompatible with having the `ActionMiniCalendar` in the pane headers of `AllActionsPane`, `ActionDistributionPane` and `CampaignPlaybackPane`. | non_main | mini calendar cropped the new filter drawer feature which expands contracts the header of root panes to hide or show the filters adds a fixed height to pane headers this is incompatible with having the actionminicalendar in the pane headers of allactionspane actiondistributionpane and campaignplaybackpane | 0 |
5,584 | 27,985,289,168 | IssuesEvent | 2023-03-26 16:20:28 | rollerderby/scoreboard | https://api.github.com/repos/rollerderby/scoreboard | closed | Allow user to set order of jams in score view | Feature Request maintainer needed | In the current dev version, the operator screen shows the jams with the most recent at the top of the list. There is certainly a logic for this order, but I suspect there will be strong demand for the option to set it in the opposite order. I would suggest adding this as a configurable option if it isn't already and I just haven't found it. | True | Allow user to set order of jams in score view - In the current dev version, the operator screen shows the jams with the most recent at the top of the list. There is certainly a logic for this order, but I suspect there will be strong demand for the option to set it in the opposite order. I would suggest adding this as a configurable option if it isn't already and I just haven't found it. | main | allow user to set order of jams in score view in the current dev version the operator screen shows the jams with the most recent at the top of the list there is certainly a logic for this order but i suspect there will be strong demand for the option to set it in the opposite order i would suggest adding this as a configurable option if it isn t already and i just haven t found it | 1 |
3,635 | 14,701,887,912 | IssuesEvent | 2021-01-04 12:41:12 | DLR-RM/stable-baselines3 | https://api.github.com/repos/DLR-RM/stable-baselines3 | closed | [Question] How to implement RNN? | Maintainers on vacation question | Hi, I want to implement a policy network for RNN, but I only found the following message:
```
# TODO (GH/1): add support for RNN policies
# if state is None:
# state = self.initial_state
# if mask is None:
# mask = [False for _ in range(self.n_envs)]
```
I checked the state, but cannot find how to control the state value.
I think it would be better and easier to implement if the policy network itself maintained all hidden states, with a reset() function added to the policy network. The RL algorithm would then only need to call reset() at the beginning of each episode.
Or can you tell me how it is currently implemented to support RNN learning?
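A minimal sketch of that idea — the policy object owns its recurrent hidden state and exposes `reset()`; this is illustrative pseudo-policy code, not the stable-baselines3 API:

```python
# Illustrative sketch of the suggestion above (not the stable-baselines3 API):
# the policy object owns its recurrent hidden state, and the RL algorithm
# only has to call reset() at the start of each episode.
class RecurrentPolicy:
    def __init__(self, hidden_size=4):
        self.hidden_size = hidden_size
        self.reset()

    def reset(self):
        # Called by the algorithm at the beginning of every episode.
        self.hidden = [0.0] * self.hidden_size

    def act(self, obs):
        # Stand-in for one RNN step: fold the observation into the state.
        self.hidden = [h + obs for h in self.hidden]
        return sum(self.hidden)  # stand-in for an action
```

The training loop then calls `policy.reset()` once per episode and `policy.act(obs)` per step, with no hidden-state plumbing inside the algorithm itself.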
Best | True | [Question] How to implement RNN? - Hi, I want to implement a policy network for RNN, but I only found the following message:
```
# TODO (GH/1): add support for RNN policies
# if state is None:
# state = self.initial_state
# if mask is None:
# mask = [False for _ in range(self.n_envs)]
```
I checked the state, but cannot find how to control the state value.
I think it would be better and easier to implement if the policy network itself maintained all hidden states, with a reset() function added to the policy network. The RL algorithm would then only need to call reset() at the beginning of each episode.
Or can you tell me how it is currently implemented to support RNN learning?
Best | main | how to implement rnn hi i want to implement a policy network for rnn but i only found the following message todo gh add support for rnn policies if state is none state self initial state if mask is none mask i check the state but cannot find how to control the state value i think it will be better and easy to implement that make the policy network ifself to maintian all hidden states and add a function of reset to policy network the rl algorithom only need to call the reset at the begining of each episode or can you tell me that how it implement currently to support rnn learning best | 1 |
143,277 | 19,177,908,119 | IssuesEvent | 2021-12-04 00:04:04 | samq-ghdemo/js-monorepo | https://api.github.com/repos/samq-ghdemo/js-monorepo | opened | CVE-2020-8158 (High) detected in typeorm-0.2.24.tgz | security vulnerability | ## CVE-2020-8158 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>typeorm-0.2.24.tgz</b></p></summary>
<p>Data-Mapper ORM for TypeScript, ES7, ES6, ES5. Supports MySQL, PostgreSQL, MariaDB, SQLite, MS SQL Server, Oracle, MongoDB databases.</p>
<p>Library home page: <a href="https://registry.npmjs.org/typeorm/-/typeorm-0.2.24.tgz">https://registry.npmjs.org/typeorm/-/typeorm-0.2.24.tgz</a></p>
<p>Path to dependency file: js-monorepo/nodejs-goof/package.json</p>
<p>Path to vulnerable library: /nodejs-goof/node_modules/typeorm/package.json</p>
<p>
Dependency Hierarchy:
- :x: **typeorm-0.2.24.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/js-monorepo/commit/f3701923c18333c1e4e49bf595dd36b3f186812f">f3701923c18333c1e4e49bf595dd36b3f186812f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in the TypeORM package < 0.2.25 may allow attackers to add or modify Object properties leading to further denial of service or SQL injection attacks.
<p>Publish Date: 2020-09-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8158>CVE-2020-8158</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8158">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8158</a></p>
<p>Release Date: 2020-09-29</p>
<p>Fix Resolution: 0.2.25</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"typeorm","packageVersion":"0.2.24","packageFilePaths":["/nodejs-goof/package.json"],"isTransitiveDependency":false,"dependencyTree":"typeorm:0.2.24","isMinimumFixVersionAvailable":true,"minimumFixVersion":"0.2.25","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-8158","vulnerabilityDetails":"Prototype pollution vulnerability in the TypeORM package \u003c 0.2.25 may allow attackers to add or modify Object properties leading to further denial of service or SQL injection attacks.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8158","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-8158 (High) detected in typeorm-0.2.24.tgz - ## CVE-2020-8158 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>typeorm-0.2.24.tgz</b></p></summary>
<p>Data-Mapper ORM for TypeScript, ES7, ES6, ES5. Supports MySQL, PostgreSQL, MariaDB, SQLite, MS SQL Server, Oracle, MongoDB databases.</p>
<p>Library home page: <a href="https://registry.npmjs.org/typeorm/-/typeorm-0.2.24.tgz">https://registry.npmjs.org/typeorm/-/typeorm-0.2.24.tgz</a></p>
<p>Path to dependency file: js-monorepo/nodejs-goof/package.json</p>
<p>Path to vulnerable library: /nodejs-goof/node_modules/typeorm/package.json</p>
<p>
Dependency Hierarchy:
- :x: **typeorm-0.2.24.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/js-monorepo/commit/f3701923c18333c1e4e49bf595dd36b3f186812f">f3701923c18333c1e4e49bf595dd36b3f186812f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in the TypeORM package < 0.2.25 may allow attackers to add or modify Object properties leading to further denial of service or SQL injection attacks.
<p>Publish Date: 2020-09-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8158>CVE-2020-8158</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8158">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8158</a></p>
<p>Release Date: 2020-09-29</p>
<p>Fix Resolution: 0.2.25</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"typeorm","packageVersion":"0.2.24","packageFilePaths":["/nodejs-goof/package.json"],"isTransitiveDependency":false,"dependencyTree":"typeorm:0.2.24","isMinimumFixVersionAvailable":true,"minimumFixVersion":"0.2.25","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-8158","vulnerabilityDetails":"Prototype pollution vulnerability in the TypeORM package \u003c 0.2.25 may allow attackers to add or modify Object properties leading to further denial of service or SQL injection attacks.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8158","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | non_main | cve high detected in typeorm tgz cve high severity vulnerability vulnerable library typeorm tgz data mapper orm for typescript supports mysql postgresql mariadb sqlite ms sql server oracle mongodb databases library home page a href path to dependency file js monorepo nodejs goof package json path to vulnerable library nodejs goof node modules typeorm package json dependency hierarchy x typeorm tgz vulnerable library found in head commit a href found in base branch main vulnerability details prototype pollution vulnerability in the typeorm package may allow attackers to add or modify object properties leading to further denial of service or sql injection attacks publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree typeorm isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails prototype pollution vulnerability in the typeorm package may allow attackers to add or modify object properties leading to further denial of service or sql injection attacks vulnerabilityurl | 0
3,018 | 3,289,033,780 | IssuesEvent | 2015-10-29 17:19:34 | zaproxy/zaproxy | https://api.github.com/repos/zaproxy/zaproxy | closed | Scripts - Save Button Enablement Issue | bug Component-Scripts Component-UI Usability | Noticed this morning that when I changed a script, and tried to exit ZAP. ZAP complained that the script wasn't saved (cool) but that the save button for the script was disabled.
Screenshot attached (see red boxes)

.
Selecting another item in the script tree and then going back to the original script caused the Save button to become enabled and allowed me to save. | True | Scripts - Save Button Enablement Issue - Noticed this morning that when I changed a script, and tried to exit ZAP. ZAP complained that the script wasn't saved (cool) but that the save button for the script was disabled.
Screenshot attached (see red boxes)

.
Selecting another item in the script tree and then going back to the original script caused the Save button to become enabled and allowed me to save. | non_main | scripts save button enablement issue noticed this morning that when i changed a script and tried to exit zap zap complained that the script wasn t saved cool but that the save button for the script was disabled screenshot attached see red boxes selecting another item in the script tree and then going back to the original script caused the save button to become enabled and allowed me to save | 0 |
3,525 | 13,865,247,577 | IssuesEvent | 2020-10-16 03:46:32 | TabbycatDebate/tabbycat | https://api.github.com/repos/TabbycatDebate/tabbycat | closed | Write framework for managing different result types | awaiting maintainer refactoring | Currently the GUI provides no way to nominate a single team winning, as they do in final elimination rounds. | True | Write framework for managing different result types - Currently the GUI provides no way to nominate a single team winning, as they do in final elimination rounds. | main | write framework for managing different result types currently the gui provides no way to nominate a single team winning as they do in final elimination rounds | 1 |
274,978 | 23,885,572,887 | IssuesEvent | 2022-09-08 07:22:58 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | roachtest: transfer-leases/drain failed | C-test-failure O-robot O-roachtest release-blocker branch-release-22.2 | roachtest.transfer-leases/drain [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6383281?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6383281?buildTab=artifacts#/transfer-leases/drain) on release-22.2 @ [f8e04f32b84a9727c9e813d0096bde2de64d5675](https://github.com/cockroachdb/cockroach/commits/f8e04f32b84a9727c9e813d0096bde2de64d5675):
```
test artifacts and logs in: /artifacts/transfer-leases/drain/run_1
quit.go:72,quit.go:323,soon.go:69,retry.go:207,soon.go:75,soon.go:48,quit.go:228,quit.go:95,quit.go:154,context.go:91,quit.go:153,quit.go:95,quit.go:54,quit.go:359,test_runner.go:908: (1) ranges with no lease outside of node 1: []string{"22", "19"}
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #83261 roachtest: transfer-leases/drain-other-node failed [C-test-failure GA-blocker O-roachtest O-robot T-kv branch-master]
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*transfer-leases/drain.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 2.0 | roachtest: transfer-leases/drain failed - roachtest.transfer-leases/drain [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6383281?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6383281?buildTab=artifacts#/transfer-leases/drain) on release-22.2 @ [f8e04f32b84a9727c9e813d0096bde2de64d5675](https://github.com/cockroachdb/cockroach/commits/f8e04f32b84a9727c9e813d0096bde2de64d5675):
```
test artifacts and logs in: /artifacts/transfer-leases/drain/run_1
quit.go:72,quit.go:323,soon.go:69,retry.go:207,soon.go:75,soon.go:48,quit.go:228,quit.go:95,quit.go:154,context.go:91,quit.go:153,quit.go:95,quit.go:54,quit.go:359,test_runner.go:908: (1) ranges with no lease outside of node 1: []string{"22", "19"}
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #83261 roachtest: transfer-leases/drain-other-node failed [C-test-failure GA-blocker O-roachtest O-robot T-kv branch-master]
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*transfer-leases/drain.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| non_main | roachtest transfer leases drain failed roachtest transfer leases drain with on release test artifacts and logs in artifacts transfer leases drain run quit go quit go soon go retry go soon go soon go quit go quit go quit go context go quit go quit go quit go quit go test runner go ranges with no lease outside of node string parameters roachtest cloud gce roachtest cpu roachtest ssd help see see same failure on other branches roachtest transfer leases drain other node failed cc cockroachdb kv triage | 0 |
562 | 4,016,096,163 | IssuesEvent | 2016-05-15 11:17:34 | duckduckgo/zeroclickinfo-goodies | https://api.github.com/repos/duckduckgo/zeroclickinfo-goodies | closed | CSS Language Cheat Sheet: Fix typo on description | Maintainer Input Requested | This description: "Specifies the stack order of an elemen"
Should be: "Specifies the stack order of an element"
:ok_hand:
------
IA Page: http://duck.co/ia/view/css_cheat_sheet
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @ethanchewy | True | CSS Language Cheat Sheet: Fix typo on description - This description: "Specifies the stack order of an elemen"
Should be: "Specifies the stack order of an element"
:ok_hand:
------
IA Page: http://duck.co/ia/view/css_cheat_sheet
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @ethanchewy | main | css language cheat sheet fix typo on description this description specifies the stack order of an elemen should be specifies the stack order of an element ok hand ia page ethanchewy | 1 |
5,376 | 27,018,196,870 | IssuesEvent | 2023-02-10 21:42:08 | pulp/pulp-oci-images | https://api.github.com/repos/pulp/pulp-oci-images | closed | Deduplicate the CI jobs for single-process pulp vs single-process galaxy images | Triage-Needed Maintainability | These 2 sections are so similar, that they should be differentiated via variables.
Also, we can hopefully deduplicate the steps to tag and push the images. | True | Deduplicate the CI jobs for single-process pulp vs single-process galaxy images - These 2 sections are so similar, that they should be differentiated via variables.
Also, we can hopefully deduplicate the steps to tag and push the images. | main | deduplicate the ci jobs for single process pulp vs single process galaxy images these sections are so similar that they should be differentiated via variables also we can hopefully deduplicate the steps to tag and push the images | 1 |
194,649 | 14,684,624,690 | IssuesEvent | 2021-01-01 04:04:04 | github-vet/rangeloop-pointer-findings | https://api.github.com/repos/github-vet/rangeloop-pointer-findings | closed | itsivareddy/terrafrom-Oci: oci/core_volume_test.go; 16 LoC | fresh small test |
Found a possible issue in [itsivareddy/terrafrom-Oci](https://www.github.com/itsivareddy/terrafrom-Oci) at [oci/core_volume_test.go](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/core_volume_test.go#L652-L667)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message.
> reference to volumeId is reassigned at line 656
[Click here to see the code in its original context.](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/core_volume_test.go#L652-L667)
<details>
<summary>Click here to show the 16 line(s) of Go which triggered the analyzer.</summary>
```go
for _, volumeId := range volumeIds {
	if ok := SweeperDefaultResourceId[volumeId]; !ok {
		deleteVolumeRequest := oci_core.DeleteVolumeRequest{}
		deleteVolumeRequest.VolumeId = &volumeId
		deleteVolumeRequest.RequestMetadata.RetryPolicy = getRetryPolicy(true, "core")
		_, error := blockstorageClient.DeleteVolume(context.Background(), deleteVolumeRequest)
		if error != nil {
			fmt.Printf("Error deleting Volume %s %s, It is possible that the resource is already deleted. Please verify manually \n", volumeId, error)
			continue
		}
		waitTillCondition(testAccProvider, &volumeId, volumeSweepWaitCondition, time.Duration(3*time.Minute),
			volumeSweepResponseFetchOperation, "core", true)
	}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 075608a9e201ee0e32484da68d5ba5370dfde1be
| 1.0 | itsivareddy/terrafrom-Oci: oci/core_volume_test.go; 16 LoC -
Found a possible issue in [itsivareddy/terrafrom-Oci](https://www.github.com/itsivareddy/terrafrom-Oci) at [oci/core_volume_test.go](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/core_volume_test.go#L652-L667)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message.
> reference to volumeId is reassigned at line 656
[Click here to see the code in its original context.](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/core_volume_test.go#L652-L667)
<details>
<summary>Click here to show the 16 line(s) of Go which triggered the analyzer.</summary>
```go
for _, volumeId := range volumeIds {
	if ok := SweeperDefaultResourceId[volumeId]; !ok {
		deleteVolumeRequest := oci_core.DeleteVolumeRequest{}
		deleteVolumeRequest.VolumeId = &volumeId
		deleteVolumeRequest.RequestMetadata.RetryPolicy = getRetryPolicy(true, "core")
		_, error := blockstorageClient.DeleteVolume(context.Background(), deleteVolumeRequest)
		if error != nil {
			fmt.Printf("Error deleting Volume %s %s, It is possible that the resource is already deleted. Please verify manually \n", volumeId, error)
			continue
		}
		waitTillCondition(testAccProvider, &volumeId, volumeSweepWaitCondition, time.Duration(3*time.Minute),
			volumeSweepResponseFetchOperation, "core", true)
	}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 075608a9e201ee0e32484da68d5ba5370dfde1be
| non_main | itsivareddy terrafrom oci oci core volume test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message reference to volumeid is reassigned at line click here to show the line s of go which triggered the analyzer go for volumeid range volumeids if ok sweeperdefaultresourceid ok deletevolumerequest oci core deletevolumerequest deletevolumerequest volumeid volumeid deletevolumerequest requestmetadata retrypolicy getretrypolicy true core error blockstorageclient deletevolume context background deletevolumerequest if error nil fmt printf error deleting volume s s it is possible that the resource is already deleted please verify manually n volumeid error continue waittillcondition testaccprovider volumeid volumesweepwaitcondition time duration time minute volumesweepresponsefetchoperation core true leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
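Editor's note on the Go snippet quoted in the github-vet finding above: the code takes the address of the range variable (`&volumeId`). On Go versions before 1.22 the range variable is a single reused variable, so every stored pointer would alias the same memory; the standard mitigation is an explicit per-iteration copy. A minimal, self-contained sketch of that mitigation — the helper name `collectPtrs` and the sample values are illustrative, not code from the scanned repository:

```go
package main

import "fmt"

// collectPtrs returns one pointer per element of ids. The shadowing
// copy inside the loop makes the address-taking safe even on Go
// versions before 1.22, where the range variable is reused and every
// &id would otherwise point at the same memory.
func collectPtrs(ids []string) []*string {
	ptrs := make([]*string, 0, len(ids))
	for _, id := range ids {
		id := id // per-iteration copy; &id is now unique to this iteration
		ptrs = append(ptrs, &id)
	}
	return ptrs
}

func main() {
	for _, p := range collectPtrs([]string{"19", "22"}) {
		fmt.Println(*p)
	}
}
```

Go 1.22 later changed the loop-variable scoping so each iteration gets a fresh variable, which is why findings like the one above are version-sensitive.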
4,067 | 19,073,317,965 | IssuesEvent | 2021-11-27 09:44:33 | WarenGonzaga/css-text-portrait-builder | https://api.github.com/repos/WarenGonzaga/css-text-portrait-builder | closed | replace license to GPL-3.0 | chore maintainers only | The reason is that my project is not a library or framework. It is a builder a TOOL which means not fit for MIT license. I'm gonna change this to GPL-3.0 to match the purpose of the project. 👌 | True | replace license to GPL-3.0 - The reason is that my project is not a library or framework. It is a builder a TOOL which means not fit for MIT license. I'm gonna change this to GPL-3.0 to match the purpose of the project. 👌 | main | replace license to gpl the reason is that my project is not a library or framework it is a builder a tool which means not fit for mit license i m gonna change this to gpl to match the purpose of the project 👌 | 1 |
34,484 | 2,781,409,937 | IssuesEvent | 2015-05-06 13:14:25 | facelessuser/ApplySyntax | https://api.github.com/repos/facelessuser/ApplySyntax | closed | Scan extensions as first rule | Enhancement Priority - Low | When using `extensions` scan first before evaluating rules and skip rules if there was a match. | 1.0 | Scan extensions as first rule - When using `extensions` scan first before evaluating rules and skip rules if there was a match. | non_main | scan extensions as first rule when using extensions scan first before evaluating rules and skip rules if there was a match | 0 |
226,969 | 17,369,165,354 | IssuesEvent | 2021-07-30 11:37:50 | allinbits/cosmos-cash | https://api.github.com/repos/allinbits/cosmos-cash | closed | Docs: Implement CODEOWNER files | documentation | ## Objective
This issue is raised to implement the specific recommendations of ADR-002, specifically the [CODEOWNER files](https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/creating-a-repository-on-github/about-code-owners)
## Tasks
- [ ] Create top-level CODEOWNERS file
- [ ] In CODEOWNERS file add Barrie as a reviewer for docs/ folders | 1.0 | Docs: Implement CODEOWNER files - ## Objective
This issue is raised to implement the specific recommendations of ADR-002, specifically the [CODEOWNER files](https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/creating-a-repository-on-github/about-code-owners)
## Tasks
- [ ] Create top-level CODEOWNERS file
- [ ] In CODEOWNERS file add Barrie as a reviewer for docs/ folders | non_main | docs implement codeowner files objective this issue is raised to implement the specific recommendations of adr specifically the tasks create top level codeowners file in codeowners file add barrie as a reviewer for docs folders | 0 |
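Editor's note on the CODEOWNERS task list above: GitHub reads a `CODEOWNERS` file from the repository root, `.github/`, or `docs/`, and the last pattern matching a changed path decides which reviewers are requested. A hypothetical fragment for the two tasks — the handles `@allinbits/maintainers` and `@barrie` are placeholders, since the issue only names "Barrie":

```
# Each line: <path pattern> <owner...>; the last matching pattern wins.
# Default owners for everything in the repository (placeholder team handle).
*        @allinbits/maintainers

# Changes under docs/ request review from Barrie (placeholder handle).
docs/    @barrie
```

This is a sketch of the mechanism, not the file the team eventually committed.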
1,386 | 6,011,674,702 | IssuesEvent | 2017-06-06 15:39:27 | ansible/ansible-modules-extras | https://api.github.com/repos/ansible/ansible-modules-extras | closed | win_chocolatey error: "The package was not found with the source listed" | affects_2.0 feature_idea waiting_on_maintainer windows | ##### Issue Type:
- Feature Idea
##### Component Name:
win_chocolatey module
##### Ansible Version:
```
ansible 2.0.1.0
```
##### Ansible Configuration:
My `ansible.cfg` file:
[defaults]
hostfile = ./hosts
My `hosts` file:
[win_clones]
windows10 ansible_host=172.18.4.16 ansible_password="pass"
My `group_vars/win_clones` file:
ansible_user: diego
ansible_port: 5986
ansible_connection: winrm
ansible_winrm_server_cert_validation: ignore
My `choco_test.yml` playbook:
- name: Test
  hosts: windows10
  tasks:
    - name: Install/Uninstall
      win_chocolatey:
        name: vlc
        state: present
##### Environment:
Host = ubuntu 15.10 VM
Client = windows10 VM
I use a shared folder on NAS at : \174.18.4.17\choco-repo\choco-repo\
##### Summary:
I would like to use win_chocolatey module to pull vlc package present on my shared folder using Ansible. I therefore run:
ansible-playbook choco_test.yml
##### Actual Results:
I get the following error:
```
fatal: [windows10]: FAILED! => {"changed": false, "choco_error_cmd": "choco.exe install -dv -y vlc -source \\\\172.18.4.15\\choco-repo\\choco-repo\\", "choco_error_log": "Chocolatey is running on Windows v 10.0.10586.0 Attempting to delete file \"C:/ProgramData/chocolatey/choco.exe.old\". Attempting to delete file \"C:\\ProgramData\\chocolatey\\choco.exe.old\". Command line: \"C:\\ProgramData\\chocolatey\\choco.exe\" install -dv -y vlc -source \\\\172.18.4.15\\choco-repo\\choco-repo\\ Received arguments: install -dv -y vlc -source \\\\172.18.4.15\\choco-repo\\choco-repo\\ NOTE: Hiding sensitive configuration data! Please double and triple check to be sure no sensitive data is shown, especially if copying output to a gist for review. Configuration: CommandName='install'| CacheLocation='C:\\Users\\diego\\AppData\\Local\\Temp'| ContainsLegacyPackageInstalls='True'| CommandExecutionTimeoutSeconds='2700'| Sources='\\\\172.18.4.15\\choco-repo\\choco-repo\\'|Debug='True'| Verbose='True'|Force='False'|Noop='False'|HelpRequested='False'| RegularOutput='True'|QuietOutput='False'|PromptForConfirmation='False'| AcceptLicense='True'|AllowUnofficialBuild='False'|Input='vlc'| AllVersions='False'|SkipPackageInstallProvider='False'| PackageNames='vlc'|Prerelease='False'|ForceX86='False'| OverrideArguments='False'|NotSilent='False'|IgnoreDependencies='False'| AllowMultipleVersions='False'|AllowDowngrade='False'| ForceDependencies='False'|Information.PlatformType='Windows'| Information.PlatformVersion='10.0.10586.0'| Information.PlatformName='Windows'| Information.ChocolateyVersion='0.9.9.11'| Information.ChocolateyProductVersion='0.9.9.11'| Information.FullName='choco, Version=0.9.9.11, Culture=neutral, PublicKeyToken=79d02ea9cad655eb'| Information.Is64Bit='True'|Information.IsInteractive='False'| Information.IsUserAdministrator='True'| Information.IsProcessElevated='True'|Features.AutoUninstaller='True'| Features.CheckSumFiles='True'|Features.FailOnAutoUninstaller='False'|
ListCommand.LocalOnly='False'| ListCommand.IncludeRegistryPrograms='False'| UpgradeCommand.FailOnUnfound='False'| UpgradeCommand.FailOnNotInstalled='False'| UpgradeCommand.NotifyOnlyAvailableUpgrades='False'| NewCommand.AutomaticPackage='False'|SourceCommand.Command='unknown'| SourceCommand.Priority='0'|FeatureCommand.Command='unknown'| ConfigCommand.Command='unknown'|PushCommand.TimeoutInSeconds='0'| PinCommand.Command='unknown'| _ Chocolatey:ChocolateyInstallCommand - Normal Run Mode _ Installing the following packages: vlc By installing you accept licenses for the packages. vlc not installed. The package was not found with the source(s) listed. If you specified a particular version and are receiving this message, it is possible that the package name exists but the version does not. Version: \"\" Source(s): \"\\\\172.18.4.15\\choco-repo\\choco-repo\\\" Chocolatey installed 0/1 package(s). 1 package(s) failed. See the log for details (C:\\ProgramData\\chocolatey\\logs\\chocolatey.log). Failures: - vlc Exiting with 1", "failed": true, "msg": "Error installing vlc"}
```
NOTE: I can pull vlc package from the shared folder using chocolatey directly from the client by running:
```
choco install vlc -y
```
Any help?
| True | win_chocolatey error: "The package was not found with the source listed" - ##### Issue Type:
- Feature Idea
##### Component Name:
win_chocolatey module
##### Ansible Version:
```
ansible 2.0.1.0
```
##### Ansible Configuration:
My `ansible.cfg` file:
[defaults]
hostfile = ./hosts
My `hosts` file:
[win_clones]
windows10 ansible_host=172.18.4.16 ansible_password="pass"
My `group_vars/win_clones` file:
ansible_user: diego
ansible_port: 5986
ansible_connection: winrm
ansible_winrm_server_cert_validation: ignore
My `choco_test.yml` playbook:
- name: Test
  hosts: windows10
  tasks:
    - name: Install/Uninstall
      win_chocolatey:
        name: vlc
        state: present
##### Environment:
Host = ubuntu 15.10 VM
Client = windows10 VM
I use a shared folder on NAS at : \174.18.4.17\choco-repo\choco-repo\
##### Summary:
I would like to use win_chocolatey module to pull vlc package present on my shared folder using Ansible. I therefore run:
ansible-playbook choco_test.yml
##### Actual Results:
I get the following error:
```
fatal: [windows10]: FAILED! => {"changed": false, "choco_error_cmd": "choco.exe install -dv -y vlc -source \\\\172.18.4.15\\choco-repo\\choco-repo\\", "choco_error_log": "Chocolatey is running on Windows v 10.0.10586.0 Attempting to delete file \"C:/ProgramData/chocolatey/choco.exe.old\". Attempting to delete file \"C:\\ProgramData\\chocolatey\\choco.exe.old\". Command line: \"C:\\ProgramData\\chocolatey\\choco.exe\" install -dv -y vlc -source \\\\172.18.4.15\\choco-repo\\choco-repo\\ Received arguments: install -dv -y vlc -source \\\\172.18.4.15\\choco-repo\\choco-repo\\ NOTE: Hiding sensitive configuration data! Please double and triple check to be sure no sensitive data is shown, especially if copying output to a gist for review. Configuration: CommandName='install'| CacheLocation='C:\\Users\\diego\\AppData\\Local\\Temp'| ContainsLegacyPackageInstalls='True'| CommandExecutionTimeoutSeconds='2700'| Sources='\\\\172.18.4.15\\choco-repo\\choco-repo\\'|Debug='True'| Verbose='True'|Force='False'|Noop='False'|HelpRequested='False'| RegularOutput='True'|QuietOutput='False'|PromptForConfirmation='False'| AcceptLicense='True'|AllowUnofficialBuild='False'|Input='vlc'| AllVersions='False'|SkipPackageInstallProvider='False'| PackageNames='vlc'|Prerelease='False'|ForceX86='False'| OverrideArguments='False'|NotSilent='False'|IgnoreDependencies='False'| AllowMultipleVersions='False'|AllowDowngrade='False'| ForceDependencies='False'|Information.PlatformType='Windows'| Information.PlatformVersion='10.0.10586.0'| Information.PlatformName='Windows'| Information.ChocolateyVersion='0.9.9.11'| Information.ChocolateyProductVersion='0.9.9.11'| Information.FullName='choco, Version=0.9.9.11, Culture=neutral, PublicKeyToken=79d02ea9cad655eb'| Information.Is64Bit='True'|Information.IsInteractive='False'| Information.IsUserAdministrator='True'| Information.IsProcessElevated='True'|Features.AutoUninstaller='True'| Features.CheckSumFiles='True'|Features.FailOnAutoUninstaller='False'|
ListCommand.LocalOnly='False'| ListCommand.IncludeRegistryPrograms='False'| UpgradeCommand.FailOnUnfound='False'| UpgradeCommand.FailOnNotInstalled='False'| UpgradeCommand.NotifyOnlyAvailableUpgrades='False'| NewCommand.AutomaticPackage='False'|SourceCommand.Command='unknown'| SourceCommand.Priority='0'|FeatureCommand.Command='unknown'| ConfigCommand.Command='unknown'|PushCommand.TimeoutInSeconds='0'| PinCommand.Command='unknown'| _ Chocolatey:ChocolateyInstallCommand - Normal Run Mode _ Installing the following packages: vlc By installing you accept licenses for the packages. vlc not installed. The package was not found with the source(s) listed. If you specified a particular version and are receiving this message, it is possible that the package name exists but the version does not. Version: \"\" Source(s): \"\\\\172.18.4.15\\choco-repo\\choco-repo\\\" Chocolatey installed 0/1 package(s). 1 package(s) failed. See the log for details (C:\\ProgramData\\chocolatey\\logs\\chocolatey.log). Failures: - vlc Exiting with 1", "failed": true, "msg": "Error installing vlc"}
```
NOTE: I can pull vlc package from the shared folder using chocolatey directly from the client by running:
```
choco install vlc -y
```
Any help?
 | main | win chocolatey error the package was not found with the source listed issue type feature idea component name win chocolatey module ansible version ansible ansible configuration my ansible cfg file hostfile hosts my hosts file ansible host ansible password pass my group vars win clones file ansible user diego ansible port ansible connection winrm ansible winrm server cert validation ignore my choco test yml playbook name test hosts tasks name install uninstall win chocolatey name vlc state present environment host ubuntu vm client vm i use a shared folder on nas at choco repo choco repo summary i would like to use win chocolatey module to pull vlc package present on my shared folder using ansible i therefore run ansible playbook choco test yml actual results i get the following error fatal failed changed false choco error cmd choco exe install dv y vlc source choco repo choco repo choco error log chocolatey is running on windows v attempting to delete file c programdata chocolatey choco exe old attempting to delete file c programdata chocolatey choco exe old command line c programdata chocolatey choco exe install dv y vlc source choco repo choco repo received arguments install dv y vlc source choco repo choco repo note hiding sensitive configuration data please double and triple check to be sure no sensitive data is shown especially if copying output to a gist for review configuration commandname install cachelocation c users diego appdata local temp containslegacypackageinstalls true commandexecutiontimeoutseconds sources choco repo choco repo debug true verbose true force false noop false helprequested false regularoutput true quietoutput false promptforconfirmation false acceptlicense true allowunofficialbuild false input vlc allversions false skippackageinstallprovider false packagenames vlc prerelease false false overridearguments false notsilent false ignoredependencies false allowmultipleversions false allowdowngrade false forcedependencies false
information platformtype windows information platformversion information platformname windows information chocolateyversion information chocolateyproductversion information fullname choco version culture neutral publickeytoken information true information isinteractive false information isuseradministrator true information isprocesselevated true features autouninstaller true features checksumfiles true features failonautouninstaller false listcommand localonly false listcommand includeregistryprograms false upgradecommand failonunfound false upgradecommand failonnotinstalled false upgradecommand notifyonlyavailableupgrades false newcommand automaticpackage false sourcecommand command unknown sourcecommand priority featurecommand command unknown configcommand command unknown pushcommand timeoutinseconds pincommand command unknown chocolatey chocolateyinstallcommand normal run mode installing the following packages vlc by installing you accept licenses for the packages vlc not installed the package was not found with the source s listed if you specified a particular version and are receiving this message it is possible that the package name exists but the version does not version source s choco repo choco repo chocolatey installed package s package s failed see the log for details c programdata chocolatey logs chocolatey log failures vlc exiting with failed true msg error installing vlc note i can pull vlc package from the shared folder using chocolatey directly from the client by running choco install vlc y any help | 1 |
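Editor's note on the win_chocolatey report above: the module accepts a `source` option, which is passed through to `choco install -source …`, so the feed can be pinned in the task itself rather than relying on the client's configured sources. A hedged sketch reusing the share path from the issue's error output — whether that share actually contains a packaged `vlc` `.nupkg` is exactly what the "not found with the source(s) listed" message calls into question:

```yaml
- name: Install VLC from the internal share
  win_chocolatey:
    name: vlc
    state: present
    # UNC path taken from the error output above; adjust to the real feed.
    source: \\172.18.4.15\choco-repo\choco-repo
```

Since a plain `choco install vlc -y` run directly on the client succeeds, comparing the client's `choco source list` output against the value Ansible passes would be a reasonable first check.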
10,803 | 8,728,997,344 | IssuesEvent | 2018-12-10 18:59:00 | aspnet/AspNetCore | https://api.github.com/repos/aspnet/AspNetCore | closed | Crossgen the linux-arm, linux-arm64 and linux-musl shared runtime | 3 - Done area-infrastructure cost: M | _From @natemcmaster on Thursday, April 12, 2018 8:22:54 AM_
Currently, the linux-arm shared runtime is not preoptimized via crossgen. This was not added in the first round because
- we currently lack the infrastructure to run crossgen via CI
- did not have time to validate a crossgen version works correctly
- the value of crossgen perf for linux-arm was not clear (we have a hunch, but did not do any perf tests)
_Copied from original issue: aspnet/Universe#1061_ | 1.0 | Crossgen the linux-arm, linux-arm64 and linux-musl shared runtime - _From @natemcmaster on Thursday, April 12, 2018 8:22:54 AM_
Currently, the linux-arm shared runtime is not preoptimized via crossgen. This was not added in the first round because
- we currently lack the infrastructure to run crossgen via CI
- did not have time to validate a crossgen version works correctly
- the value of crossgen perf for linux-arm was not clear (we have a hunch, but did not do any perf tests)
_Copied from original issue: aspnet/Universe#1061_ | non_main | crossgen the linux arm linux and linux musl shared runtime from natemcmaster on thursday april am currently the linux arm shared runtime is not preoptimized via crossgen this was not added in the first round because we currently lack the infrastructure to run crossgen via ci did not have time to validate a crossgen version works correctly the value of crossgen perf for linux arm was not clear we have a hunch but did not do any perf tests copied from original issue aspnet universe | 0
207,480 | 7,130,290,632 | IssuesEvent | 2018-01-22 05:35:08 | hasura/support | https://api.github.com/repos/hasura/support | closed | Error in auth docs | hasura/docs priority/urgent | [Sign up auth docs](https://hasura.io/_docs/auth/4.0/swagger-ui/#!/anonymous/post_signup) mentions that it requires either username, phone or email

But when a request is made to the signup endpoint without username, it returns **_"Expected username to not be blank"_**.

**Feature request:** Signup without a username, with just email + password or mobile + password | 1.0 | Error in auth docs - [Sign up auth docs](https://hasura.io/_docs/auth/4.0/swagger-ui/#!/anonymous/post_signup) mentions that it requires either username, phone or email

But when a request is made to the signup endpoint without username, it returns **_"Expected username to not be blank"_**.

**Feature request:** Signup without a username, with just email + password or mobile + password | non_main | error in auth docs mentions that it requires either username phone or email but when a request is made to the signup endpoint without username it returns expected username to not be blank feature request signup without a username with just email password or mobile password | 0
166,747 | 12,970,986,039 | IssuesEvent | 2020-07-21 10:10:47 | WoWManiaUK/Redemption | https://api.github.com/repos/WoWManiaUK/Redemption | closed | Karazhan Opera ewent loot | Fix - Tester Confirmed | There is a problem with Karazan opera ewent loot. Ewery bos in krazan drops 2 items per kill, but opera drops only one and it is always from shared loot. Items wich should drop depend of wich bos u kill dont drop at all. | 1.0 | Karazhan Opera ewent loot - There is a problem with Karazan opera ewent loot. Ewery bos in krazan drops 2 items per kill, but opera drops only one and it is always from shared loot. Items wich should drop depend of wich bos u kill dont drop at all. | non_main | karazhan opera ewent loot there is a problem with karazan opera ewent loot ewery bos in krazan drops items per kill but opera drops only one and it is always from shared loot items wich should drop depend of wich bos u kill dont drop at all | 0 |
7,187 | 10,565,340,852 | IssuesEvent | 2019-10-05 10:31:05 | isawnyu/isaw.web | https://api.github.com/repos/isawnyu/isaw.web | opened | ensure all amheida videos are accessible before publication | WCAG A enhancement manual content work requirement | _governing epic: #408 migrate amheida content from standalone website to the research section of isaw web_
There are videos in the Amheida website content. General experience suggests that it's likely that none of these videos have been prepared properly for accessibility (i.e., with transcripts or closed captioning). The latter condition is a WCAG violation and must be corrected before the Amheida materials can be published.
Steps:
- [ ] determine which videos need remediation (make a list)
- [ ] ask Amheida team to prepare and provide transcripts or add closed captioning
- [ ] if Amheida team provides transcripts, upload these through the plone and add appropriate tagging to alert screen-reader users to the relationship
| 1.0 | ensure all amheida videos are accessible before publication - _governing epic: #408 migrate amheida content from standalone website to the research section of isaw web_
There are videos in the Amheida website content. General experience suggests that it's likely that none of these videos have been prepared properly for accessibility (i.e., with transcripts or closed captioning). The latter condition is a WCAG violation and must be corrected before the Amheida materials can be published.
Steps:
- [ ] determine which videos need remediation (make a list)
- [ ] ask Amheida team to prepare and provide transcripts or add closed captioning
- [ ] if Amheida team provides transcripts, upload these through the plone and add appropriate tagging to alert screen-reader users to the relationship
| non_main | ensure all amheida videos are accessible before publication governing epic migrate amheida content from standalone website to the research section of isaw web there are videos in the amheida website content general experience suggests that it s likely that none of these videos have been prepared properly for accessibility i e with transcripts or closed captioning the latter condition is a wcag violation and must be corrected before the amheida materials can be published steps determine which videos need remediation make a list ask amheida team to prepare and provide transcripts or add closed captioning if amheida team provides transcripts upload these through the plone and add appropriate tagging to alert screen reader users to the relationship | 0 |
4,807 | 24,760,748,889 | IssuesEvent | 2022-10-21 23:41:33 | deislabs/spiderlightning | https://api.github.com/repos/deislabs/spiderlightning | opened | Can't create new rust project | 🐛 bug 🚧 maintainer issue | **Description of the bug**
When I try to create a new project, I get this error:
```
agracey@desktop:~/Projects/scratchpad> slight new -n slisp@v0.1 rust
Error: failed to iterate over archive
Caused by:
failed to fill whole buffer
```
**To Reproduce**
Install slight using https://deislabs.github.io/spiderlightning/#how-to-install-on-macos-and-linux and run the command above. I saw the same output on both my M1 mac and x86 linux computers.
**Additional context**
Both have cargo/rust installed using rustup.
```
agracey@desktop:~/Projects/scratchpad> rustup show
Default host: x86_64-unknown-linux-gnu
rustup home: /home/agracey/.rustup
stable-x86_64-unknown-linux-gnu (default)
rustc 1.64.0 (a55dd71d5 2022-09-19)
```
I tried in both fish and bash. | True | Can't create new rust project - **Description of the bug**
When I try to create a new project, I get this error:
```
agracey@desktop:~/Projects/scratchpad> slight new -n slisp@v0.1 rust
Error: failed to iterate over archive
Caused by:
failed to fill whole buffer
```
**To Reproduce**
Install slight using https://deislabs.github.io/spiderlightning/#how-to-install-on-macos-and-linux and run the command above. I saw the same output on both my M1 mac and x86 linux computers.
**Additional context**
Both have cargo/rust installed using rustup.
```
agracey@desktop:~/Projects/scratchpad> rustup show
Default host: x86_64-unknown-linux-gnu
rustup home: /home/agracey/.rustup
stable-x86_64-unknown-linux-gnu (default)
rustc 1.64.0 (a55dd71d5 2022-09-19)
```
I tried in both fish and bash. | main | can t create new rust project description of the bug when i try to create a new project i get this error agracey desktop projects scratchpad slight new n slisp rust error failed to iterate over archive caused by failed to fill whole buffer to reproduce install slight using and run the command above i saw the same output on both my mac and linux computers additional context both have cargo rust installed using rustup agracey desktop projects scratchpad rustup show default host unknown linux gnu rustup home home agracey rustup stable unknown linux gnu default rustc i tried in both fish and bash | 1 |
11,647 | 14,501,161,222 | IssuesEvent | 2020-12-11 19:05:50 | GoogleCloudPlatform/cloud-opensource-java | https://api.github.com/repos/GoogleCloudPlatform/cloud-opensource-java | closed | Shall we stop the Circle CI build? | process | Shall we remove Circle CI build, for the following reasons?
1. This repository has used Circle CI for Linux (Java 8) build. With the recently-introduced Github Actions that run for Java 8 (and Java 11) on Linux, I don't think we need Circle CI build.
2. Circle CI requires permission to see logs.
<img width="1345" alt="Screen Shot 2020-12-04 at 15 11 08" src="https://user-images.githubusercontent.com/28604/101221098-fcb7f880-3654-11eb-8c37-9fa2f67552c4.png">
3. CI with Github Actions is in line with other Google Cloud Java repositories maintained by Yoshi team.
# Memo on disabling Circle CI
https://support.circleci.com/hc/en-us/articles/360021666393-How-to-stop-building-by-manually-removing-the-CircleCI-webhook-and-deploy-key-from-your-GitHub-repository
- Updated https://circleci.com/hooks/github webhook as inactive.
- Updated protected branch
- not to require Circle CI
- added "units (8)" as required check. | 1.0 | Shall we stop the Circle CI build? - Shall we remove Circle CI build, for the following reasons?
1. This repository has used Circle CI for Linux (Java 8) build. With the recently-introduced Github Actions that run for Java 8 (and Java 11) on Linux, I don't think we need Circle CI build.
2. Circle CI requires permission to see logs.
<img width="1345" alt="Screen Shot 2020-12-04 at 15 11 08" src="https://user-images.githubusercontent.com/28604/101221098-fcb7f880-3654-11eb-8c37-9fa2f67552c4.png">
3. CI with Github Actions is in line with other Google Cloud Java repositories maintained by Yoshi team.
# Memo on disabling Circle CI
https://support.circleci.com/hc/en-us/articles/360021666393-How-to-stop-building-by-manually-removing-the-CircleCI-webhook-and-deploy-key-from-your-GitHub-repository
- Updated https://circleci.com/hooks/github webhook as inactive.
- Updated protected branch
- not to require Circle CI
- added "units (8)" as required check. | non_main | shall we stop the circle ci build shall we remove circle ci build for the following reasons this repository has used circle ci for linux java build with the recently introduced github actions that run for java and java on linux i don t think we need circle ci build circle ci requires permission to see logs img width alt screen shot at src ci with github actions is in line with other google cloud java repositories maintained by yoshi team memo on disabling circle ci updated webhook as inactive updated protected branch not to require circle ci added units as required check | 0 |
4,573 | 23,754,230,899 | IssuesEvent | 2022-09-01 00:27:25 | rustsec/advisory-db | https://api.github.com/repos/rustsec/advisory-db | opened | `num-format` Status | Unmaintained | Last release was over three years ago
It is using the old version of itoa: https://github.com/rustsec/advisory-db/issues/1404
Ralf was helpful to ping earlier: https://github.com/bcmyers/num-format/issues/29
Maintenance status was asked on 9 Jan 2022
https://github.com/bcmyers/num-format/issues/27
@bcmyers - I see. I wonder whether people should still be using this crate today, and whether it would be helpful to bump the itoa dependency? | True | `num-format` Status - Last release was over three years ago
It is using the old version of itoa: https://github.com/rustsec/advisory-db/issues/1404
Ralf was helpful to ping earlier: https://github.com/bcmyers/num-format/issues/29
Maintenance status was asked on 9 Jan 2022
https://github.com/bcmyers/num-format/issues/27
@bcmyers - I see. I wonder whether people should still be using this crate today, and whether it would be helpful to bump the itoa dependency? | main | num format status last release was over three years ago it is using the old version of itoa ralf was helpful to ping earlier maintenance status was asked on jan bcmyers i see i wonder if people should be still using this crate today and whether it would be helpful to bump the itoa dependency | 1
5,758 | 30,523,133,708 | IssuesEvent | 2023-07-19 09:24:47 | tgstation/tgstation | https://api.github.com/repos/tgstation/tgstation | closed | Revs arent being saved to manifest.txt | Maintainability/Hinders improvements Logging Administration | <!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Round ID: 97283
Check the manifest.txt: https://tgstation13.org/parsed-logs/basil/data/logs/2018/11/18/round-97283/manifest.txt
No head revs or revs listed, even though this was a rev round: https://sb.atlantaned.space/rounds/97283
| True | Revs arent being saved to manifest.txt - <!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Round ID: 97283
Check the manifest.txt: https://tgstation13.org/parsed-logs/basil/data/logs/2018/11/18/round-97283/manifest.txt
No head revs or revs listed, even though this was a rev round: https://sb.atlantaned.space/rounds/97283
| main | revs arent being saved to manifest txt round id check the manifest txt no head revs or revs listed even though this was a rev round | 1 |
3,278 | 12,509,418,999 | IssuesEvent | 2020-06-02 17:02:40 | OpenRefine/OpenRefine | https://api.github.com/repos/OpenRefine/OpenRefine | opened | Check that all i18n messages are used in the interface, in the CI | enhancement localization maintainability | **Is your feature request related to a problem or area of OpenRefine? Please describe.**
It would be nice if the CI could check that all messages ids stored in the translations are actually used in the interface. When changing a part of the UI, it is easy to forget to remove any unused messages. Having this check in place would ensure translators are not wasting their efforts on messages that are never shown.
The script could also check for duplicate message ids in the translation files (#2668).
**Describe the solution you'd like**
A basic prototype for this would just search for each message id in the JS source. There might be cases where we are dynamically constructing the message id, in which case this would falsely report that the message is not used. This could be fixed by adding as a comment all the possible values of the constructed message ids.
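That naive prototype fits in a few lines of Python (file names and layout here are hypothetical, not OpenRefine's actual structure); it also covers the duplicate-id check from #2668:

```python
import json
import pathlib
import re

def check_messages(translation_file: pathlib.Path, source_dir: pathlib.Path):
    """Return (unused_ids, duplicate_ids) for a flat JSON translation file.

    Naive by design: an id counts as 'used' if it appears verbatim anywhere
    in a .js source file, so dynamically constructed ids are reported as
    unused unless they are listed in a comment.
    """
    raw = translation_file.read_text(encoding="utf-8")
    ids = list(json.loads(raw).keys())
    # json.loads silently drops duplicate keys, so detect them on the raw text.
    seen, duplicates = set(), set()
    for key in re.findall(r'"([^"]+)"\s*:', raw):
        if key in seen:
            duplicates.add(key)
        seen.add(key)
    sources = "\n".join(p.read_text(encoding="utf-8")
                        for p in source_dir.rglob("*.js"))
    unused = [i for i in ids if i not in sources]
    return unused, sorted(duplicates)
```

In CI, a non-empty result from either list would fail the build with a message naming the offending ids.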
**Describe alternatives you've considered**
There might be off-the-shelf tooling for this.
**Additional context**
| True | Check that all i18n messages are used in the interface, in the CI - **Is your feature request related to a problem or area of OpenRefine? Please describe.**
It would be nice if the CI could check that all messages ids stored in the translations are actually used in the interface. When changing a part of the UI, it is easy to forget to remove any unused messages. Having this check in place would ensure translators are not wasting their efforts on messages that are never shown.
The script could also check for duplicate message ids in the translation files (#2668).
**Describe the solution you'd like**
A basic prototype for this would just search for each message id in the JS source. There might be cases where we are dynamically constructing the message id, in which case this would falsely report that the message is not used. This could be fixed by adding as a comment all the possible values of the constructed message ids.
**Describe alternatives you've considered**
There might be off-the-shelf tooling for this.
**Additional context**
| main | check that all messages are used in the interface in the ci is your feature request related to a problem or area of openrefine please describe it would be nice if the ci could check that all messages ids stored in the translations are actually used in the interface when changing a part of the ui it is easy to forget to remove any unused messages having this check in place would ensure translators are not wasting their efforts on messages that are never shown the script could also check for duplicate message ids in the translation files describe the solution you d like a basic prototype for this would just search for each message id in the js source there might be cases where we are dynamically constructing the message id in which case this would falsely report that the message is not used this could be fixed by adding as a comment all the possible values of the constructed message ids describe alternatives you ve considered there might be off the shelf tooling for this additional context | 1 |
224,100 | 7,466,653,648 | IssuesEvent | 2018-04-02 11:52:14 | qu4d/symmetrical-pancake | https://api.github.com/repos/qu4d/symmetrical-pancake | closed | Implement mock for blog data | Priority: Hight Type: Feature | Implement fake functionality for getting posts of blog.
Methods should allow getting all posts and getting a post by id or name. | 1.0 | Implement mock for blog data - Implement fake functionality for getting the posts of a blog.
Methods should allow get all posts and get post by id or name. | non_main | implement mock for blog data implement fake functionality for getting posts of blog methods should allow get all posts and get post by id or name | 0 |
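A minimal sketch of the mock described in the issue above, assuming simple dict-shaped posts (the real post model in the repository may differ):

```python
class MockBlogRepository:
    """In-memory stand-in for the blog backend, for use in tests."""

    def __init__(self, posts):
        self._posts = list(posts)

    def get_all(self):
        # Return a copy so callers cannot mutate the mock's state.
        return list(self._posts)

    def get_by_id(self, post_id):
        return next((p for p in self._posts if p["id"] == post_id), None)

    def get_by_name(self, name):
        return next((p for p in self._posts if p["name"] == name), None)


repo = MockBlogRepository([
    {"id": 1, "name": "hello-world", "body": "First post"},
    {"id": 2, "name": "second-post", "body": "More content"},
])
```

Returning `None` for a missing id or name keeps the mock predictable; raising a not-found error instead is an equally valid design if the real backend does so.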
5,337 | 26,933,004,517 | IssuesEvent | 2023-02-07 18:13:30 | OpenRefine/OpenRefine | https://api.github.com/repos/OpenRefine/OpenRefine | opened | Unassign contributors automatically after a delay | maintainability | Our issues labeled "good first issue" tend to attract new contributors, which is great.
However, contributors often abandon the issue without unassigning themselves.
This means that the pool of good first issues available for prospective contributors shrinks artificially.
As a preparation for our participation in Outreachy/GSoC I have cleared assignees of all good first issues today (after checking that they had been assigned for a long time). But I think we should not have to do this manually.
There is a GitHub Action which seems to do just that:
https://github.com/marketplace/actions/unassign-contributor-after-days-of-inactivity
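The core of what such an Action does is a simple date comparison; a sketch of the decision logic (90 days standing in for the suggested delay, field names hypothetical):

```python
from datetime import datetime, timedelta, timezone

def should_unassign(assigned_at, last_activity, now,
                    max_idle=timedelta(days=90)):
    """True when the assignee's most recent sign of life on the issue
    (assignment or activity) is older than the configured delay."""
    newest = max(t for t in (assigned_at, last_activity) if t is not None)
    return now - newest > max_idle

now = datetime(2023, 2, 7, tzinfo=timezone.utc)
# Assigned five months ago with no activity since: stale.
stale = should_unassign(datetime(2022, 9, 1, tzinfo=timezone.utc), None, now)
# Assigned less than three weeks ago: still active.
fresh = should_unassign(datetime(2023, 1, 20, tzinfo=timezone.utc), None, now)
```

Counting any issue activity (not just the assignment date) avoids unassigning contributors who are actively discussing their work.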
Any resistance to trying this out?
As an experiment I would first restrict this to the "good first issue" tag, and set a fairly generous delay - perhaps 3 months? | True | Unassign contributors automatically after a delay - Our issues labeled "good first issue" tend to attract new contributors, which is great.
However, contributors often abandon the issue without unassigning themselves.
This means that the pool of good first issues available for prospective contributors shrinks artificially.
As a preparation for our participation in Outreachy/GSoC I have cleared assignees of all good first issues today (after checking that they had been assigned for a long time). But I think we should not have to do this manually.
There is a GitHub Action which seems to do just that:
https://github.com/marketplace/actions/unassign-contributor-after-days-of-inactivity
Any resistance to trying this out?
As an experiment I would first restrict this to the "good first issue" tag, and set a fairly generous delay - perhaps 3 months? | main | unassign contributors automatically after a delay our issues labeled good first issue tend to attract new contributors which is great however contributors often abandon the issue without unassigning themselves this means that the pool of good first issues available for prospective contributors shrinks artificially as a preparation for our participation in outreachy gsoc i have cleared assignees of all good first issues today after checking that they had been assigned for a long time but i think we should not have to do this manually there is a github action which seems to do just that any resistance to trying this out as an experiment i would first restrict this to the good first issue tag and set a fairly generous delay perhaps months | 1 |
400 | 3,452,366,072 | IssuesEvent | 2015-12-17 03:35:41 | OpenLightingProject/ola | https://api.github.com/repos/OpenLightingProject/ola | opened | Add pychecker static analysis check to Travis | enhancement Maintainability Type-Task | To catch python string format errors among other things. | True | Add pychecker static analysis check to Travis - To catch python string format errors among other things. | main | add pychecker static analysis check to travis to catch python string format errors among other things | 1 |
103,813 | 8,951,033,244 | IssuesEvent | 2019-01-25 12:43:04 | FluentLenium/FluentLenium | https://api.github.com/repos/FluentLenium/FluentLenium | closed | Reorganize integration tests in fluentlenium-core | tests | Cleanup is required here. It's hard to figure out what's tested and what is not. Those are the most important tests from our users' perspective and should be better maintained.
That's a rather enjoyable task :)
https://github.com/FluentLenium/FluentLenium/tree/develop/fluentlenium-core/src/test/java/org/fluentlenium/integration | 1.0 | Reorganize integration tests in fluentlenium-core - Cleanup is required here. It's hard to figure out what's tested and what is not. Those are the most important tests from our users' perspective and should be better maintained.
That's a rather enjoyable task :)
https://github.com/FluentLenium/FluentLenium/tree/develop/fluentlenium-core/src/test/java/org/fluentlenium/integration | non_main | reorganize integration tests in fluentlenium core cleanup is required here it s hard to figure out what s tested and what is not those are the most important tests for our users perspective and should be better maintained that s rather enjoyable task | 0 |
3,113 | 11,893,788,284 | IssuesEvent | 2020-03-29 13:11:58 | callistoteam/status | https://api.github.com/repos/callistoteam/status | closed | 파크봇 복구 중 (ParkBot under recovery) | maintain | - Event Type: Under recovery
- Start Time: 2020-03-29 20:30:00 KST
- End Time: 2020-03-29 22:00:00 KST | True | 파크봇 복구 중 (ParkBot under recovery) - - Event Type: Under recovery
- Start Time: 2020-03-29 20:30:00 KST
- End Time: 2020-03-29 22:00:00 KST | main | 파크봇 복구 중 event type under recovery start time kst end time kst | 1 |
125,128 | 26,598,410,545 | IssuesEvent | 2023-01-23 14:10:19 | eclipse-theia/theia | https://api.github.com/repos/eclipse-theia/theia | closed | vscode: update typings for `activeSignatureHelp` | beginners good first issue vscode | <!-- Please fill out the following content for a feature request. -->
<!-- Please provide a clear description of the feature and any relevant information. -->
#### Feature Description:
The typings for `SignatureHelpContext.activeSignatureHelp` should be updated to align with the VS Code API:
https://github.com/eclipse-theia/theia/blob/1cdfae21899f1efacb2a8caec7f2147129518c8d/packages/plugin/src/theia.d.ts#L7267
The changes should be:
```diff
- readonly activeSignatureHelp?: SignatureHelp;
+ readonly activeSignatureHelp: SignatureHelp | undefined;
``` | 1.0 | vscode: update typings for `activeSignatureHelp` - <!-- Please fill out the following content for a feature request. -->
<!-- Please provide a clear description of the feature and any relevant information. -->
#### Feature Description:
The typings for `SignatureHelpContext.activeSignatureHelp` should be updated to align with the VS Code API:
https://github.com/eclipse-theia/theia/blob/1cdfae21899f1efacb2a8caec7f2147129518c8d/packages/plugin/src/theia.d.ts#L7267
The changes should be:
```diff
- readonly activeSignatureHelp?: SignatureHelp;
+ readonly activeSignatureHelp: SignatureHelp | undefined;
``` | non_main | vscode update typings for activesignaturehelp feature description the typings for signaturehelpcontext activesignaturehelp should be updated to align with the vs code api the changes should be diff readonly activesignaturehelp signaturehelp readonly activesignaturehelp signaturehelp undefined | 0 |
4,880 | 25,041,763,561 | IssuesEvent | 2022-11-04 21:42:36 | OpenRefine/OpenRefine | https://api.github.com/repos/OpenRefine/OpenRefine | closed | Create fully automated kit build | enhancement maintainability packaging | Currently the kit build process is a mix of automated (`./refine dist 3.4`) and manual steps (tag, edit, upload, etc).
### Proposed solution
Integrate as many steps as possible into the automated procedure, including:
- checking prerequisites: Apple & Google credentials, other?
- setting release version string
- tagging release in git
- injecting Google credentials
- downloading/caching Windows & Mac JREs to be bundled
- building all 4 kits - Windows (x2), Mac, Linux
- compressing .dmg
- signing release artifacts
- creating Github release & uploading artefacts (marked as draft until manually reviewed/published)
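One way to wire those steps into a single fail-fast driver (step names taken from the list above; the callables are placeholders, not OpenRefine's real build code):

```python
def run_release(steps):
    """Run ordered (name, action) release steps, stopping at the first
    failure so a half-built kit is never tagged or uploaded."""
    completed = []
    for name, action in steps:
        if not action():
            return completed, name  # what succeeded, and where we stopped
        completed.append(name)
    return completed, None

steps = [
    ("check prerequisites", lambda: True),
    ("set version string", lambda: True),
    ("tag release in git", lambda: False),  # simulate a failure here
    ("build kits", lambda: True),
]
done, failed_at = run_release(steps)
```

Stopping at the first failed step — and reporting exactly which one — is what removes the error-prone guesswork of the current manual checklist.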
### Alternatives considered
We could keep our current mix of manual and automated steps, but the higher the number of manual steps, the more error-prone it will be. | True | Create fully automated kit build - Currently the kit build process is a mix of automated (`./refine dist 3.4`) and manual steps (tag, edit, upload, etc).
### Proposed solution
Integrate as many steps as possible into the automated procedure, including:
- checking prerequisites: Apple & Google credentials, other?
- setting release version string
- tagging release in git
- injecting Google credentials
- downloading/caching Windows & Mac JREs to be bundled
- building all 4 kits - Windows (x2), Mac, Linux
- compressing .dmg
- signing release artifacts
- creating Github release & uploading artefacts (marked as draft until manually reviewed/published)
### Alternatives considered
We could keep our current mix of manual and automated steps, but the higher the number of manual steps, the more error-prone it will be. | main | create fully automated kit build currently the kit build process is a mix of automated refine dist and manual steps tag edit upload etc proposed solution integrate as many steps as possible into the automated procedure including checking prerequisites apple google credentials other setting release version string tagging release in git injecting google credentials downloading caching windows mac jres to be bundled building all kits windows mac linux compressing dmg signing release artifacts creating github release uploading artefacts marked as draft until manually reviewed published alternatives considered we could keep our current mix of manual and automated steps but the higher the number of manual steps the more error prone it will be | 1 |
2,317 | 8,301,778,177 | IssuesEvent | 2018-09-21 12:38:15 | invertase/react-native-firebase | https://api.github.com/repos/invertase/react-native-firebase | closed | Invites is not working in latest version | android await-maintainer-feedback links 🐞 bug | ### Issue
If I run this code:
```js
// create invitation
const invitation = new firebase.invites.Invitation('Title', 'Message');
invitation.setDeepLink('https://je786.app.goo.gl/testing');
// send the invitation
const invitationIds = await firebase.invites().sendInvitation(invitation);
// use the invitationIds as you see fit
```
I get a red screen error:
```
Can only use lower 16 bits for requestCode
```
Changing the value of REQUEST_INVITE in this line to a value below 65,536 makes it work without error:
https://github.com/invertase/react-native-firebase/blob/d2c432f961ee35901c458e37a147105dfff0ec36/android/src/main/java/io/invertase/firebase/invites/RNFirebaseInvites.java#L34
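The error comes from Android's FragmentActivity, which reserves the upper 16 bits of the request code for routing activity results back to fragments; its validation amounts to the following (sketched in Python for illustration — the real check is Java, and the oversized constant below is hypothetical):

```python
def check_for_valid_request_code(request_code):
    """Mirror of FragmentActivity's check: request codes passed through
    startActivityForResult must fit in the lower 16 bits, i.e. be below
    65,536, because the upper bits encode the requesting fragment."""
    if request_code & 0xFFFF0000:
        raise ValueError("Can only use lower 16 bits for requestCode")
    return request_code

ok = check_for_valid_request_code(65535)   # largest accepted value
try:
    check_for_valid_request_code(81364)    # hypothetical too-large constant
    rejected = False
except ValueError:
    rejected = True
```

So any fix just needs REQUEST_INVITE (and any other request-code constants) to stay in the 0–65,535 range.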
### Environment
1. Application Target Platform:
Android
2. Development Operating System:
Ubuntu Linux 16.04
3. Build Tools:
Android Studio 3.1.4
4. `React Native` version:
0.55.3
5. `React Native Firebase` Version:
4.3.8
6. `Firebase` Module:
Invites
7. Are you using `typescript`?
no
| True | Invites is not working in latest version - ### Issue
If I run this code:
```js
// create invitation
const invitation = new firebase.invites.Invitation('Title', 'Message');
invitation.setDeepLink('https://je786.app.goo.gl/testing');
// send the invitation
const invitationIds = await firebase.invites().sendInvitation(invitation);
// use the invitationIds as you see fit
```
I get a red screen error:
```
Can only use lower 16 bits for requestCode
```
Changing the value of REQUEST_INVITE in this line to a value below 65,536 makes it work without error:
https://github.com/invertase/react-native-firebase/blob/d2c432f961ee35901c458e37a147105dfff0ec36/android/src/main/java/io/invertase/firebase/invites/RNFirebaseInvites.java#L34
### Environment
1. Application Target Platform:
Android
2. Development Operating System:
Ubuntu Linux 16.04
3. Build Tools:
Android Studio 3.1.4
4. `React Native` version:
0.55.3
5. `React Native Firebase` Version:
4.3.8
6. `Firebase` Module:
Invites
7. Are you using `typescript`?
no
| main | invites is not working in latest version issue if i run this code js create invitation const invitation new firebase invites invitation title message invitation setdeeplink send the invitation const invitationids await firebase invites sendinvitation invitation use the invitationids as you see fit i get red screen error can only use lower bits for requestcode changing the value of request invite in this line below makes it work without error environment application target platform android development operating system ubuntu linux build tools android studio react native version react native firebase version firebase module invites are you using typescript no | 1 |
289,319 | 24,980,058,213 | IssuesEvent | 2022-11-02 10:57:40 | enonic/app-users | https://api.github.com/repos/enonic/app-users | opened | Add new ui-tests for password field in Change password dialog | Test | Verify issue - Incorrect behaviour in the password field #1308 | 1.0 | Add new ui-tests for password field in Change password dialog - Verify issue - Incorrect behaviour in the password field #1308 | non_main | add new ui tests for password field in change password dialog verify issue incorrect behaviour in the password field | 0 |
353 | 3,256,919,271 | IssuesEvent | 2015-10-20 15:41:04 | DynamoRIO/drmemory | https://api.github.com/repos/DynamoRIO/drmemory | opened | refactor to use drreg | Component-Framework Component-FullMode Maintainability Type-Feature | Not only will refactoring to use drreg clean up the existing code and make it more maintainable (xref #164, xref #243 where it was not easy to safely use reg3 where we wanted to) but it will simultaneously help porting to 64-bit (#111), porting to ARM (#1726), add ymm propagation (#1485), adding shadow race options (#195), and creating a shadow value propagation library (#825).
| True | refactor to use drreg - Not only will refactoring to use drreg clean up the existing code and make it more maintainable (xref #164, xref #243 where it was not easy to safely use reg3 where we wanted to) but it will simultaneously help porting to 64-bit (#111), porting to ARM (#1726), add ymm propagation (#1485), adding shadow race options (#195), and creating a shadow value propagation library (#825).
| main | refactor to use drreg not only will refactoring to use drreg clean up the existing code and make it more maintainable xref xref where it was not easy to safely use where we wanted to but it will simultaneously help porting to bit porting to arm add ymm propagation adding shadow race options and creating a shadow value propagation library | 1 |
197,303 | 14,917,698,039 | IssuesEvent | 2021-01-22 20:19:06 | dotnet/runtime | https://api.github.com/repos/dotnet/runtime | opened | We should host our own httpbin server for HttpClient tests | area-System.Net.Http test enhancement | Currently we are using the public server at http://httpbin.org for some decompression tests, see here: https://github.com/dotnet/runtime/blob/master/src/libraries/Common/tests/System/Net/Http/HttpClientHandlerTest.Decompression.cs#L206
Rather than just use the public server, we really should host our own httpbin server in Azure or similar. This should be easy to do. | 1.0 | We should host our own httpbin server for HttpClient tests - Currently we are using the public server at http://httpbin.org for some decompression tests, see here: https://github.com/dotnet/runtime/blob/master/src/libraries/Common/tests/System/Net/Http/HttpClientHandlerTest.Decompression.cs#L206
Rather than just use the public server, we really should host our own httpbin server in Azure or similar. This should be easy to do. | non_main | we should host our own httpbin server for httpclient tests currently we are using the public server at for some decompression tests see here rather than just use the public server we really should host our own httpbin server in azure or similar this should be easy to do | 0 |
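A self-hosted stand-in for the issue above is indeed easy: even httpbin's /gzip behavior is a few lines. A minimal Python sketch of such a local test server (the endpoint shape is assumed, and this is not dotnet's actual test infrastructure):

```python
import gzip
import http.server
import threading
import urllib.request

class GzipHandler(http.server.BaseHTTPRequestHandler):
    """Minimal stand-in for httpbin's /gzip endpoint."""

    def do_GET(self):
        body = gzip.compress(b'{"gzipped": true}')
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Encoding", "gzip")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test runs quiet

# Bind to port 0 so the OS picks a free port; serve on a daemon thread.
server = http.server.HTTPServer(("127.0.0.1", 0), GzipHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/gzip") as resp:
    payload = gzip.decompress(resp.read())  # the client decompresses, as HttpClient would

server.shutdown()
```

Running something like this in-process (or as a small Azure app) removes the flakiness and availability risk of depending on the public httpbin.org.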
5,854 | 31,280,536,636 | IssuesEvent | 2023-08-22 09:18:15 | toolbx-images/images | https://api.github.com/repos/toolbx-images/images | closed | Add distribution: Amazon Linux 2023 | new-image-request maintainers-wanted | ### Distribution name and versions requested
Amazon Linux 2022 no longer exists as listed here. It was superseded by AL2023.
### Where are the official container images from the distribution published?
https://dso.docker.com/images/amazonlinux?platform=linux%2Famd64
### Will you be interested in maintaining this image?
No | True | Add distribution: Amazon Linux 2023 - ### Distribution name and versions requested
Amazon Linux 2022 no longer exists as listed here. It was superseded by AL2023.
### Where are the official container images from the distribution published?
https://dso.docker.com/images/amazonlinux?platform=linux%2Famd64
### Will you be interested in maintaining this image?
No | main | add distribution amazon linux distribution name and versions requested amazon linux no longer exist as is listed here it was surpassed by where are the official container images from the distribution published will you be interested in maintaining this image no | 1 |
881 | 4,543,466,320 | IssuesEvent | 2016-09-10 04:55:31 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | newly created containers always have default network even with purge_networks | affects_2.2 bug_report cloud docker in progress waiting_on_maintainer | ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
docker_container.py
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
root@nuid:/usr/local/nubeva/config# ansible --version
ansible 2.2.0
config file =
configured module search path = Default w/o overrides
```
##### CONFIGURATION
default
##### OS / ENVIRONMENT
docker for mac
##### SUMMARY
When creating a completely new container instance with purge_networks set to "yes", the container is created with the default bridge. The second time that the same playbook is run, the playbook updates the container and removes the default bridge.
##### STEPS TO REPRODUCE
With the following playbook snippet:
```
- name: test container
docker_container:
name: test_ubuntu
image: "{{ registry }}/ubuntu:latest"
detach: True
networks:
- name: trafficentry
- name: trafficexit
purge_networks: yes
```
Run it once on a blank system (or after docker rm -f test_ubuntu). You will see the container is created, but inspecting it shows it is connected to three networks, bridge, trafficentry and trafficexit.
Run it a second time and you will see that the container is now connected to just the trafficentry and trafficexit networks as expected.
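The missing behavior is just set arithmetic over the container's attached networks at create time; a sketch of the check (pure Python over `docker inspect`-shaped data, not the module's actual code):

```python
def networks_to_purge(inspected_networks, requested_networks):
    """Given the 'Networks' mapping from `docker inspect` and the networks
    listed in the task, return what purge_networks should disconnect —
    and this must also run right after creation, not only on update."""
    return sorted(set(inspected_networks) - set(requested_networks))

# State after the first playbook run: Docker attached the default bridge
# in addition to the two requested networks.
attached = {"bridge": {}, "trafficentry": {}, "trafficexit": {}}
to_purge = networks_to_purge(attached, ["trafficentry", "trafficexit"])
```

Applying this disconnect step in the create path as well as the update path would make the first run converge to the same state as the second.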
##### EXPECTED RESULTS
Newly created containers, not just updated containers, should not have the default network connected when purge_networks is configured.
##### ACTUAL RESULTS
After initial run:
```
docker inspect test_ubuntu
<snip>
"NetworkSettings": {
"Bridge": "",
"SandboxID": "bef4d5559dfcf8bc98756caea2d3a47c906718f5c0ea3475ea7e4ab70f68c3ea",
"HairpinMode": false,
"LinkLocalIPv6Address": "",
"LinkLocalIPv6PrefixLen": 0,
"Ports": {},
"SandboxKey": "/var/run/docker/netns/bef4d5559dfc",
"SecondaryIPAddresses": null,
"SecondaryIPv6Addresses": null,
"EndpointID": "c93709aa1119526627933435e47352230ecaa73445ddf0556d8ef1ae6211d351",
"Gateway": "172.18.0.1",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"IPAddress": "172.18.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"MacAddress": "02:42:ac:12:00:02",
"Networks": {
"bridge": {
"IPAMConfig": null,
"Links": null,
"Aliases": null,
"NetworkID": "239e364714b16413115d71582c96e05e78b7d135572e393f4fbf3c7c14052511",
"EndpointID": "c93709aa1119526627933435e47352230ecaa73445ddf0556d8ef1ae6211d351",
"Gateway": "172.18.0.1",
"IPAddress": "172.18.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:12:00:02"
},
"trafficentry": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "c87e984a0a94aa11408d6f709e1cc8d35e402323e1de216b2a7958aeb7b7cc53",
"EndpointID": "abc3a2634ba561229f9050b05714b2ecc76b78442ddec48d76fd998d05ca659b",
"Gateway": "172.20.0.1",
"IPAddress": "172.20.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:14:00:02"
},
"trafficexit": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "25c50de46ae3a63ddc36d71e8c24a789052dd0e6bf9487d70f310c00f5688c38",
"EndpointID": "6c7d18d74f66a8e58c4c46c3f0845fb4c8cad30a7eaac14386b1ca7f1159b660",
"Gateway": "172.23.0.1",
"IPAddress": "172.23.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:17:00:02"
}
}
```
After second run:
```
docker inspect test_ubuntu
<snip>
"Networks": {
"trafficentry": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "c87e984a0a94aa11408d6f709e1cc8d35e402323e1de216b2a7958aeb7b7cc53",
"EndpointID": "abc3a2634ba561229f9050b05714b2ecc76b78442ddec48d76fd998d05ca659b",
"Gateway": "172.20.0.1",
"IPAddress": "172.20.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:14:00:02"
},
"trafficexit": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "25c50de46ae3a63ddc36d71e8c24a789052dd0e6bf9487d70f310c00f5688c38",
"EndpointID": "6c7d18d74f66a8e58c4c46c3f0845fb4c8cad30a7eaac14386b1ca7f1159b660",
"Gateway": "172.23.0.1",
"IPAddress": "172.23.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:17:00:02"
}
}
```
| True | newly created containers always have default network even with purge_networks - ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
docker_container.py
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
root@nuid:/usr/local/nubeva/config# ansible --version
ansible 2.2.0
config file =
configured module search path = Default w/o overrides
```
##### CONFIGURATION
default
##### OS / ENVIRONMENT
docker for mac
##### SUMMARY
When creating a completely new container instance with purge_networks set to "yes", the container is created with the default bridge. The second time that the same playbook is run, the playbook updates the container and removes the default bridge.
##### STEPS TO REPRODUCE
With the following playbook snippet:
```
- name: test container
docker_container:
name: test_ubuntu
image: "{{ registry }}/ubuntu:latest"
detach: True
networks:
- name: trafficentry
- name: trafficexit
purge_networks: yes
```
Run it once on a blank system (or after docker rm -f test_ubuntu). You will see the container is created, but inspecting it shows it is connected to three networks, bridge, trafficentry and trafficexit.
Run it a second time and you will see that the container is now connected to just the trafficentry and trafficexit networks as expected.
##### EXPECTED RESULTS
Newly created containers, not just updated containers, should not have the default network connected when purge_networks is configured.
##### ACTUAL RESULTS
After initial run:
```
docker inspect test_ubuntu
<snip>
"NetworkSettings": {
"Bridge": "",
"SandboxID": "bef4d5559dfcf8bc98756caea2d3a47c906718f5c0ea3475ea7e4ab70f68c3ea",
"HairpinMode": false,
"LinkLocalIPv6Address": "",
"LinkLocalIPv6PrefixLen": 0,
"Ports": {},
"SandboxKey": "/var/run/docker/netns/bef4d5559dfc",
"SecondaryIPAddresses": null,
"SecondaryIPv6Addresses": null,
"EndpointID": "c93709aa1119526627933435e47352230ecaa73445ddf0556d8ef1ae6211d351",
"Gateway": "172.18.0.1",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"IPAddress": "172.18.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"MacAddress": "02:42:ac:12:00:02",
"Networks": {
"bridge": {
"IPAMConfig": null,
"Links": null,
"Aliases": null,
"NetworkID": "239e364714b16413115d71582c96e05e78b7d135572e393f4fbf3c7c14052511",
"EndpointID": "c93709aa1119526627933435e47352230ecaa73445ddf0556d8ef1ae6211d351",
"Gateway": "172.18.0.1",
"IPAddress": "172.18.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:12:00:02"
},
"trafficentry": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "c87e984a0a94aa11408d6f709e1cc8d35e402323e1de216b2a7958aeb7b7cc53",
"EndpointID": "abc3a2634ba561229f9050b05714b2ecc76b78442ddec48d76fd998d05ca659b",
"Gateway": "172.20.0.1",
"IPAddress": "172.20.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:14:00:02"
},
"trafficexit": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "25c50de46ae3a63ddc36d71e8c24a789052dd0e6bf9487d70f310c00f5688c38",
"EndpointID": "6c7d18d74f66a8e58c4c46c3f0845fb4c8cad30a7eaac14386b1ca7f1159b660",
"Gateway": "172.23.0.1",
"IPAddress": "172.23.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:17:00:02"
}
}
```
After second run:
```
docker inspect test_ubuntu
<snip>
"Networks": {
"trafficentry": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "c87e984a0a94aa11408d6f709e1cc8d35e402323e1de216b2a7958aeb7b7cc53",
"EndpointID": "abc3a2634ba561229f9050b05714b2ecc76b78442ddec48d76fd998d05ca659b",
"Gateway": "172.20.0.1",
"IPAddress": "172.20.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:14:00:02"
},
"trafficexit": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"540bab56ef98"
],
"NetworkID": "25c50de46ae3a63ddc36d71e8c24a789052dd0e6bf9487d70f310c00f5688c38",
"EndpointID": "6c7d18d74f66a8e58c4c46c3f0845fb4c8cad30a7eaac14386b1ca7f1159b660",
"Gateway": "172.23.0.1",
"IPAddress": "172.23.0.2",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:17:00:02"
}
}
```
| main | newly created containers always have default network even with purge networks issue type bug report component name docker container py ansible version root nuid usr local nubeva config ansible version ansible config file configured module search path default w o overrides configuration default os environment docker for mac summary when creating a completely new container instance with purge networks set to yes the container is created with the default bridge the second time that the same playbook is run the playbook updates the container and removes the default bridge steps to reproduce with the following playbook snippet name test container docker container name test ubuntu image registry ubuntu latest detach true networks name trafficentry name trafficexit purge networks yes run it once on a blank system or after docker rm f test ubuntu you will see the container is created but inspecting it shows it is connected to three networks bridge trafficentry and trafficexit run it a second time and you will see that the container is now connected to just the trafficentry and trafficexit networks as expected expected results newly created containers not just updated containers should not have the default network connected when purge networks is configured actual results after initial run docker inspect test ubuntu networksettings bridge sandboxid hairpinmode false ports sandboxkey var run docker netns secondaryipaddresses null null endpointid gateway ipaddress ipprefixlen macaddress ac networks bridge ipamconfig null links null aliases null networkid endpointid gateway ipaddress ipprefixlen macaddress ac trafficentry ipamconfig null links null aliases networkid endpointid gateway ipaddress ipprefixlen macaddress ac trafficexit ipamconfig null links null aliases networkid endpointid gateway ipaddress ipprefixlen macaddress ac after second run docker inspect test ubuntu networks trafficentry ipamconfig null links null aliases networkid endpointid gateway ipaddress 
ipprefixlen macaddress ac trafficexit ipamconfig null links null aliases networkid endpointid gateway ipaddress ipprefixlen macaddress ac | 1 |
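The `purge_networks` expectation described in the report above amounts to a simple reconciliation step: any network the container is attached to but the task did not request — including the default `bridge` — should be disconnected, on first creation as well as on later updates. A minimal sketch of that diff (the function name and data shapes are illustrative, not the module's actual internals):

```python
def networks_to_disconnect(connected, requested):
    """Networks to detach: attached but not requested, order preserved."""
    wanted = {net["name"] for net in requested}
    return [name for name in connected if name not in wanted]

# State after the first playbook run in the report:
connected = ["bridge", "trafficentry", "trafficexit"]
requested = [{"name": "trafficentry"}, {"name": "trafficexit"}]
print(networks_to_disconnect(connected, requested))  # -> ['bridge']
```

Running this diff immediately after creation, not only on subsequent runs, would give the behavior the reporter expects.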
1,122 | 4,990,293,427 | IssuesEvent | 2016-12-08 14:42:42 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | Git module has different behaviors when cloning fresh and updating git repository (unable to update properly) | affects_2.1 bug_report waiting_on_maintainer | ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
Git
##### ANSIBLE VERSION
```
ansible 2.1.1.0
config file =
configured module search path = Default w/o overrides
```
##### CONFIGURATION
n/a
##### OS / ENVIRONMENT
Fedora 24
##### SUMMARY
Seeing different (unexpected) behavior when trying to update an existing repository.
- If a repository is cloned "fresh" with a refspec and FETCH_HEAD (i.e., a Gerrit patchset review), it will work
- If a repository is first cloned from master branch and then updated with the refspec and FETCH_HEAD, it will not work
##### STEPS TO REPRODUCE
Full reproducer gist: https://gist.github.com/dmsimard/be06e4269ab094952db18da88c9ba70f
tl;dr:
```
# Bash works
git clone https://git.openstack.org/openstack/puppet-openstack-integration; cd puppet-openstack-integration
git fetch https://git.openstack.org/openstack/puppet-openstack-integration refs/changes/60/337860/26 && git checkout FETCH_HEAD
# This works
ansible -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
# This doesn't work
ansible -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration"
ansible -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
```
##### EXPECTED RESULTS
Expecting Ansible to be able to fetch a refspec for a repository that has already been cloned and checkout the FETCH_HEAD reference.
##### ACTUAL RESULTS
```
# This works
(openstack)┬─[dmsimard@hostname:/tmp]─[03:28:51 PM]
╰─>$ rm -rf puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp]─[03:28:57 PM]
╰─>$ ansible -vvv -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
No config file found; using defaults
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dmsimard
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450 `" && echo ansible-tmp-1471290547.98-200796271054450="` echo $HOME/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450 `" ) && sleep 0'
<localhost> PUT /tmp/tmpeHsdHm TO /home/dmsimard/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450/git
<localhost> EXEC /bin/sh -c 'LANG=en_CA.UTF-8 LC_ALL=en_CA.UTF-8 LC_MESSAGES=en_CA.UTF-8 /usr/bin/python /home/dmsimard/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450/git; rm -rf "/home/dmsimard/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450/" > /dev/null 2>&1 && sleep 0'
localhost | SUCCESS => {
"after": "a9b53b7acfc0146c931ea95cb2168d165f5edbbd",
"before": null,
"changed": true,
"invocation": {
"module_args": {
"accept_hostkey": false,
"bare": false,
"clone": true,
"depth": null,
"dest": "/tmp/puppet-openstack-integration",
"executable": null,
"force": false,
"key_file": null,
"recursive": true,
"reference": null,
"refspec": "refs/changes/60/337860/26",
"remote": "origin",
"repo": "https://git.openstack.org/openstack/puppet-openstack-integration",
"ssh_opts": null,
"track_submodules": false,
"update": true,
"verify_commit": false,
"version": "FETCH_HEAD"
},
"module_name": "git"
},
"warnings": []
}
(openstack)┬─[dmsimard@hostname:/tmp]─[03:49:11 PM]
╰─>$ cd puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp/puppet-openstack-integration]─[03:49:26 PM]
╰─>$ git log --pretty=format:"%h%x09%an%x09%ad%x09%s" -n2
a9b53b7 David Moreau-Simard Tue Jul 5 16:45:23 2016 -0400 Add designate test coverage to scenario003
f8aa97d Jenkins Mon Aug 8 11:34:27 2016 +0000 Merge "In-process token caching is deprecated, use memcached instead"
# This doesn't
(openstack)┬─[dmsimard@hostname:/tmp]─[03:50:21 PM]
╰─>$ rm -rf puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp]─[03:50:58 PM]
╰─>$ ansible -vvv -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration"
No config file found; using defaults
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dmsimard
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449 `" && echo ansible-tmp-1471290721.44-129257258043449="` echo $HOME/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449 `" ) && sleep 0'
<localhost> PUT /tmp/tmpn03XKS TO /home/dmsimard/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449/git
<localhost> EXEC /bin/sh -c 'LANG=en_CA.UTF-8 LC_ALL=en_CA.UTF-8 LC_MESSAGES=en_CA.UTF-8 /usr/bin/python /home/dmsimard/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449/git; rm -rf "/home/dmsimard/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449/" > /dev/null 2>&1 && sleep 0'
localhost | SUCCESS => {
"after": "da748ad437bc1f3165929b2b69208f7c58e62699",
"before": null,
"changed": true,
"invocation": {
"module_args": {
"accept_hostkey": false,
"bare": false,
"clone": true,
"depth": null,
"dest": "/tmp/puppet-openstack-integration",
"executable": null,
"force": false,
"key_file": null,
"recursive": true,
"reference": null,
"refspec": null,
"remote": "origin",
"repo": "https://git.openstack.org/openstack/puppet-openstack-integration",
"ssh_opts": null,
"track_submodules": false,
"update": true,
"verify_commit": false,
"version": "HEAD"
},
"module_name": "git"
},
"warnings": []
}
(openstack)┬─[dmsimard@hostname:/tmp]─[03:52:03 PM]
╰─>$ ansible -vvv -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
No config file found; using defaults
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dmsimard
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036 `" && echo ansible-tmp-1471290728.81-173161814030036="` echo $HOME/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036 `" ) && sleep 0'
<localhost> PUT /tmp/tmpq7cdDu TO /home/dmsimard/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036/git
<localhost> EXEC /bin/sh -c 'LANG=en_CA.UTF-8 LC_ALL=en_CA.UTF-8 LC_MESSAGES=en_CA.UTF-8 /usr/bin/python /home/dmsimard/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036/git; rm -rf "/home/dmsimard/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036/" > /dev/null 2>&1 && sleep 0'
localhost | SUCCESS => {
"after": "c2b8906f4779418a6192f4339d79a297b97f328d",
"before": "da748ad437bc1f3165929b2b69208f7c58e62699",
"changed": true,
"invocation": {
"module_args": {
"accept_hostkey": false,
"bare": false,
"clone": true,
"depth": null,
"dest": "/tmp/puppet-openstack-integration",
"executable": null,
"force": false,
"key_file": null,
"recursive": true,
"reference": null,
"refspec": "refs/changes/60/337860/26",
"remote": "origin",
"repo": "https://git.openstack.org/openstack/puppet-openstack-integration",
"ssh_opts": null,
"track_submodules": false,
"update": true,
"verify_commit": false,
"version": "FETCH_HEAD"
},
"module_name": "git"
},
"warnings": []
}
(openstack)┬─[dmsimard@hostname:/tmp]─[03:52:11 PM]
╰─>$ cd puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp/puppet-openstack-integration]─[03:52:16 PM]
╰─>$ git log --pretty=format:"%h%x09%an%x09%ad%x09%s" -n2
c2b8906 Jenkins Mon Nov 2 15:44:19 2015 +0000 Merge "puppetfile: bump corosync to 0.8.0" into stable/kilo
4341f5d Jenkins Mon Nov 2 15:44:01 2015 +0000 Merge "puppetfile: Added corosync module" into stable/kilo
```
| True | Git module has different behaviors when cloning fresh and updating git repository (unable to update properly) - ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
Git
##### ANSIBLE VERSION
```
ansible 2.1.1.0
config file =
configured module search path = Default w/o overrides
```
##### CONFIGURATION
n/a
##### OS / ENVIRONMENT
Fedora 24
##### SUMMARY
Seeing different (unexpected) behavior when trying to update an existing repository.
- If a repository is cloned "fresh" with a refspec and FETCH_HEAD (i.e., a Gerrit patchset review), it will work
- If a repository is first cloned from master branch and then updated with the refspec and FETCH_HEAD, it will not work
##### STEPS TO REPRODUCE
Full reproducer gist: https://gist.github.com/dmsimard/be06e4269ab094952db18da88c9ba70f
tl;dr:
```
# Bash works
git clone https://git.openstack.org/openstack/puppet-openstack-integration; cd puppet-openstack-integration
git fetch https://git.openstack.org/openstack/puppet-openstack-integration refs/changes/60/337860/26 && git checkout FETCH_HEAD
# This works
ansible -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
# This doesn't work
ansible -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration"
ansible -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
```
##### EXPECTED RESULTS
Expecting Ansible to be able to fetch a refspec for a repository that has already been cloned and checkout the FETCH_HEAD reference.
##### ACTUAL RESULTS
```
# This works
(openstack)┬─[dmsimard@hostname:/tmp]─[03:28:51 PM]
╰─>$ rm -rf puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp]─[03:28:57 PM]
╰─>$ ansible -vvv -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
No config file found; using defaults
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dmsimard
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450 `" && echo ansible-tmp-1471290547.98-200796271054450="` echo $HOME/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450 `" ) && sleep 0'
<localhost> PUT /tmp/tmpeHsdHm TO /home/dmsimard/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450/git
<localhost> EXEC /bin/sh -c 'LANG=en_CA.UTF-8 LC_ALL=en_CA.UTF-8 LC_MESSAGES=en_CA.UTF-8 /usr/bin/python /home/dmsimard/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450/git; rm -rf "/home/dmsimard/.ansible/tmp/ansible-tmp-1471290547.98-200796271054450/" > /dev/null 2>&1 && sleep 0'
localhost | SUCCESS => {
"after": "a9b53b7acfc0146c931ea95cb2168d165f5edbbd",
"before": null,
"changed": true,
"invocation": {
"module_args": {
"accept_hostkey": false,
"bare": false,
"clone": true,
"depth": null,
"dest": "/tmp/puppet-openstack-integration",
"executable": null,
"force": false,
"key_file": null,
"recursive": true,
"reference": null,
"refspec": "refs/changes/60/337860/26",
"remote": "origin",
"repo": "https://git.openstack.org/openstack/puppet-openstack-integration",
"ssh_opts": null,
"track_submodules": false,
"update": true,
"verify_commit": false,
"version": "FETCH_HEAD"
},
"module_name": "git"
},
"warnings": []
}
(openstack)┬─[dmsimard@hostname:/tmp]─[03:49:11 PM]
╰─>$ cd puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp/puppet-openstack-integration]─[03:49:26 PM]
╰─>$ git log --pretty=format:"%h%x09%an%x09%ad%x09%s" -n2
a9b53b7 David Moreau-Simard Tue Jul 5 16:45:23 2016 -0400 Add designate test coverage to scenario003
f8aa97d Jenkins Mon Aug 8 11:34:27 2016 +0000 Merge "In-process token caching is deprecated, use memcached instead"
# This doesn't
(openstack)┬─[dmsimard@hostname:/tmp]─[03:50:21 PM]
╰─>$ rm -rf puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp]─[03:50:58 PM]
╰─>$ ansible -vvv -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration"
No config file found; using defaults
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dmsimard
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449 `" && echo ansible-tmp-1471290721.44-129257258043449="` echo $HOME/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449 `" ) && sleep 0'
<localhost> PUT /tmp/tmpn03XKS TO /home/dmsimard/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449/git
<localhost> EXEC /bin/sh -c 'LANG=en_CA.UTF-8 LC_ALL=en_CA.UTF-8 LC_MESSAGES=en_CA.UTF-8 /usr/bin/python /home/dmsimard/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449/git; rm -rf "/home/dmsimard/.ansible/tmp/ansible-tmp-1471290721.44-129257258043449/" > /dev/null 2>&1 && sleep 0'
localhost | SUCCESS => {
"after": "da748ad437bc1f3165929b2b69208f7c58e62699",
"before": null,
"changed": true,
"invocation": {
"module_args": {
"accept_hostkey": false,
"bare": false,
"clone": true,
"depth": null,
"dest": "/tmp/puppet-openstack-integration",
"executable": null,
"force": false,
"key_file": null,
"recursive": true,
"reference": null,
"refspec": null,
"remote": "origin",
"repo": "https://git.openstack.org/openstack/puppet-openstack-integration",
"ssh_opts": null,
"track_submodules": false,
"update": true,
"verify_commit": false,
"version": "HEAD"
},
"module_name": "git"
},
"warnings": []
}
(openstack)┬─[dmsimard@hostname:/tmp]─[03:52:03 PM]
╰─>$ ansible -vvv -i hosts localhost -m git -a "repo=https://git.openstack.org/openstack/puppet-openstack-integration dest=/tmp/puppet-openstack-integration refspec=refs/changes/60/337860/26 version=FETCH_HEAD"
No config file found; using defaults
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dmsimard
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036 `" && echo ansible-tmp-1471290728.81-173161814030036="` echo $HOME/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036 `" ) && sleep 0'
<localhost> PUT /tmp/tmpq7cdDu TO /home/dmsimard/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036/git
<localhost> EXEC /bin/sh -c 'LANG=en_CA.UTF-8 LC_ALL=en_CA.UTF-8 LC_MESSAGES=en_CA.UTF-8 /usr/bin/python /home/dmsimard/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036/git; rm -rf "/home/dmsimard/.ansible/tmp/ansible-tmp-1471290728.81-173161814030036/" > /dev/null 2>&1 && sleep 0'
localhost | SUCCESS => {
"after": "c2b8906f4779418a6192f4339d79a297b97f328d",
"before": "da748ad437bc1f3165929b2b69208f7c58e62699",
"changed": true,
"invocation": {
"module_args": {
"accept_hostkey": false,
"bare": false,
"clone": true,
"depth": null,
"dest": "/tmp/puppet-openstack-integration",
"executable": null,
"force": false,
"key_file": null,
"recursive": true,
"reference": null,
"refspec": "refs/changes/60/337860/26",
"remote": "origin",
"repo": "https://git.openstack.org/openstack/puppet-openstack-integration",
"ssh_opts": null,
"track_submodules": false,
"update": true,
"verify_commit": false,
"version": "FETCH_HEAD"
},
"module_name": "git"
},
"warnings": []
}
(openstack)┬─[dmsimard@hostname:/tmp]─[03:52:11 PM]
╰─>$ cd puppet-openstack-integration/
(openstack)┬─[dmsimard@hostname:/tmp/puppet-openstack-integration]─[03:52:16 PM]
╰─>$ git log --pretty=format:"%h%x09%an%x09%ad%x09%s" -n2
c2b8906 Jenkins Mon Nov 2 15:44:19 2015 +0000 Merge "puppetfile: bump corosync to 0.8.0" into stable/kilo
4341f5d Jenkins Mon Nov 2 15:44:01 2015 +0000 Merge "puppetfile: Added corosync module" into stable/kilo
```
| main | git module has different behaviors when cloning fresh and updating git repository unable to update properly issue type bug report component name git ansible version ansible config file configured module search path default w o overrides configuration n a os environment fedora summary seeing different unexpected behavior when trying to update an existing repository if a repository is cloned fresh with a refspec and fetch head i e a gerrit patchset review it will work if a repository is first cloned from master branch and then updated with the refspec and fetch head it will not work steps to reproduce full reproducer gist tl dr bash works git clone cd puppet openstack integration git fetch refs changes git checkout fetch head this works ansible i hosts localhost m git a repo dest tmp puppet openstack integration refspec refs changes version fetch head this doesn t work ansible i hosts localhost m git a repo dest tmp puppet openstack integration ansible i hosts localhost m git a repo dest tmp puppet openstack integration refspec refs changes version fetch head expected results expecting ansible to be able to fetch a refspec for a repository that has already been cloned and checkout the fetch head reference actual results this works openstack ┬─ ─ ╰─ rm rf puppet openstack integration openstack ┬─ ─ ╰─ ansible vvv i hosts localhost m git a repo dest tmp puppet openstack integration refspec refs changes version fetch head no config file found using defaults establish local connection for user dmsimard exec bin sh c umask mkdir p echo home ansible tmp ansible tmp echo ansible tmp echo home ansible tmp ansible tmp sleep put tmp tmpehsdhm to home dmsimard ansible tmp ansible tmp git exec bin sh c lang en ca utf lc all en ca utf lc messages en ca utf usr bin python home dmsimard ansible tmp ansible tmp git rm rf home dmsimard ansible tmp ansible tmp dev null sleep localhost success after before null changed true invocation module args accept hostkey false bare 
false clone true depth null dest tmp puppet openstack integration executable null force false key file null recursive true reference null refspec refs changes remote origin repo ssh opts null track submodules false update true verify commit false version fetch head module name git warnings openstack ┬─ ─ ╰─ cd puppet openstack integration openstack ┬─ ─ ╰─ git log pretty format h an ad s david moreau simard tue jul add designate test coverage to jenkins mon aug merge in process token caching is deprecated use memcached instead this doesn t openstack ┬─ ─ ╰─ rm rf puppet openstack integration openstack ┬─ ─ ╰─ ansible vvv i hosts localhost m git a repo dest tmp puppet openstack integration no config file found using defaults establish local connection for user dmsimard exec bin sh c umask mkdir p echo home ansible tmp ansible tmp echo ansible tmp echo home ansible tmp ansible tmp sleep put tmp to home dmsimard ansible tmp ansible tmp git exec bin sh c lang en ca utf lc all en ca utf lc messages en ca utf usr bin python home dmsimard ansible tmp ansible tmp git rm rf home dmsimard ansible tmp ansible tmp dev null sleep localhost success after before null changed true invocation module args accept hostkey false bare false clone true depth null dest tmp puppet openstack integration executable null force false key file null recursive true reference null refspec null remote origin repo ssh opts null track submodules false update true verify commit false version head module name git warnings openstack ┬─ ─ ╰─ ansible vvv i hosts localhost m git a repo dest tmp puppet openstack integration refspec refs changes version fetch head no config file found using defaults establish local connection for user dmsimard exec bin sh c umask mkdir p echo home ansible tmp ansible tmp echo ansible tmp echo home ansible tmp ansible tmp sleep put tmp to home dmsimard ansible tmp ansible tmp git exec bin sh c lang en ca utf lc all en ca utf lc messages en ca utf usr bin python home dmsimard 
ansible tmp ansible tmp git rm rf home dmsimard ansible tmp ansible tmp dev null sleep localhost success after before changed true invocation module args accept hostkey false bare false clone true depth null dest tmp puppet openstack integration executable null force false key file null recursive true reference null refspec refs changes remote origin repo ssh opts null track submodules false update true verify commit false version fetch head module name git warnings openstack ┬─ ─ ╰─ cd puppet openstack integration openstack ┬─ ─ ╰─ git log pretty format h an ad s jenkins mon nov merge puppetfile bump corosync to into stable kilo jenkins mon nov merge puppetfile added corosync module into stable kilo | 1 |
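The failing update path in the report above is, done by hand, just `git fetch <repo> <refspec>` followed by `git checkout FETCH_HEAD`. A workaround while the module behaves this way is to run that sequence explicitly with the `command` module against the existing clone — a sketch only, with the repo path and refspec taken from the report and the task names illustrative:

```yaml
- name: fetch the Gerrit patchset into the existing clone
  command: git fetch origin refs/changes/60/337860/26
  args:
    chdir: /tmp/puppet-openstack-integration

- name: check out the fetched ref
  command: git checkout FETCH_HEAD
  args:
    chdir: /tmp/puppet-openstack-integration
```

This mirrors the bash sequence shown under STEPS TO REPRODUCE, so it behaves the same on a fresh clone and on an already-cloned repository.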
4,215 | 20,828,814,114 | IssuesEvent | 2022-03-19 04:23:57 | microsoft/DirectXTex | https://api.github.com/repos/microsoft/DirectXTex | opened | ARM64 version of texassemble, texconv, and texdiag | maintainence | The VC++ projects only currently build x86/x64 versions of these tools for desktop.
The CMakeLists.txt supports creating the ARM64 version of these tools, so this should be added to the VC++ projects. | True | ARM64 version of texassemble, texconv, and texdiag - The VC++ projects only currently build x86/x64 versions of these tools for desktop.
The CMakeLists.txt supports creating the ARM64 version of these tools, so this should be added to the VC++ projects. | main | version of texassemble texconv and texdiag the vc projects only currently build versions of these tools for desktop the cmakelists txt supports creating the version of these tools so this should be added to the vc projects | 1 |
33,552 | 6,219,028,368 | IssuesEvent | 2017-07-09 09:18:58 | gnocchixyz/gnocchi | https://api.github.com/repos/gnocchixyz/gnocchi | closed | when running gnocchi-config-generator there are missing depends | documentation | Hi,
I have installed gnocchi 4.0.0 on ubuntu 17.04 (x64) via `pip install gnocchi`. It is running on python 2.7.13 in a virtualenv (15.1.0).
When executing `gnocchi-config-generator` the following modules are missing:
futurist,
tooz,
oslo_db
lz4
I have managed to install them manually and got gnocchi-config-generator executing, but I would argue that, for a smooth start, the dependencies should be included in the requirements.txt or the setuptools config or wherever...
Cheers
Carsten | 1.0 | when running gnocchi-config-generator there are missing depends - Hi,
I have installed gnocchi 4.0.0 on ubuntu 17.04 (x64) via `pip install gnocchi`. It is running on python 2.7.13 in a virtualenv (15.1.0).
When executing `gnocchi-config-generator` the following modules are missing:
futurist,
tooz,
oslo_db
lz4
I have managed to install them manually and got gnocchi-config-generator executing, but I would argue that, for a smooth start, the dependencies should be included in the requirements.txt or the setuptools config or wherever...
Cheers
Carsten | non_main | when running gnocchi config generator there are missing depends hi i have installed gnocchi on ubuntu via pip install gnocchi it is running on python in a virtualenv when executing gnocchi config generator the following modules are missing futurist tooz oslo db i have managed to install them manually and got gnocchi config generator executing but i would argue for a smooth start the dependencies should be included into the requirements txt or the setuptools config or whereever cheers carsten | 0 |
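The fix the reporter argues for is to declare these as installable dependencies rather than leaving them to manual `pip install`. A sketch of what that could look like in a setuptools config — the package names are the usual PyPI names and are an assumption here (notably, the module imported as `oslo_db` is distributed on PyPI as `oslo.db`):

```
# setup.cfg (fragment, illustrative)
[options]
install_requires =
    futurist
    tooz
    oslo.db
    lz4
```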
224,836 | 7,473,416,526 | IssuesEvent | 2018-04-03 15:18:48 | OpenNebula/one | https://api.github.com/repos/OpenNebula/one | closed | Ubuntu 14.04 gem mysql2 install failure | Category: Packages Priority: High Status: Accepted Type: Bug | # Bug Report
## Version of OpenNebula
<!--Mark the relevant versions affected with [X] -->
- [X] 5.4.10
- [X] Development build
## Component
<!-- Mark the relevant versions affected with [X] -->
- [ ] Authorization (LDAP, x509 certs...)
- [ ] Command Line Interface (CLI)
- [ ] Contextualization
- [ ] Documentation
- [ ] Federation and HA
- [ ] Host, Clusters and Monitorization
- [ ] KVM
- [ ] Networking
- [ ] Orchestration (OpenNebula Flow)
- [X] Packages
- [ ] Scheduler
- [ ] Storage & Images
- [ ] Sunstone
- [ ] Upgrades
- [ ] User, Groups, VDCs and ACL
- [ ] vCenter
## Description
New gem **mysql2** 0.5.0 has been released, and as we don't have an updated Gemfile.lock for **mysql2**, it tries to install the latest version, which is incompatible with Ruby 1.9.
```
# yes | /usr/share/one/install_gems --yes
...
Fetching mysql2 0.5.0
Installing mysql2 0.5.0 with native extensions
Gem::InstallError: mysql2 requires Ruby version >= 2.0.0.
An error occurred while installing mysql2 (0.5.0), and Bundler
cannot continue.
Make sure that `gem install mysql2 -v '0.5.0'` succeeds before bundling.
In Gemfile:
mysql2
```
# Progress Status
- [x] Branch created
- [x] Code committed to development branch
- [x] Testing - QA
- [x] Documentation
- [ ] Release notes - resolved issues, compatibility, known issues
- [ ] Code committed to upstream release/hotfix branches
- [ ] Documentation committed to upstream release/hotfix branches
| 1.0 | Ubuntu 14.04 gem mysql2 install failure - # Bug Report
## Version of OpenNebula
<!--Mark the relevant versions affected with [X] -->
- [X] 5.4.10
- [X] Development build
## Component
<!-- Mark the relevant versions affected with [X] -->
- [ ] Authorization (LDAP, x509 certs...)
- [ ] Command Line Interface (CLI)
- [ ] Contextualization
- [ ] Documentation
- [ ] Federation and HA
- [ ] Host, Clusters and Monitorization
- [ ] KVM
- [ ] Networking
- [ ] Orchestration (OpenNebula Flow)
- [X] Packages
- [ ] Scheduler
- [ ] Storage & Images
- [ ] Sunstone
- [ ] Upgrades
- [ ] User, Groups, VDCs and ACL
- [ ] vCenter
## Description
New gem **mysql2** 0.5.0 has been released, and as we don't have an updated Gemfile.lock for **mysql2**, it tries to install the latest version, which is incompatible with Ruby 1.9.
```
# yes | /usr/share/one/install_gems --yes
...
Fetching mysql2 0.5.0
Installing mysql2 0.5.0 with native extensions
Gem::InstallError: mysql2 requires Ruby version >= 2.0.0.
An error occurred while installing mysql2 (0.5.0), and Bundler
cannot continue.
Make sure that `gem install mysql2 -v '0.5.0'` succeeds before bundling.
In Gemfile:
mysql2
```
# Progress Status
- [x] Branch created
- [x] Code committed to development branch
- [x] Testing - QA
- [x] Documentation
- [ ] Release notes - resolved issues, compatibility, known issues
- [ ] Code committed to upstream release/hotfix branches
- [ ] Documentation committed to upstream release/hotfix branches
| non_main | ubuntu gem install failure bug report version of opennebula development build component authorization ldap certs command line interface cli contextualization documentation federation and ha host clusters and monitorization kvm networking orchestration opennebula flow packages scheduler storage images sunstone upgrades user groups vdcs and acl vcenter description new gem has been released and as we don t have updated gemfile lock for the it tries to install the latest version incompatible with ruby yes usr share one install gems yes fetching installing with native extensions gem installerror requires ruby version an error occurred while installing and bundler cannot continue make sure that gem install v succeeds before bundling in gemfile progress status branch created code committed to development branch testing qa documentation release notes resolved issues compatibility known issues code committed to upstream release hotfix branches documentation committed to upstream release hotfix branches | 0 |
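Until a Gemfile.lock covering **mysql2** ships, the usual way to keep a Ruby 1.9 deployment working is to pin the gem below the release that raised the Ruby floor. A sketch of such a pin — the constraint is an assumption based on the error above, which shows 0.5.0 requiring Ruby >= 2.0.0:

```ruby
# Gemfile (fragment, illustrative)
gem 'mysql2', '< 0.5'  # 0.5.0 requires Ruby >= 2.0.0; stay on 0.4.x for Ruby 1.9
```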
992 | 4,756,814,279 | IssuesEvent | 2016-10-24 14:58:42 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | hostname: support alpine(3.4) | affects_2.2 feature_idea waiting_on_maintainer |
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
hostname
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.2.0 (devel e326da28ff) last updated 2016/09/13 13:46:08 (GMT -300)
lib/ansible/modules/core: (detached HEAD ae6992bf8c) last updated 2016/09/13 13:47:28 (GMT -300)
lib/ansible/modules/extras: (detached HEAD afd0b23836) last updated 2016/09/13 13:47:28 (GMT -300)
config file = /Users/jbergstroem/Work/node/build/ansible/ansible.cfg
configured module search path = ['plugins/library']
```
##### CONFIGURATION
##### OS / ENVIRONMENT
N/A
##### SUMMARY
Ansible currently returns this if you try to use the hostname module on alpine:
`fatal: [test-joyent-alpine34-x64-2]: FAILED! => {"changed": false, "failed": true, "msg": "hostname module cannot be used on platform Linux (Alpine)"}`
Supporting it should be relatively straightforward seeing how it utilizes both `/etc/hostname` and `hostname` if busybox is installed. I can see the rationale for not supporting a lot of stuff on alpine but I think people that use it to a point where even busybox isn't installed wouldn't try and call hostname either.
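A sketch of the mechanism described above (persist to `/etc/hostname`, then apply via the busybox-provided `hostname` binary), written in Python rather than as Ansible's actual module code — the function name and the injectable `run` parameter are made up for illustration:

```python
import subprocess

def set_alpine_hostname(name, etc_hostname="/etc/hostname",
                        run=subprocess.check_call):
    """Hypothetical sketch of Alpine hostname support.

    Persists the name the way Alpine expects (/etc/hostname), then
    applies it immediately with the busybox `hostname` binary.
    `run` is injectable so the logic can be exercised without root.
    """
    with open(etc_hostname, "w") as f:
        f.write(name + "\n")
    run(["hostname", name])  # apply without waiting for a reboot
    return name
```

This is only an illustration of the two steps the issue mentions, not the interface the real `hostname` module uses.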
##### STEPS TO REPRODUCE
Just call hostname
##### EXPECTED RESULTS
Setting the hostname
##### ACTUAL RESULTS
```
fatal: [test-joyent-alpine34-x64-2]: FAILED! => {"changed": false, "failed": true, "msg": "hostname module cannot be used on platform Linux (Alpine)"}
```
| True | main | 1
449,345 | 31,839,646,980 | IssuesEvent | 2023-09-14 15:28:45 | nicschumann/many-rivers | https://api.github.com/repos/nicschumann/many-rivers | closed | Create DEM Processing Video | documentation | Make a short video showing the process of extracting the DEMs and turning them into usable data for the simulator. | 1.0 | non_main | 0
6,984 | 4,713,887,546 | IssuesEvent | 2016-10-14 21:41:05 | postmanlabs/postman-app-support | https://api.github.com/repos/postmanlabs/postman-app-support | closed | When viewing an authorization token, content overflows its panel. | pending-close Usability | <!--
Welcome to the Postman Issue tracker. Any feature requests / bug reports can be posted here.
Any security-related bugs should be reported directly to security@getpostman.com
Version/App Information:
-->
1. Postman Version: 4.4.2
2. App: Mac & Chrome
3. OS details: OS X 10.11.2
4. Is the Interceptor on and enabled in the app: No
5. Did you encounter this recently, or has this bug always been there: Always
6. Expected behaviour: Should be able to resize panel or to scroll.
7. Console logs:
8. Screenshots


I've encountered an issue when requesting an authorization token. If more information is received in the token than the panel is sized for then the overflow is simply hidden from view. See screenshots: I need to be able to copy the entirety ID Token to use it as a header, and double clicking the text doesn't highlight the entire block. In the Chrome App I am able to copy it because the overflow seems to flow over the top of the response panel rather than underneath, but this is still less than ideal.
Great app on the whole, makes testing my API a breeze thank you :)
<!--
Some guidelines:
1. Please file Newman-related issues at https://github.com/postmanlabs/newman/issues
2. If it’s a Cloud-related issue, or you want to include personal information like your username / collection names, mail us at help@getpostman.com
3. If it’s a question (anything along the lines of “How do I … in Postman”), the answer might lie in our documentation - http://getpostman.com/docs.
-->
| True | non_main | 0
2,177 | 7,632,527,283 | IssuesEvent | 2018-05-05 16:15:54 | Microsoft/DirectXTex | https://api.github.com/repos/Microsoft/DirectXTex | closed | Invalid character in DirectXTexConvert.cpp. | maintainence | I used this library in visual studio 2015 for Japanese. The warning of C4819 arrived in DirectXTexConvert.cpp. It looks like used special character as bellow:
```
// Y′ = Y - 16
// Cb′ = Cb - 128
// Cr′ = Cr - 128
// R = 1.1644Y′ + 1.5960Cr′
// G = 1.1644Y′ - 0.3917Cb′ - 0.8128Cr′
// B = 1.1644Y′ + 2.0172Cb′
```
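The quoted comments are the BT.601 limited-range YCbCr→RGB conversion. As an illustrative numeric check (Python, not the DirectXTex implementation), plugging in black (16, 128, 128) and white (235, 128, 128) recovers 0 and 255:

```python
def ycbcr_to_rgb(y, cb, cr):
    """Convert one BT.601 limited-range YCbCr sample to 8-bit RGB.

    Uses the same constants as the comments above:
    Y′ = Y - 16, Cb′ = Cb - 128, Cr′ = Cr - 128.
    Illustrative only; not DirectXTex code.
    """
    yp = y - 16
    cbp = cb - 128
    crp = cr - 128
    r = 1.1644 * yp + 1.5960 * crp
    g = 1.1644 * yp - 0.3917 * cbp - 0.8128 * crp
    b = 1.1644 * yp + 2.0172 * cbp
    clamp = lambda v: max(0, min(255, round(v)))  # clip to 8-bit range
    return clamp(r), clamp(g), clamp(b)
```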
so please don't use special characters in this source. | True | main | 1
13,685 | 8,641,227,650 | IssuesEvent | 2018-11-24 15:31:50 | matomo-org/matomo | https://api.github.com/repos/matomo-org/matomo | opened | Updater should make a backup of all files before replacing to allow easy rollback | c: Usability | This is just a basic idea, I am not sure how useful it is or how easy it is to implement.
------
I just updated Nextcloud and took a look at the upgrade screen (which nicely ticks one step after another after it is done)

I noticed that Nextcloud seems to back up its files before the update. Maybe doing that would help get back to a working state in case Matomo is completely broken after an update (e.g. because of missing classes or corrupted files, which seems to be reported quite often)
Of course we would need to find a way to do the rollback (as the web UI is probably already broken at this point) and it wouldn't help if the database migrations have already happened (but that would only happen in the next step). | True | non_main | 0
3,653 | 14,918,488,725 | IssuesEvent | 2021-01-22 21:47:39 | Twasi/websocket-obs-java | https://api.github.com/repos/Twasi/websocket-obs-java | opened | Add a CONTRIBUTING file | help wanted maintainability | Should answer questions such as:
- branch expectations/conventions
- overall vision/architecture?
- how to communicate / possibly Discord
I'm tempted to add a brief note on the overall architecture without going into too many details | True | Add a CONTRIBUTING file - Should answer questions such as:
- branch expectations/conventions
- overall vision/architecture?
- how to communicate / possibly Discord
I'm tempted to add a brief note on the overall architecture without going into too many details | main | add a contributing file should answer questions such as branch expectations conventions overall vision architecture how to communicate possibly discord i m tempted to add a brief note on the overall architecture without going into too many details | 1 |
10,754 | 13,543,312,404 | IssuesEvent | 2020-09-16 18:45:28 | kubernetes/minikube | https://api.github.com/repos/kubernetes/minikube | closed | add minikube to github virtual environment base images | kind/process lifecycle/rotten priority/backlog | there are many projects and tools installed by default in the github base image (including other local-kubernetes) tools https://github.com/actions/virtual-environments
minikube is missing there. | 1.0 | non_main | 0
2,259 | 7,934,525,672 | IssuesEvent | 2018-07-08 20:16:19 | chocolatey/chocolatey-package-requests | https://api.github.com/repos/chocolatey/chocolatey-package-requests | closed | RFM - Centbrowser | Status: Available For Maintainer(s) | Looking for someone to take over this package https://chocolatey.org/packages/CentBrowser as I no longer want to deal with it anymore.. Just getting too annoying having to deal with their crap tier CDN that they use..
Until someone picks this package up, it's going to sit dormant in my deprecated folder https://github.com/JourneyOver/chocolatey-packages/tree/master/deprecated/centbrowser. | True | main | 1
30,920 | 5,887,467,551 | IssuesEvent | 2017-05-17 07:32:54 | gradle/gradle-script-kotlin | https://api.github.com/repos/gradle/gradle-script-kotlin | opened | kotlin/dokka#158: New lines in Java source Javadoc code blocks are not preserved in dokka HTML output | a:bug in:kt-dokka re:documentation | Given a Java type with some Javadoc, e.g.:
```java
/**
* Something that matters.
*
* <pre>
* with( some ) {
* multi = lines
* sample()
* }
* </pre>
*/
public interface Something {}
```
All lines in the `<pre/>` element are rendered as a single line in dokka HTML output.
As an example:
- [Java source for Copy](https://github.com/gradle/gradle/blob/d60b527ff902fd0710f64972ad25f7ed02c289bc/subprojects/core/src/main/java/org/gradle/api/tasks/Copy.java#L29-L69)
- [Javadoc output for Copy](https://docs.gradle.org/current/javadoc/org/gradle/api/tasks/Copy.html)
- [dokka output for Copy](https://gradle.github.io/gradle-script-kotlin-docs/api/org.gradle.api.tasks/-copy/index.html)
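The underlying bug class — whitespace normalization that should skip `<pre>` blocks but doesn't — can be sketched like this (simplified Python for illustration; this is not dokka's actual renderer):

```python
import re

def collapse_outside_pre(html):
    """Collapse runs of whitespace in HTML text, but keep <pre> blocks
    verbatim. Splitting on a capturing group keeps the <pre> chunks in
    the result so they can be passed through untouched.
    Simplified sketch, not dokka code.
    """
    parts = re.split(r"(<pre>.*?</pre>)", html, flags=re.S)
    out = []
    for part in parts:
        if part.startswith("<pre>"):
            out.append(part)                      # preserve newlines
        else:
            out.append(re.sub(r"\s+", " ", part))  # normal HTML text
    return "".join(out)
```

A renderer that applies the `\s+` collapse to the whole comment, including the `<pre>` content, produces exactly the single-line output reported above.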
See kotlin/dokka#158 | 1.0 | non_main | 0
316,398 | 23,629,908,643 | IssuesEvent | 2022-08-25 08:27:14 | rancher/elemental | https://api.github.com/repos/rancher/elemental | closed | How to install elemental without hardware TPM support | documentation | Document the use case and procedure of setting emulated TPM
It could be very first item of an examples section | 1.0 | How to install elemental without hardware TPM support - Document the use case and procedure of setting emulated TPM
It could be very first item of an examples section | non_main | how to install elemental without hardware tpm support document the use case and procedure of setting emulated tpm it could be very first item of an examples section | 0 |
656,939 | 21,780,253,316 | IssuesEvent | 2022-05-13 18:02:03 | wso2/api-manager | https://api.github.com/repos/wso2/api-manager | opened | [4.1.0] Unable to create free Subscription policy after configuring Monetization | Type/Bug Priority/Normal | ### Description:
<!-- Describe the issue -->
$subject. Getting the following error.
```
TID: [-1234] [api/am/admin] [2022-05-13 08:00:26,636] ERROR {org.wso2.carbon.apimgt.rest.api.util.exception.GlobalThrowableMapper} - An unknown exception has been captured by the global exception mapper. java.lang.NumberFormatException: empty String
at java.base/jdk.internal.math.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1842)
at java.base/jdk.internal.math.FloatingDecimal.parseFloat(FloatingDecimal.java:122)
at java.base/java.lang.Float.parseFloat(Float.java:455)
at org.wso2.apim.monetization.impl.StripeMonetizationImpl.createBillingPlan(StripeMonetizationImpl.java:152)
at org.wso2.carbon.apimgt.impl.APIProviderImpl.createMonetizationPlan_aroundBody310(APIProviderImpl.java:6243)
at org.wso2.carbon.apimgt.impl.APIProviderImpl.createMonetizationPlan(APIProviderImpl.java:6238)
```
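The root cause in the trace is `Float.parseFloat("")` on an empty price field. A defensive parse — shown here as a hypothetical Python helper, where `float("")` raises the analogous `ValueError` — would avoid it:

```python
def parse_price(raw, default=0.0):
    """Parse a price string, tolerating the empty/missing value that
    triggers the NumberFormatException above.

    Hypothetical helper for illustration; not WSO2 code.
    """
    if raw is None or raw.strip() == "":
        return default  # free tier: no price configured
    return float(raw)
```

The same guard in the Stripe monetization plugin (defaulting an empty price to 0 for free subscription policies) would let plan creation proceed.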
### Steps to reproduce:
<!-- List the steps you followed when you encountered the issue -->
1. Configure monetization with Stripe
https://apim.docs.wso2.com/en/latest/design/api-monetization/monetizing-an-api/
2. Create a Free Subcription policy
### Affected product version:
<!-- Members can use Affected/*** labels -->
4.1.0
| 1.0 | non_main | 0
4,007 | 18,685,389,263 | IssuesEvent | 2021-11-01 11:46:26 | WarenGonzaga/gather.js | https://api.github.com/repos/WarenGonzaga/gather.js | opened | official documentation for GatherJS | maintainers-only p3 todo | I might need help on this but I will create a separate issue for someone interested in working on the documentation.
I'll add basic documentation for now after the pre-release of the project. | True | main | 1
17,118 | 23,632,759,932 | IssuesEvent | 2022-08-25 10:44:59 | ricksouth/serilum-mc-mods | https://api.github.com/repos/ricksouth/serilum-mc-mods | closed | Village Spawn Point (Forge Mod Loader) Not Working | A Bug Mod: Village Spawn Point Compatibility issue | As mentioned when I contacted over Curse Forge - Village Spawn Point is one of my absolute fave must-have mods. So this has been an issue for a while for me with a mix of mod combinations - in 1.17 it'd spawn me in the overworld jus' above the nearest abandoned Mineshaft although since 1.18 that is not the case - but it doesn't spawn me at a village like I would like NOR at spawn - just random spawn.
I don't know if perhaps its a mod conflict the list of Mods I have is:
1. Slab Machines
2. No More Wandering Trader
3. Double Doors
4. Awesome Flooring
5. Flat Bedrock
6. Villagers Drop Emeralds on Death
7. Areas
8. Snad
9. Superflat World No Slimes
10. Reap Mod
11. Klee Slabs
12. Villager Names
13. Realistic Bees
14. Gravel Miner
15. Clumps
16. Xaero's World Map
17. Just Another Rotten Flesh to Leather mod
18. Comforts
19. Humbling Bundle
20. Curio of Undying
21. Squat Grow
22. Grass Seeds
23. Set World Spawn Point
24. Keep My Soil Tilled
25. Balm
26. Copper Equipment
27. Move Minecarts
28. Cloth Config v4 API (for Forge)
29. Mineral Chance
30. Supplementaries
31. Mo'Blocks
32. Wool Tweaks
33. Selene
34. Weaker Spiderwebs
35. More Villagers
36. No Hostiles Around Campfire
37. Better Spawner Control
38. Tool Swap
39. Inventory Totem
40. Rapid Leaf Decay
41. Edit Sign
42. Zombie Proof Doors
43. Horizontal Glass Panes
44. Village Spawn Point
45. Roughly Enough Items
46. Roughly Enough Items (JEI Stub)
47. What The Hell Is That (Forge)
48. Move Boats
49. Curios API
50. Xaero's Minimap
51. Collective
52. Cycle Paintings
53. Sit
54. Random Village Names
55. Surface Mushrooms
56. Builders Crafts and Additions
57. Better Conduit Placement
58. Starter Kit
59. Sleep Sooner
60. Architectury
61. AppleSkin
62. Name Tag Tweaks
63. Crying Portals
64. Trample Stopper
65. Falling Tree
66. Craftable Horse Armor and Saddle (CHA&S Forge)
67. Smaller Nether Portals
68. Infinite Trading
69. Cosmetic Armor Reworked
70. Vein Mining
71. All Loot Drops
I'm sorry for not providing links but I was pressed for time as I wrote this - some of these are obviously dependents of others (such as all the mods dependent on Collective being there etc). Hope this helps for tracking down any possible conflicts or letting me know if the problem is just my computer. | True | non_main | 0
262 | 3,015,647,838 | IssuesEvent | 2015-07-29 20:42:36 | canadainc/canadainclib | https://api.github.com/repos/canadainc/canadainclib | opened | Perform as much analytic gathering on the device as possible | Maintainability Performance Security Type-Enhancement | So server has least amount of work to do. | True | main | 1
3,130 | 12,003,306,626 | IssuesEvent | 2020-04-09 09:22:21 | precice/precice | https://api.github.com/repos/precice/precice | closed | Refactor initialization and finalization of tests | maintainability | We currently start all our tests from a single entry point in `src/testing/main.cpp` with some things initialized and finalized there:
https://github.com/precice/precice/blob/a5bc1e7652ed957d817c9fcb46848fd760a474fc/src/testing/main.cpp#L98-L131
These initialization and finalization steps, however, make no sense for every test: e.g. in an integration test, the `EventRegistry` should be initialized with the `GlobalCommunicator` of the participant (so through the preCICE API), not with the `GlobalCommunicator` of the whole test run.
For other tests, certain initialization is also missing: e.g. for integration tests in `PreciceTests/Serial/*`, we don't restrict the MPI communicator as we do for integration tests in `PreciceTests/Parallel/*`.
https://github.com/precice/precice/blob/a5bc1e7652ed957d817c9fcb46848fd760a474fc/src/testing/Fixtures.hpp#L81-L99
We currently work around these issues with several (historically grown) hacks, explicit setup of environments directly in the tests, and by prohibiting certain features for certain tests (e.g. PETSc-based RBF in serial integration tests). Furthermore, our tests are sensitive to their order, which makes them very hard (and thus time-consuming) to maintain.
Let's do a clean separation of the different test cases and their initialization / finalization, and a clean definition of which things we initialize / finalize and which things we treat with `precice::testMode`, either through different entry points or through fixtures alone. We need to distinguish the following cases:
* **(US)** serial unit tests, example: `MappingTests/NearestNeighborMapping`
* **(UP)** parallel unit tests on a single participant, example: `MappingTests/PetRadialBasisFunctionMapping/Parallel`
* **(UMS)** unit tests with two or more serial participants, example: `CplSchemeTests/SerialImplicitCouplingSchemeTests/*`
* **(UMP)** unit tests with multiple participants and at least one parallel participant, example: `PartitionTests/*`
* **(IS)** integration tests with one, two or more serial participants, example: `PreciceTests/Serial/*`
* **(IP)** integration tests with multiple participants and at least one parallel participant, example: `PreciceTests/Parallel/*` | True | main | 1