| column | dtype | range / classes |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 7 to 112 |
| repo_url | stringlengths | 36 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 to 744 |
| labels | stringlengths | 4 to 574 |
| body | stringlengths | 9 to 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 to 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 188k |
| binary_label | int64 | 0 to 1 |

Each record below lists its fields in this column order; note that `text_combine` (title plus body) and `text` (lowercased, stripped of punctuation) are derived representations of the same issue, so the content repeats within a record.
13,091
15,440,018,606
IssuesEvent
2021-03-08 02:08:54
DevExpress/testcafe-hammerhead
https://api.github.com/repos/DevExpress/testcafe-hammerhead
closed
Clarify `console` methods arguments representation
AREA: client STATE: Stale SYSTEM: client side processing TYPE: enhancement
Now we have simply a `String(arg)`/`'object'` representation (src/client/sandbox/console.js):
```js
_toString (obj) {
    try {
        return String(obj);
    }
    catch (e) {
        return 'object';
    }
}
```
**Native behavior:**
```js
const tst = Object.create(null);
tst.key = 'val';
console.log(tst); // shows the {key: "val"} object
```
**Proxied:** `console.log(tst)` represents the above `tst` object as the string `'object'`.

### Proposed `stringify`-based solution (covers objects without `__proto__` and objects with circular references)
```js
import { stringify as stringifyJSON } from '../json';
...
_toString (obj) {
    try {
        return String(obj);
    }
    catch (e) {
        function replacer (key, value) {
            ...
            if (typeof value === 'object') {
                ...
            }
        }

        return stringifyJSON(obj, replacer);
    }
}
```
**Note that the client JSON (src/client/json.js) has NO `replacer` parameter for its `stringify` method.**

### Notes
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify#Issue_with_JSON.stringify()_when_serializing_circular_references:
> ```js
> const circularReference = {};
> circularReference.myself = circularReference;
>
> // Serializing circular references throws "TypeError: cyclic object value"
> JSON.stringify(circularReference);
> ```
1.0
Clarify `console` methods arguments representation - Now we have simply `String(arg)`/`'object'` representation (src/client/sandbox/console.js): ```js _toString (obj) { try { return String(obj); } catch (e) { return 'object'; } } ``` **Native behavior:** ```js const tst = Object.create(null); tst.key = 'val'; console.log(tst) // shows {key: "val"} object ``` **Proxied:** `console.log(tst)` represents above `tst` object as `'object'` string. ### Proposed `stringify`-based solution (objects without __proto__, object with circular references cases ) ```js import { stringify as stringifyJSON } from '../json'; ... _toString (obj) { try { return String(obj); } catch (e) { function replacer(key, value) { ... if (typeof value === 'object') { ... } return stringifyJSON(obj, replacer); } } ``` **Note, that client JSON (src/client/json.js) has NO `replacer` parameter for `stringify` method.** ### Notes https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify#Issue_with_JSON.stringify()_when_serializing_circular_references: > ```js > const circularReference = {}; > circularReference.myself = circularReference; > > // Serializing circular references throws "TypeError: cyclic object value" > JSON.stringify(circularReference); > ```
process
clarify console methods arguments representation now we have simply string arg object representation src client sandbox console js js tostring obj try return string obj catch e return object native behavior js const tst object create null tst key val console log tst shows key val object proxied console log tst represents above tst object as object string proposed stringify based solution objects without proto object with circular references cases js import stringify as stringifyjson from json tostring obj try return string obj catch e function replacer key value if typeof value object return stringifyjson obj replacer note that client json src client json js has no replacer parameter for stringify method notes js const circularreference circularreference myself circularreference serializing circular references throws typeerror cyclic object value json stringify circularreference
1
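The replacer idea proposed in the issue above can be sketched against the standard `JSON.stringify`. This is a hypothetical illustration, not testcafe-hammerhead's code (the issue itself notes the client-side `stringify` in src/client/json.js has no `replacer` parameter), and the `safeToString` name is invented for the example:

```javascript
// Hypothetical sketch of the fallback the issue proposes. String() throws
// for objects created with Object.create(null) (no toString on the chain),
// so we fall back to JSON serialization with a cycle-aware replacer.
function safeToString (obj) {
    try {
        return String(obj);
    }
    catch (e) {
        const seen = new WeakSet();

        // Replace already-visited objects so circular references do not
        // raise "TypeError: cyclic object value".
        const replacer = (key, value) => {
            if (typeof value === 'object' && value !== null) {
                if (seen.has(value))
                    return '[Circular]';

                seen.add(value);
            }
            return value;
        };

        return JSON.stringify(obj, replacer);
    }
}

const tst = Object.create(null);
tst.key = 'val';
console.log(safeToString(tst)); // {"key":"val"} rather than 'object'
```

A `WeakSet` is used rather than an array so visited objects can still be garbage-collected; the same shape would apply if the client-side `stringify` gained a `replacer` parameter.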
12,605
15,008,141,354
IssuesEvent
2021-01-31 08:41:17
panther-labs/panther
https://api.github.com/repos/panther-labs/panther
closed
Parser: NGINX JSON
enhancement logs p2 team:data processing
### Describe the ideal solution
Support for NGINX Access logs in JSON format

### References
http://nginx.org/en/docs/http/ngx_http_log_module.html#log_format

NGINX can also be installed locally with brew for end-to-end validation
1.0
Parser: NGINX JSON - ### Describe the ideal solution Support for NGINX Access logs in JSON format ### References http://nginx.org/en/docs/http/ngx_http_log_module.html#log_format NGINX can also be installed locally with brew for end-to-end validation
process
parser nginx json describe the ideal solution support for nginx access logs in json format references nginx can also be installed locally with brew for end to end validation
1
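As a sketch of what the requested input could look like, NGINX's `log_format` directive (referenced in the issue above) can emit JSON-shaped access logs directly. The field selection here is illustrative, not a format the issue specifies, and `escape=json` requires nginx 1.11.8 or later:

```nginx
# Illustrative JSON access-log format; pick fields to match the parser's schema.
log_format json_access escape=json
    '{'
        '"time":"$time_iso8601",'
        '"remote_addr":"$remote_addr",'
        '"request":"$request",'
        '"status":$status,'
        '"body_bytes_sent":$body_bytes_sent,'
        '"http_user_agent":"$http_user_agent"'
    '}';

access_log /var/log/nginx/access.json json_access;
```

`escape=json` makes nginx escape quotes and control characters in variable values, so each line is a parseable JSON object.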
200,156
22,739,462,609
IssuesEvent
2022-07-07 01:16:08
scrapedia/scrapy-cookies
https://api.github.com/repos/scrapedia/scrapy-cookies
opened
CVE-2022-31116 (High) detected in ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl
security vulnerability
## CVE-2022-31116 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary> <p>Ultra fast JSON encoder and decoder for Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/0d/ca/404a902e7fc2d39796b01f72e90a2b32e7ca25a3708bcf1d602ccf9e3658/ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/0d/ca/404a902e7fc2d39796b01f72e90a2b32e7ca25a3708bcf1d602ccf9e3658/ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /tmp/ws-scm/scrapy-cookies</p> <p>Path to vulnerable library: /scrapy-cookies</p> <p> Dependency Hierarchy: - :x: **ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> UltraJSON is a fast JSON encoder and decoder written in pure C with bindings for Python 3.7+. Affected versions were found to improperly decode certain characters. JSON strings that contain escaped surrogate characters not part of a proper surrogate pair were decoded incorrectly. Besides corrupting strings, this allowed for potential key confusion and value overwriting in dictionaries. All users parsing JSON from untrusted sources are vulnerable. From version 5.4.0, UltraJSON decodes lone surrogates in the same way as the standard library's `json` module does, preserving them in the parsed output. Users are advised to upgrade. There are no known workarounds for this issue. 
<p>Publish Date: 2022-07-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31116>CVE-2022-31116</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31116">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31116</a></p> <p>Release Date: 2022-07-05</p> <p>Fix Resolution: ujson - 5.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-31116 (High) detected in ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl - ## CVE-2022-31116 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary> <p>Ultra fast JSON encoder and decoder for Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/0d/ca/404a902e7fc2d39796b01f72e90a2b32e7ca25a3708bcf1d602ccf9e3658/ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/0d/ca/404a902e7fc2d39796b01f72e90a2b32e7ca25a3708bcf1d602ccf9e3658/ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /tmp/ws-scm/scrapy-cookies</p> <p>Path to vulnerable library: /scrapy-cookies</p> <p> Dependency Hierarchy: - :x: **ujson-2.0.3-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> UltraJSON is a fast JSON encoder and decoder written in pure C with bindings for Python 3.7+. Affected versions were found to improperly decode certain characters. JSON strings that contain escaped surrogate characters not part of a proper surrogate pair were decoded incorrectly. Besides corrupting strings, this allowed for potential key confusion and value overwriting in dictionaries. All users parsing JSON from untrusted sources are vulnerable. From version 5.4.0, UltraJSON decodes lone surrogates in the same way as the standard library's `json` module does, preserving them in the parsed output. Users are advised to upgrade. There are no known workarounds for this issue. 
<p>Publish Date: 2022-07-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31116>CVE-2022-31116</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31116">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31116</a></p> <p>Release Date: 2022-07-05</p> <p>Fix Resolution: ujson - 5.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in ujson whl cve high severity vulnerability vulnerable library ujson whl ultra fast json encoder and decoder for python library home page a href path to dependency file tmp ws scm scrapy cookies path to vulnerable library scrapy cookies dependency hierarchy x ujson whl vulnerable library found in base branch master vulnerability details ultrajson is a fast json encoder and decoder written in pure c with bindings for python affected versions were found to improperly decode certain characters json strings that contain escaped surrogate characters not part of a proper surrogate pair were decoded incorrectly besides corrupting strings this allowed for potential key confusion and value overwriting in dictionaries all users parsing json from untrusted sources are vulnerable from version ultrajson decodes lone surrogates in the same way as the standard library s json module does preserving them in the parsed output users are advised to upgrade there are no known workarounds for this issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ujson step up your open source security game with mend
0
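The lone-surrogate behavior at the heart of CVE-2022-31116 can be observed with any standards-following JSON decoder. A small JavaScript illustration (chosen for consistency with the other examples in this document; the vulnerable library itself is Python's ujson): `JSON.parse` preserves an escaped lone surrogate as-is, which is the same behavior Python's standard-library `json` module has and that ujson adopted in 5.4.0.

```javascript
// '\ud83d' is an escaped lone high surrogate: it is not followed by a low
// surrogate, so it does not form a valid pair. A compliant decoder keeps it
// as a single UTF-16 code unit instead of corrupting the string, which is
// the mis-decoding that enabled key confusion in ujson < 5.4.0.
const decoded = JSON.parse('"\\ud83d"');
console.log(decoded.length, decoded.charCodeAt(0).toString(16)); // 1 d83d
```

By contrast, a properly paired escape such as `"\ud83d\ude00"` decodes to one astral character (two UTF-16 code units).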
120,942
10,142,799,919
IssuesEvent
2019-08-04 05:29:23
motlabs/awesome-ml-demos-with-ios
https://api.github.com/repos/motlabs/awesome-ml-demos-with-ios
opened
Change performance table format
enhancement performance test
Example:

## Inference Time(ms)

| Repo | Model | XS | XS<br>Max | XR | X | 8 | 8+ | 7 | 7+ | 6S+ | 6+ |
| ----- | ----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: |
| PoseEstimation-CoreML | cpm | - | 27 | 27 | 32 | 31 | 31 | 39 | 37 | 44 | 115 |
| PoseEstimation-CoreML | hourglass | - | 6 | 7 | 29 | 31 | 32 | 37 | 42 | 48 | 94 |
1.0
Change performance table format - Example: ## Inference Time(ms) | Repo | Model | XS | XS<br>Max | XR | X | 8 | 8+ | 7 | 7+ | 6S+ | 6+ | | ----- | ----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: | :----: | | PoseEstimation-CoreML | cpm | - | 27 | 27 | 32 | 31 | 31 | 39 | 37 | 44 | 115 | | PoseEstimation-CoreML | hourhglass | - | 6 | 7 | 29 | 31 | 32 | 37 | 42 | 48 | 94 |
non_process
change performance table format example inference time ms repo model xs xs max xr x poseestimation coreml cpm poseestimation coreml hourhglass
0
157,863
12,393,958,899
IssuesEvent
2020-05-20 16:11:27
spack/spack
https://api.github.com/repos/spack/spack
closed
Bug/tests: three relocate.py tests fail with additional gcc libs
bug tests
I ran `spack test` on an LLNL LC machine and three relocate tests fail. I confirmed this is the case with commit `c50b586`. The bug appears to be related to a too-specific check of the RPATHs.

### Error Message
```
___________________________ test_replace_prefix_bin ____________________________

hello_world = <function hello_world.<locals>._factory at 0x2aaab7813d08>

    @pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc')
    def test_replace_prefix_bin(hello_world):
        # Compile an "Hello world!" executable and set RPATHs
        executable = hello_world(rpaths=['/usr/lib', '/usr/lib64'])
        # Relocate the RPATHs
        spack.relocate._replace_prefix_bin(str(executable), '/usr', '/foo')
        # Check that the RPATHs changed
        patchelf = spack.util.executable.which('patchelf')
        output = patchelf('--print-rpath', str(executable), output=str)
>       assert output.strip() == '/foo/lib:/foo/lib64'
E       AssertionError: assert '/foo/tce/pac...ib:/foo/lib64' == '/foo/lib:/foo/lib64'
E         - /foo/tce/packages/gcc/gcc-4.9.3/lib:/foo/tce/packages/gcc/gcc-4.9.3/lib64:/foo/lib:/foo/lib64
E         + /foo/lib:/foo/lib64

lib/spack/spack/test/relocate.py:256: AssertionError
__________________ test_relocate_elf_binaries_absolute_paths ___________________

hello_world = <function hello_world.<locals>._factory at 0x2aaab7813950>
tmpdir = local('/tmp/$USER/pytest-of-$USER/pytest-7/test_relocate_elf_binaries_abs0')

    @pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc')
    def test_relocate_elf_binaries_absolute_paths(hello_world, tmpdir):
        # Create an executable, set some RPATHs, copy it to another location
        orig_binary = hello_world(rpaths=[str(tmpdir.mkdir('lib')), '/usr/lib64'])
        new_root = tmpdir.mkdir('another_dir')
        shutil.copy(str(orig_binary), str(new_root))
        # Relocate the binary
        new_binary = new_root.join('main.x')
        spack.relocate.relocate_elf_binaries(
            binaries=[str(new_binary)],
            orig_root=str(orig_binary.dirpath()),
            new_root=None,  # Not needed when relocating absolute paths
            new_prefixes={str(tmpdir): '/foo'},
            rel=False,
            # Not needed when relocating absolute paths
            orig_prefix=None,
            new_prefix=None
        )
        # Check that the RPATHs changed
        patchelf = spack.util.executable.which('patchelf')
        output = patchelf('--print-rpath', str(new_binary), output=str)
>       assert output.strip() == '/foo/lib:/usr/lib64'
E       AssertionError: assert '/usr/tce/pac...ib:/usr/lib64' == '/foo/lib:/usr/lib64'
E         - /usr/tce/packages/gcc/gcc-4.9.3/lib:/usr/tce/packages/gcc/gcc-4.9.3/lib64:/foo/lib:/usr/lib64
E         + /foo/lib:/usr/lib64

lib/spack/spack/test/relocate.py:283: AssertionError
__________________ test_relocate_elf_binaries_relative_paths ___________________

hello_world = <function hello_world.<locals>._factory at 0x2aaab77b3158>
tmpdir = local('/tmp/$USER/pytest-of-$USER/pytest-7/test_relocate_elf_binaries_rel0')

    @pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc')
    def test_relocate_elf_binaries_relative_paths(hello_world, tmpdir):
        # Create an executable, set some RPATHs, copy it to another location
        orig_binary = hello_world(
            rpaths=['$ORIGIN/lib', '$ORIGIN/lib64', '/opt/local/lib']
        )
        new_root = tmpdir.mkdir('another_dir')
        shutil.copy(str(orig_binary), str(new_root))
        # Relocate the binary
        new_binary = new_root.join('main.x')
        spack.relocate.relocate_elf_binaries(
            binaries=[str(new_binary)],
            orig_root=str(orig_binary.dirpath()),
            new_root=str(new_root),
            new_prefixes={str(tmpdir): '/foo'},
            rel=True,
            orig_prefix=str(orig_binary.dirpath()),
            new_prefix=str(new_root)
        )
        # Check that the RPATHs changed
        patchelf = spack.util.executable.which('patchelf')
        output = patchelf('--print-rpath', str(new_binary), output=str)
>       assert output.strip() == '/foo/lib:/foo/lib64:/opt/local/lib'
E       AssertionError: assert '/usr/tce/pac...opt/local/lib' == '/foo/lib:/foo...opt/local/lib'
E         - /usr/tce/packages/gcc/gcc-4.9.3/lib:/usr/tce/packages/gcc/gcc-4.9.3/lib64:/foo/lib:/foo/lib64:/opt/local/lib
E         + /foo/lib:/foo/lib64:/opt/local/lib

lib/spack/spack/test/relocate.py:310: AssertionError
```

### Information on your system
* **Spack:** 0.14.2-1095-8d73ca3
* **Python:** 3.7.2
* **Platform:** linux-rhel7-zen

### Additional information
- [x] I have run `spack debug report` and reported the version of Spack/Python/Platform
- [x] I have searched the issues of this repo and believe this is not a duplicate
1.0
Bug/tests: three relocate.py tests fail with additional gcc libs - I ran `spack test` on an LLNL LC machine and three relocate tests fail. I confirmed this is the case with commit `c50b586`. The bug appears to be related to a too-specific check of the RPATHs. ### Error Message ``` ___________________________ test_replace_prefix_bin ____________________________ hello_world = <function hello_world.<locals>._factory at 0x2aaab7813d08> @pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc') def test_replace_prefix_bin(hello_world): # Compile an "Hello world!" executable and set RPATHs executable = hello_world(rpaths=['/usr/lib', '/usr/lib64']) # Relocate the RPATHs spack.relocate._replace_prefix_bin(str(executable), '/usr', '/foo') # Check that the RPATHs changed patchelf = spack.util.executable.which('patchelf') output = patchelf('--print-rpath', str(executable), output=str) > assert output.strip() == '/foo/lib:/foo/lib64' E AssertionError: assert '/foo/tce/pac...ib:/foo/lib64' == '/foo/lib:/foo/lib64' E - /foo/tce/packages/gcc/gcc-4.9.3/lib:/foo/tce/packages/gcc/gcc-4.9.3/lib64:/foo/lib:/foo/lib64 E + /foo/lib:/foo/lib64 lib/spack/spack/test/relocate.py:256: AssertionError __________________ test_relocate_elf_binaries_absolute_paths ___________________ hello_world = <function hello_world.<locals>._factory at 0x2aaab7813950> tmpdir = local('/tmp/$USER/pytest-of-$USER/pytest-7/test_relocate_elf_binaries_abs0') @pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc') def test_relocate_elf_binaries_absolute_paths(hello_world, tmpdir): # Create an executable, set some RPATHs, copy it to another location orig_binary = hello_world(rpaths=[str(tmpdir.mkdir('lib')), '/usr/lib64']) new_root = tmpdir.mkdir('another_dir') shutil.copy(str(orig_binary), str(new_root)) # Relocate the binary new_binary = new_root.join('main.x') spack.relocate.relocate_elf_binaries( binaries=[str(new_binary)], orig_root=str(orig_binary.dirpath()), new_root=None, # Not 
needed when relocating absolute paths new_prefixes={ str(tmpdir): '/foo' }, rel=False, # Not needed when relocating absolute paths orig_prefix=None, new_prefix=None ) # Check that the RPATHs changed patchelf = spack.util.executable.which('patchelf') output = patchelf('--print-rpath', str(new_binary), output=str) > assert output.strip() == '/foo/lib:/usr/lib64' E AssertionError: assert '/usr/tce/pac...ib:/usr/lib64' == '/foo/lib:/usr/lib64' E - /usr/tce/packages/gcc/gcc-4.9.3/lib:/usr/tce/packages/gcc/gcc-4.9.3/lib64:/foo/lib:/usr/lib64 E + /foo/lib:/usr/lib64 lib/spack/spack/test/relocate.py:283: AssertionError __________________ test_relocate_elf_binaries_relative_paths ___________________ hello_world = <function hello_world.<locals>._factory at 0x2aaab77b3158> tmpdir = local('/tmp/$USER/pytest-of-$USER/pytest-7/test_relocate_elf_binaries_rel0') @pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc') def test_relocate_elf_binaries_relative_paths(hello_world, tmpdir): # Create an executable, set some RPATHs, copy it to another location orig_binary = hello_world( rpaths=['$ORIGIN/lib', '$ORIGIN/lib64', '/opt/local/lib'] ) new_root = tmpdir.mkdir('another_dir') shutil.copy(str(orig_binary), str(new_root)) # Relocate the binary new_binary = new_root.join('main.x') spack.relocate.relocate_elf_binaries( binaries=[str(new_binary)], orig_root=str(orig_binary.dirpath()), new_root=str(new_root), new_prefixes={str(tmpdir): '/foo'}, rel=True, orig_prefix=str(orig_binary.dirpath()), new_prefix=str(new_root) ) # Check that the RPATHs changed patchelf = spack.util.executable.which('patchelf') output = patchelf('--print-rpath', str(new_binary), output=str) > assert output.strip() == '/foo/lib:/foo/lib64:/opt/local/lib' E AssertionError: assert '/usr/tce/pac...opt/local/lib' == '/foo/lib:/foo...opt/local/lib' E - /usr/tce/packages/gcc/gcc-4.9.3/lib:/usr/tce/packages/gcc/gcc-4.9.3/lib64:/foo/lib:/foo/lib64:/opt/local/lib E + /foo/lib:/foo/lib64:/opt/local/lib 
lib/spack/spack/test/relocate.py:310: AssertionError ``` ### Information on your system * **Spack:** 0.14.2-1095-8d73ca3 * **Python:** 3.7.2 * **Platform:** linux-rhel7-zen ### Additional information - [x] I have run `spack debug report` and reported the version of Spack/Python/Platform - [x] I have searched the issues of this repo and believe this is not a duplicate
non_process
bug tests three relocate py tests fail with additional gcc libs i ran spack test on an llnl lc machine and three relocate tests fail i confirmed this is the case with commit the bug appears to be related to a too specific check of the rpaths error message test replace prefix bin hello world factory at pytest mark requires executables patchelf strings file gcc def test replace prefix bin hello world compile an hello world executable and set rpaths executable hello world rpaths relocate the rpaths spack relocate replace prefix bin str executable usr foo check that the rpaths changed patchelf spack util executable which patchelf output patchelf print rpath str executable output str assert output strip foo lib foo e assertionerror assert foo tce pac ib foo foo lib foo e foo tce packages gcc gcc lib foo tce packages gcc gcc foo lib foo e foo lib foo lib spack spack test relocate py assertionerror test relocate elf binaries absolute paths hello world factory at tmpdir local tmp user pytest of user pytest test relocate elf binaries pytest mark requires executables patchelf strings file gcc def test relocate elf binaries absolute paths hello world tmpdir create an executable set some rpaths copy it to another location orig binary hello world rpaths new root tmpdir mkdir another dir shutil copy str orig binary str new root relocate the binary new binary new root join main x spack relocate relocate elf binaries binaries orig root str orig binary dirpath new root none not needed when relocating absolute paths new prefixes str tmpdir foo rel false not needed when relocating absolute paths orig prefix none new prefix none check that the rpaths changed patchelf spack util executable which patchelf output patchelf print rpath str new binary output str assert output strip foo lib usr e assertionerror assert usr tce pac ib usr foo lib usr e usr tce packages gcc gcc lib usr tce packages gcc gcc foo lib usr e foo lib usr lib spack spack test relocate py assertionerror test relocate 
elf binaries relative paths hello world factory at tmpdir local tmp user pytest of user pytest test relocate elf binaries pytest mark requires executables patchelf strings file gcc def test relocate elf binaries relative paths hello world tmpdir create an executable set some rpaths copy it to another location orig binary hello world rpaths new root tmpdir mkdir another dir shutil copy str orig binary str new root relocate the binary new binary new root join main x spack relocate relocate elf binaries binaries orig root str orig binary dirpath new root str new root new prefixes str tmpdir foo rel true orig prefix str orig binary dirpath new prefix str new root check that the rpaths changed patchelf spack util executable which patchelf output patchelf print rpath str new binary output str assert output strip foo lib foo opt local lib e assertionerror assert usr tce pac opt local lib foo lib foo opt local lib e usr tce packages gcc gcc lib usr tce packages gcc gcc foo lib foo opt local lib e foo lib foo opt local lib lib spack spack test relocate py assertionerror information on your system spack python platform linux zen additional information i have run spack debug report and reported the version of spack python platform i have searched the issues of this repo and believe this is not a duplicate
0
19,504
25,812,565,446
IssuesEvent
2022-12-12 00:22:05
esmero/strawberryfield
https://api.github.com/repos/esmero/strawberryfield
closed
Trigger a parent Node (or parent parent) Index tracker update on last sequence of a SBF + Computed field
enhancement Drupal Views JSON Postprocessors Property Keys Providers Events and Subscriber Typed Data and Search Strawberry Flavor
# What?
We need to aggregate Strawberry Flavors at the ADO level for unified search to happen (performance reasons mean a subquery is a bad idea, given how Drupal Views work with Search API functionality...). The idea is basic: an ADO Solr Document is ready way before all of its Flavor Children are ready, so we need a performant way (meaning not on every Child generation) to tell Solr that the parent needs to be reindexed so it can gather all the little children at once. This also means we need a Computed (global?) field that does the gathering. Two options:
- Use the same idea as the one that already exists and renders a View Mode, but have this one do "a Solr query" to gather all Text-only Values from all Flavors.
- Add logic to the Key Name provider we already have for this and do the logic there. This means exposing the Text as a Field property.
1.0
Trigger a parent Node (or parent parent) Index tracker update on last sequence of a SBF + Computed field - # What? We need to aggregate Strawberry Flavors at the ADO level for unified search to happen (performance reasons means a subquery is a bad idea given how Drupal Views work related to Search API functionality...) The idea is basic. An ADO Solr Document is ready way before all their Flavor Children are ready. So we need a performant way (means not on every Child generation) to tell Solr that the parent needs to be reindexed so it can gather all the little children at once. This also means we need a Computed (global field?) That does the gathering: Two options: - Use the same idea of the one that already exist and renders a View Mode, but this one will do "a Solr query" to gather all Text only Values from all Flavors. - Add logic to the Key Name provider we already have for this and do the logic there. Means expose the Text as a Field property.
process
trigger a parent node or parent parent index tracker update on last sequence of a sbf computed field what we need to aggregate strawberry flavors at the ado level for unified search to happen performance reasons means a subquery is a bad idea given how drupal views work related to search api functionality the idea is basic an ado solr document is ready way before all their flavor children are ready so we need a performant way means not on every child generation to tell solr that the parent needs to be reindexed so it can gather all the little children at once this also means we need a computed global field that does the gathering two options use the same idea of the one that already exist and renders a view mode but this one will do a solr query to gather all text only values from all flavors add logic to the key name provider we already have for this and do the logic there means expose the text as a field property
1
8,603
11,761,334,129
IssuesEvent
2020-03-13 21:37:43
kubernetes/minikube
https://api.github.com/repos/kubernetes/minikube
closed
release_update_brew: access_token: unbound variable
help wanted kind/process packaging/brew priority/important-soon
minikube migrated from cask to brew, but the automation has not been updated to use `bump-formula-pr.rb` to trigger updates: https://github.com/Linuxbrew/brew/blob/master/docs/How-To-Open-a-Homebrew-Pull-Request.md Here is where the automation should go: https://github.com/kubernetes/minikube/blob/9be404689b6860f189147bb53050b6f09a15681a/hack/jenkins/release_update_brew.sh#L43
1.0
release_update_brew: access_token: unbound variable - minikube migrated from cask to brew, but the automation has not been updated to use `bump-formula-pr.rb` to trigger updates: https://github.com/Linuxbrew/brew/blob/master/docs/How-To-Open-a-Homebrew-Pull-Request.md Here is where the automation should go: https://github.com/kubernetes/minikube/blob/9be404689b6860f189147bb53050b6f09a15681a/hack/jenkins/release_update_brew.sh#L43
process
release update brew access token unbound variable minikube migrated from cask to brew but the automation has not been updated to use bump formula pr rb to trigger updates here is where the automation should go
1
81,490
15,630,052,062
IssuesEvent
2021-03-22 01:12:33
benchabot/react-native-maps
https://api.github.com/repos/benchabot/react-native-maps
opened
CVE-2020-7754 (High) detected in npm-user-validate-1.0.0.tgz, npm-user-validate-0.1.5.tgz
security vulnerability
## CVE-2020-7754 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>npm-user-validate-1.0.0.tgz</b>, <b>npm-user-validate-0.1.5.tgz</b></p></summary> <p> <details><summary><b>npm-user-validate-1.0.0.tgz</b></p></summary> <p>User validations for npm</p> <p>Library home page: <a href="https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-1.0.0.tgz">https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-1.0.0.tgz</a></p> <p>Path to dependency file: react-native-maps/package.json</p> <p>Path to vulnerable library: react-native-maps/node_modules/npm/node_modules/npm-user-validate/package.json</p> <p> Dependency Hierarchy: - gitbook-cli-2.3.2.tgz (Root Library) - npm-5.1.0.tgz - :x: **npm-user-validate-1.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>npm-user-validate-0.1.5.tgz</b></p></summary> <p>User validations for npm</p> <p>Library home page: <a href="https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.5.tgz">https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.5.tgz</a></p> <p>Path to dependency file: react-native-maps/package.json</p> <p>Path to vulnerable library: react-native-maps/node_modules/npmi/node_modules/npm/node_modules/npm-user-validate/package.json</p> <p> Dependency Hierarchy: - gitbook-cli-2.3.2.tgz (Root Library) - npmi-1.0.1.tgz - npm-2.15.12.tgz - :x: **npm-user-validate-0.1.5.tgz** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package npm-user-validate before 1.0.1. The regex that validates user emails took exponentially longer to process long input strings beginning with @ characters. 
<p>Publish Date: 2020-10-27 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7754>CVE-2020-7754</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7754">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7754</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: 1.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7754 (High) detected in npm-user-validate-1.0.0.tgz, npm-user-validate-0.1.5.tgz - ## CVE-2020-7754 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>npm-user-validate-1.0.0.tgz</b>, <b>npm-user-validate-0.1.5.tgz</b></p></summary> <p> <details><summary><b>npm-user-validate-1.0.0.tgz</b></p></summary> <p>User validations for npm</p> <p>Library home page: <a href="https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-1.0.0.tgz">https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-1.0.0.tgz</a></p> <p>Path to dependency file: react-native-maps/package.json</p> <p>Path to vulnerable library: react-native-maps/node_modules/npm/node_modules/npm-user-validate/package.json</p> <p> Dependency Hierarchy: - gitbook-cli-2.3.2.tgz (Root Library) - npm-5.1.0.tgz - :x: **npm-user-validate-1.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>npm-user-validate-0.1.5.tgz</b></p></summary> <p>User validations for npm</p> <p>Library home page: <a href="https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.5.tgz">https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.5.tgz</a></p> <p>Path to dependency file: react-native-maps/package.json</p> <p>Path to vulnerable library: react-native-maps/node_modules/npmi/node_modules/npm/node_modules/npm-user-validate/package.json</p> <p> Dependency Hierarchy: - gitbook-cli-2.3.2.tgz (Root Library) - npmi-1.0.1.tgz - npm-2.15.12.tgz - :x: **npm-user-validate-0.1.5.tgz** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package npm-user-validate before 1.0.1. 
The regex that validates user emails took exponentially longer to process long input strings beginning with @ characters. <p>Publish Date: 2020-10-27 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7754>CVE-2020-7754</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7754">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7754</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: 1.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in npm user validate tgz npm user validate tgz cve high severity vulnerability vulnerable libraries npm user validate tgz npm user validate tgz npm user validate tgz user validations for npm library home page a href path to dependency file react native maps package json path to vulnerable library react native maps node modules npm node modules npm user validate package json dependency hierarchy gitbook cli tgz root library npm tgz x npm user validate tgz vulnerable library npm user validate tgz user validations for npm library home page a href path to dependency file react native maps package json path to vulnerable library react native maps node modules npmi node modules npm node modules npm user validate package json dependency hierarchy gitbook cli tgz root library npmi tgz npm tgz x npm user validate tgz vulnerable library vulnerability details this affects the package npm user validate before the regex that validates user emails took exponentially longer to process long input strings beginning with characters publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
19,264
25,455,863,274
IssuesEvent
2022-11-24 14:08:44
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[IDP] [PM] Admin is able to sign in , even when same user is disabled in the organizational directory
Bug P0 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
**Steps:** 1. Login to PM 2. Click on 'Admins' tab 3. Click on 'Add admin' button 4. Add organizational user in the email dropdown 5. After receiving the link to set up your participant manager account 6. Deactivate the same user in identity platform 7. Now, Click on the link to set up your participant manager account 8. Complete the account creation process and Sign in **AR:** Admin is able to sign in to participant manager in above scenario and also admin account can be deactivated from the application level **ER:** Admin should not be able to sign in to participant manager in above scenario . Admin should not be able to deactivate organizational admin from the application level and also error message should get displayed when try to deactivate the account
3.0
[IDP] [PM] Admin is able to sign in , even when same user is disabled in the organizational directory - **Steps:** 1. Login to PM 2. Click on 'Admins' tab 3. Click on 'Add admin' button 4. Add organizational user in the email dropdown 5. After receiving the link to set up your participant manager account 6. Deactivate the same user in identity platform 7. Now, Click on the link to set up your participant manager account 8. Complete the account creation process and Sign in **AR:** Admin is able to sign in to participant manager in above scenario and also admin account can be deactivated from the application level **ER:** Admin should not be able to sign in to participant manager in above scenario . Admin should not be able to deactivate organizational admin from the application level and also error message should get displayed when try to deactivate the account
process
admin is able to sign in even when same user is disabled in the organizational directory steps login to pm click on admins tab click on add admin button add organizational user in the email dropdown after receiving the link to set up your participant manager account deactivate the same user in identity platform now click on the link to set up your participant manager account complete the account creation process and sign in ar admin is able to sign in to participant manager in above scenario and also admin account can be deactivated from the application level er admin should not be able to sign in to participant manager in above scenario admin should not be able to deactivate organizational admin from the application level and also error message should get displayed when try to deactivate the account
1
15,125
18,869,551,051
IssuesEvent
2021-11-13 00:41:45
2i2c-org/team-compass
https://api.github.com/repos/2i2c-org/team-compass
opened
Create a 2i2c sustainability team
type: enhancement :label: strategy :label: team-process :label: business development
### Description Right now there are several issues related to sustainability that are largely being spearheaded by Chris. Because he is also working on many other parts of 2i2c (see https://github.com/2i2c-org/meta/issues/256 among other conversations), there are likely sub-optimal decisions being made and balls being dropped. Moreover, Chris prefers working with teams of people rather than alone, and believes this leads to better decisions and outcomes in general. ## Suggested solution I suggest we form a "sustainability team" within 2i2c. This is a team of people dedicated with overseeing 2i2c's strategy and execution of efforts to sustain and grow the organization. It includes topics like the following: - Creating business strategy around 2i2c services - Interfacing with the engineering team to guide the creation of this strategy - Overseeing the execution of this strategy - Developing new leads and collaboration opportunities for 2i2c - Overseeing the sales process and touchpoints with CS&S (either doing the work directly, or delegating as necessary) - Communicating sustainability-related efforts with the Steering Council or external partners. ## Who is on this team? Anybody on the @2i2c-org/steering-council or @2i2c-org/tech-team is welcome to join this team. Being on this team means committing to attending most of the Sustainability Team meetings, and potentially working on items that are related to sustainability. ## Who leads this team? Initially, I propose that this team is co-led by @colliand and @choldgraf. ## How does this team operate? At its onset, this team meets once a week for a half hour, in order to triage and discuss major sustainability-related efforts and coordinate work. Over time this may change (particularly after the roll-out of the alpha service). ## Long term vision for this team If 2i2c grows its services and team enough, this team would have a dedicated lead paid via 2i2c's funds as a "Director of sustainability" or a similar title. 
It might also morph into a slightly different focus (and potentially name) depending on our experiences in this run-through. ### Value / benefit The goal of creating a team dedicated to this is so share knowledge and responsibility among several team members, as well as providing an explicit opportunity for more voices to be part of the "sustainability" conversation with 2i2c. This kind of work is important enough to 2i2c that it warrants dedicated thinking, and having a dedicated team to will help ensure that this happens. It is also potentially a nice pre-cursor to a persistent group that can work on these kinds of "businessy" things. Another benefit is that this may be an opportunity to show leadership and innovation in the non-profit space. There aren't that many organizations that are governed and organized as a non-profit, but that also do things like "sales" and "customer service". This might be an opportunity for 2i2c to show leadership in how a values- and mission-driven organization can nonetheless achieve sustainability through a service model. ### Implementation details _No response_ ### Tasks to complete - [ ] No objections to forming this team - [ ] 2i2c members who are interested in joining this team ping in this issue - [ ] Have our first team meeting and figure out what our work and communication cadence looks like - [ ] Document this team in our team compass and website ### Updates _No response_
1.0
Create a 2i2c sustainability team - ### Description Right now there are several issues related to sustainability that are largely being spearheaded by Chris. Because he is also working on many other parts of 2i2c (see https://github.com/2i2c-org/meta/issues/256 among other conversations), there are likely sub-optimal decisions being made and balls being dropped. Moreover, Chris prefers working with teams of people rather than alone, and believes this leads to better decisions and outcomes in general. ## Suggested solution I suggest we form a "sustainability team" within 2i2c. This is a team of people dedicated with overseeing 2i2c's strategy and execution of efforts to sustain and grow the organization. It includes topics like the following: - Creating business strategy around 2i2c services - Interfacing with the engineering team to guide the creation of this strategy - Overseeing the execution of this strategy - Developing new leads and collaboration opportunities for 2i2c - Overseeing the sales process and touchpoints with CS&S (either doing the work directly, or delegating as necessary) - Communicating sustainability-related efforts with the Steering Council or external partners. ## Who is on this team? Anybody on the @2i2c-org/steering-council or @2i2c-org/tech-team is welcome to join this team. Being on this team means committing to attending most of the Sustainability Team meetings, and potentially working on items that are related to sustainability. ## Who leads this team? Initially, I propose that this team is co-led by @colliand and @choldgraf. ## How does this team operate? At its onset, this team meets once a week for a half hour, in order to triage and discuss major sustainability-related efforts and coordinate work. Over time this may change (particularly after the roll-out of the alpha service). 
## Long term vision for this team If 2i2c grows its services and team enough, this team would have a dedicated lead paid via 2i2c's funds as a "Director of sustainability" or a similar title. It might also morph into a slightly different focus (and potentially name) depending on our experiences in this run-through. ### Value / benefit The goal of creating a team dedicated to this is so share knowledge and responsibility among several team members, as well as providing an explicit opportunity for more voices to be part of the "sustainability" conversation with 2i2c. This kind of work is important enough to 2i2c that it warrants dedicated thinking, and having a dedicated team to will help ensure that this happens. It is also potentially a nice pre-cursor to a persistent group that can work on these kinds of "businessy" things. Another benefit is that this may be an opportunity to show leadership and innovation in the non-profit space. There aren't that many organizations that are governed and organized as a non-profit, but that also do things like "sales" and "customer service". This might be an opportunity for 2i2c to show leadership in how a values- and mission-driven organization can nonetheless achieve sustainability through a service model. ### Implementation details _No response_ ### Tasks to complete - [ ] No objections to forming this team - [ ] 2i2c members who are interested in joining this team ping in this issue - [ ] Have our first team meeting and figure out what our work and communication cadence looks like - [ ] Document this team in our team compass and website ### Updates _No response_
process
create a sustainability team description right now there are several issues related to sustainability that are largely being spearheaded by chris because he is also working on many other parts of see among other conversations there are likely sub optimal decisions being made and balls being dropped moreover chris prefers working with teams of people rather than alone and believes this leads to better decisions and outcomes in general suggested solution i suggest we form a sustainability team within this is a team of people dedicated with overseeing s strategy and execution of efforts to sustain and grow the organization it includes topics like the following creating business strategy around services interfacing with the engineering team to guide the creation of this strategy overseeing the execution of this strategy developing new leads and collaboration opportunities for overseeing the sales process and touchpoints with cs s either doing the work directly or delegating as necessary communicating sustainability related efforts with the steering council or external partners who is on this team anybody on the org steering council or org tech team is welcome to join this team being on this team means committing to attending most of the sustainability team meetings and potentially working on items that are related to sustainability who leads this team initially i propose that this team is co led by colliand and choldgraf how does this team operate at its onset this team meets once a week for a half hour in order to triage and discuss major sustainability related efforts and coordinate work over time this may change particularly after the roll out of the alpha service long term vision for this team if grows its services and team enough this team would have a dedicated lead paid via s funds as a director of sustainability or a similar title it might also morph into a slightly different focus and potentially name depending on our experiences in this run through value 
benefit the goal of creating a team dedicated to this is so share knowledge and responsibility among several team members as well as providing an explicit opportunity for more voices to be part of the sustainability conversation with this kind of work is important enough to that it warrants dedicated thinking and having a dedicated team to will help ensure that this happens it is also potentially a nice pre cursor to a persistent group that can work on these kinds of businessy things another benefit is that this may be an opportunity to show leadership and innovation in the non profit space there aren t that many organizations that are governed and organized as a non profit but that also do things like sales and customer service this might be an opportunity for to show leadership in how a values and mission driven organization can nonetheless achieve sustainability through a service model implementation details no response tasks to complete no objections to forming this team members who are interested in joining this team ping in this issue have our first team meeting and figure out what our work and communication cadence looks like document this team in our team compass and website updates no response
1
10,000
13,042,376,094
IssuesEvent
2020-07-28 22:21:05
hashicorp/packer
https://api.github.com/repos/hashicorp/packer
closed
Vsphere postprocessor hangs on ssl fingerprint verification
bug post-processor/vsphere
Packer 0.12.0 on linux. Ovftools 4.10, vcenter 5.10 The postprocessor vsphere hangs forever without any message (even with PACKER_LOG=1) ```javascript "post-processors": [{ "type": "vsphere", "disk_mode": "thin", "host": "{{user `vcenter_host`}}", "datastore": "{{user `vcenter_datastore`}}", "username": "{{user `vcenter_username`}}", "password": "{{user `vcenter_password`}}", "vm_name": "{{ user `image_name` }}_template", "vm_network": "{{user `vcenter_network`}}", "cluster": "{{user `vcenter_cluster`}}", "datacenter": "{{user `vcenter_datacenter`}}", "overwrite": "true" }, ``` I've tried to do the same operation directly with ovftools /usr/lib/vmware-ovftool/ovftool --acceptAllEulas --name=mytemplate --datastore=myds --diskMode=thin --network=VM\ Network mytemplate.ova vi://me:password@myvcenter/mydatacenter/host/mycluster Opening OVA source: mytemplate.ova The manifest validates Accept SSL fingerprint (AE:83:46:34:7F:DD:40:53:CB:69:B2:F4:15:2F:2C:0B:00:77:49:BD) for myvcenter as target type. Fingerprint will be added to the known host file Write 'yes' or 'no' I've accepted with yes, it's worked. I've restarted packer build and the postprocessor doesn't hangs any more.
1.0
Vsphere postprocessor hangs on ssl fingerprint verification - Packer 0.12.0 on linux. Ovftools 4.10, vcenter 5.10 The postprocessor vsphere hangs forever without any message (even with PACKER_LOG=1) ```javascript "post-processors": [{ "type": "vsphere", "disk_mode": "thin", "host": "{{user `vcenter_host`}}", "datastore": "{{user `vcenter_datastore`}}", "username": "{{user `vcenter_username`}}", "password": "{{user `vcenter_password`}}", "vm_name": "{{ user `image_name` }}_template", "vm_network": "{{user `vcenter_network`}}", "cluster": "{{user `vcenter_cluster`}}", "datacenter": "{{user `vcenter_datacenter`}}", "overwrite": "true" }, ``` I've tried to do the same operation directly with ovftools /usr/lib/vmware-ovftool/ovftool --acceptAllEulas --name=mytemplate --datastore=myds --diskMode=thin --network=VM\ Network mytemplate.ova vi://me:password@myvcenter/mydatacenter/host/mycluster Opening OVA source: mytemplate.ova The manifest validates Accept SSL fingerprint (AE:83:46:34:7F:DD:40:53:CB:69:B2:F4:15:2F:2C:0B:00:77:49:BD) for myvcenter as target type. Fingerprint will be added to the known host file Write 'yes' or 'no' I've accepted with yes, it's worked. I've restarted packer build and the postprocessor doesn't hangs any more.
process
vsphere postprocessor hangs on ssl fingerprint verification packer on linux ovftools vcenter the postprocessor vsphere hangs forever without any message even with packer log javascript post processors type vsphere disk mode thin host user vcenter host datastore user vcenter datastore username user vcenter username password user vcenter password vm name user image name template vm network user vcenter network cluster user vcenter cluster datacenter user vcenter datacenter overwrite true i ve tried to do the same operation directly with ovftools usr lib vmware ovftool ovftool acceptalleulas name mytemplate datastore myds diskmode thin network vm network mytemplate ova vi me password myvcenter mydatacenter host mycluster opening ova source mytemplate ova the manifest validates accept ssl fingerprint ae dd cb bd for myvcenter as target type fingerprint will be added to the known host file write yes or no i ve accepted with yes it s worked i ve restarted packer build and the postprocessor doesn t hangs any more
1
15,933
20,158,933,300
IssuesEvent
2022-02-09 19:16:37
2i2c-org/team-compass
https://api.github.com/repos/2i2c-org/team-compass
opened
Use GitHub as a CDN for our images / GIFs in documentation
:label: team-process
### Description of problem and opportunity to address it **Context to understand the problem** It is common for us to include images and GIFs as a part of our documentation. Currently, we just check those into git like any other file. However, binary files like these will beef up our git history by quite a lot, especially over time, which can make the repository more unwieldy to download and work with. **Proposed solution** Instead of directly adding PNG files to our repository, we use a dedicated "Media Upload Issue" for each repository. We upload images to this repo as comments via GitHub's UI, and then link them from within our docs, rather than committing the binary files to our git history. This would mean that our git history is simply tracking links to images, rather than the images themselves. ### Implementation guide and constraints **If we wanted to update an image** Here's what the process would look like: - Make changes to image on your computer - Go to the "media upload" issue - Find the comment associated with this image - Re-upload your new image, and delete the link to the old image - Copy the github URL that's generated, and paste that into our docs If we do this, we should probably choose a standardized name, something like: Title: MEDIA UPLOAD ISSUE` Top comment: This is an issue for uploading images, GIFs, and other assets to GitHub that we then link directly from our documentation. Each comment should correspond to a media asset, with a brief explanation or link about where it goes in our documentation. Upload the image manually into the comment by dragging and dropping it into the text box. ### Updates and ongoing work _No response_
1.0
Use GitHub as a CDN for our images / GIFs in documentation - ### Description of problem and opportunity to address it **Context to understand the problem** It is common for us to include images and GIFs as a part of our documentation. Currently, we just check those into git like any other file. However, binary files like these will beef up our git history by quite a lot, especially over time, which can make the repository more unwieldy to download and work with. **Proposed solution** Instead of directly adding PNG files to our repository, we use a dedicated "Media Upload Issue" for each repository. We upload images to this repo as comments via GitHub's UI, and then link them from within our docs, rather than committing the binary files to our git history. This would mean that our git history is simply tracking links to images, rather than the images themselves. ### Implementation guide and constraints **If we wanted to update an image** Here's what the process would look like: - Make changes to image on your computer - Go to the "media upload" issue - Find the comment associated with this image - Re-upload your new image, and delete the link to the old image - Copy the github URL that's generated, and paste that into our docs If we do this, we should probably choose a standardized name, something like: Title: MEDIA UPLOAD ISSUE` Top comment: This is an issue for uploading images, GIFs, and other assets to GitHub that we then link directly from our documentation. Each comment should correspond to a media asset, with a brief explanation or link about where it goes in our documentation. Upload the image manually into the comment by dragging and dropping it into the text box. ### Updates and ongoing work _No response_
process
use github as a cdn for our images gifs in documentation description of problem and opportunity to address it context to understand the problem it is common for us to include images and gifs as a part of our documentation currently we just check those into git like any other file however binary files like these will beef up our git history by quite a lot especially over time which can make the repository more unwieldy to download and work with proposed solution instead of directly adding png files to our repository we use a dedicated media upload issue for each repository we upload images to this repo as comments via github s ui and then link them from within our docs rather than committing the binary files to our git history this would mean that our git history is simply tracking links to images rather than the images themselves implementation guide and constraints if we wanted to update an image here s what the process would look like make changes to image on your computer go to the media upload issue find the comment associated with this image re upload your new image and delete the link to the old image copy the github url that s generated and paste that into our docs if we do this we should probably choose a standardized name something like title media upload issue top comment this is an issue for uploading images gifs and other assets to github that we then link directly from our documentation each comment should correspond to a media asset with a brief explanation or link about where it goes in our documentation upload the image manually into the comment by dragging and dropping it into the text box updates and ongoing work no response
1
17,653
23,472,139,697
IssuesEvent
2022-08-16 23:33:03
brucemiller/LaTeXML
https://api.github.com/repos/brucemiller/LaTeXML
closed
download button in listings data broken in EPUB3
bug packages postprocessing
When clicking on the download button of a `\begin{listings}` environment, some EPUB readers (Apple Books!) navigate to the file but do not offer a way back. In the case of Apple Books, one needs to trigger a reflow by resizing the window to resume reading. This is arguably a bug in Apple Books. This happens even if I add the `download` attribute using #1451. Still, I checked the EPUB3 spec, and if I understand correctly, using data URLs in `<a>` tags breaks the standard: - resources of non-core media types are foreign resources [1], which can be attached, but must be treated according to certain rules, and have a reasonable fallback; note that even `text/plain` is *not* a core media-type; - I also guess that all navigation links must correspond to entries in the manifest, which is probably the reason for Apple Books getting lost. Given that the button is broken, I suggest to remove it (PR to follow soon). The note in [1] suggests an alternative implementation, but it seems like the users are supposed to unzip the EPUB package by themselves. I should add that `epubcheck` does not complain if a listing, and thus its download button, is present, so I may be getting this very wrong. [1] https://www.w3.org/publishing/epub32/epub-spec.html#sec-foreign-restrictions PS: minimal example ```LaTeX \documentclass{article} \usepackage{listings} \begin{document} Click the download button to navigate away... and never come back. \begin{lstlisting} # code \end{lstlisting} \end{document} ```
1.0
download button in listings data broken in EPUB3 - When clicking on the download button of a `\begin{listings}` environment, some EPUB readers (Apple Books!) navigate to the file but do not offer a way back. In the case of Apple Books, one needs to trigger a reflow by resizing the window to resume reading. This is arguably a bug in Apple Books. This happens even if I add the `download` attribute using #1451. Still, I checked the EPUB3 spec, and if I understand correctly, using data URLs in `<a>` tags breaks the standard: - resources of non-core media types are foreign resources [1], which can be attached, but must be treated according to certain rules, and have a reasonable fallback; note that even `text/plain` is *not* a core media-type; - I also guess that all navigation links must correspond to entries in the manifest, which is probably the reason for Apple Books getting lost. Given that the button is broken, I suggest to remove it (PR to follow soon). The note in [1] suggests an alternative implementation, but it seems like the users are supposed to unzip the EPUB package by themselves. I should add that `epubcheck` does not complain if a listing, and thus its download button, is present, so I may be getting this very wrong. [1] https://www.w3.org/publishing/epub32/epub-spec.html#sec-foreign-restrictions PS: minimal example ```LaTeX \documentclass{article} \usepackage{listings} \begin{document} Click the download button to navigate away... and never come back. \begin{lstlisting} # code \end{lstlisting} \end{document} ```
process
download button in listings data broken in when clicking on the download button of a begin listings environment some epub readers apple books navigate to the file but do not offer a way back in the case of apple books one needs to trigger a reflow by resizing the window to resume reading this is arguably a bug in apple books this happens even if i add the download attribute using still i checked the spec and if i understand correctly using data urls in tags breaks the standard resources of non core media types are foreign resources which can be attached but must be treated according to certain rules and have a reasonable fallback note that even text plain is not a core media type i also guess that all navigation links must correspond to entries in the manifest which is probably the reason for apple books getting lost given that the button is broken i suggest to remove it pr to follow soon the note in suggests an alternative implementation but it seems like the users are supposed to unzip the epub package by themselves i should add that epubcheck does not complain if a listing and thus its download button is present so i may be getting this very wrong ps minimal example latex documentclass article usepackage listings begin document click the download button to navigate away and never come back begin lstlisting code end lstlisting end document
1
11,556
7,293,676,986
IssuesEvent
2018-02-25 16:34:22
eclipse/dirigible
https://api.github.com/repos/eclipse/dirigible
closed
Schema Modeler
component-ide component-workspace enhancement usability web-ide
An editor for database schema model. It has to provide a generic definition of tables with relations as well as the supported column attributes by the current data structure model.
True
Schema Modeler - An editor for database schema model. It has to provide a generic definition of tables with relations as well as the supported column attributes by the current data structure model.
non_process
schema modeler an editor for database schema model it has to provide a generic definition of tables with relations as well as the supported column attributes by the current data structure model
0
8,149
11,354,729,456
IssuesEvent
2020-01-24 18:19:17
googleapis/java-cloudbuild
https://api.github.com/repos/googleapis/java-cloudbuild
closed
Promote to GA
type: process
Package name: **google-cloud-build** Current release: **beta** Proposed release: **GA** ## Instructions Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue. ## Required - [ ] 28 days elapsed since last beta release with new API surface - [ ] Server API is GA - [ ] Package API is stable, and we can commit to backward compatibility - [ ] All dependencies are GA ## Optional - [ ] Most common / important scenarios have descriptive samples - [ ] Public manual methods have at least one usage sample each (excluding overloads) - [ ] Per-API README includes a full description of the API - [ ] Per-API README contains at least one “getting started” sample using the most common API scenario - [ ] Manual code has been reviewed by API producer - [ ] Manual code has been reviewed by a DPE responsible for samples - [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
1.0
Promote to GA - Package name: **google-cloud-build** Current release: **beta** Proposed release: **GA** ## Instructions Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue. ## Required - [ ] 28 days elapsed since last beta release with new API surface - [ ] Server API is GA - [ ] Package API is stable, and we can commit to backward compatibility - [ ] All dependencies are GA ## Optional - [ ] Most common / important scenarios have descriptive samples - [ ] Public manual methods have at least one usage sample each (excluding overloads) - [ ] Per-API README includes a full description of the API - [ ] Per-API README contains at least one “getting started” sample using the most common API scenario - [ ] Manual code has been reviewed by API producer - [ ] Manual code has been reviewed by a DPE responsible for samples - [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
process
promote to ga package name google cloud build current release beta proposed release ga instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility all dependencies are ga optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one getting started sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
1
799
3,097,399,655
IssuesEvent
2015-08-28 01:29:32
cucyberdefense/defense-hackpack
https://api.github.com/repos/cucyberdefense/defense-hackpack
opened
Service management in SysV init
services
Examples of how to manage services on Linux using `/etc/init.d/*`
1.0
Service management in SysV init - Examples of how to manage services on Linux using `/etc/init.d/*`
non_process
service management in sysv init examples of how to manage services on linux using etc init d
0
20,304
26,944,137,913
IssuesEvent
2023-02-08 06:18:17
bobocode-blyznytsia/bring-framework
https://api.github.com/repos/bobocode-blyznytsia/bring-framework
closed
Implement RawBeanProcessor
bean-post-processor
### Description The `RawBeanProcessor` is responsible for the construction and initialization logic of `Bean`. ### Solution In context of this story the `RawBeanProcessor` should be implemented. `RawBeanProcessor` creates beans and puts them into the `Map<String, Object> rawBeanMap`. Injecting of dependencies, which bean has, **is not needed**. We suppose that fields of the bean are not final. The dependency fields will be `null`. The injection of dependencies will be done by `AutowiredBeanPostProcessor`. Please, check the details in the diagram _(provided in Notes section)_ ### DoD - [ x] `RawBeanProcessor` creates basic instance of bean - [ x] Created beans are present in the `rawBeanMap` - [ x] Dependencies are not injected - [ x] Unit tests are green - [ x] JavaDoc is provided ### Resources - Please see the drawing with related interface: [bring-framework.drawio](https://viewer.diagrams.net/?tags=%7B%7D&highlight=0000ff&edit=_blank&layers=1&nav=1#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy) - _If you want to correct the diagram, use the link:_ https://app.diagrams.net/#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy
1.0
Implement RawBeanProcessor - ### Description The `RawBeanProcessor` is responsible for the construction and initialization logic of `Bean`. ### Solution In context of this story the `RawBeanProcessor` should be implemented. `RawBeanProcessor` creates beans and puts them into the `Map<String, Object> rawBeanMap`. Injecting of dependencies, which bean has, **is not needed**. We suppose that fields of the bean are not final. The dependency fields will be `null`. The injection of dependencies will be done by `AutowiredBeanPostProcessor`. Please, check the details in the diagram _(provided in Notes section)_ ### DoD - [ x] `RawBeanProcessor` creates basic instance of bean - [ x] Created beans are present in the `rawBeanMap` - [ x] Dependencies are not injected - [ x] Unit tests are green - [ x] JavaDoc is provided ### Resources - Please see the drawing with related interface: [bring-framework.drawio](https://viewer.diagrams.net/?tags=%7B%7D&highlight=0000ff&edit=_blank&layers=1&nav=1#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy) - _If you want to correct the diagram, use the link:_ https://app.diagrams.net/#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy
process
implement rawbeanprocessor description the rawbeanprocessor is responsible for the construction and initialization logic of bean solution in context of this story the rawbeanprocessor should be implemented rawbeanprocessor creates beans and puts them into the map rawbeanmap injecting of dependencies which bean has is not needed we suppose that fields of the bean are not final the dependency fields will be null the injection of dependencies will be done by autowiredbeanpostprocessor please check the details in the diagram provided in notes section dod rawbeanprocessor creates basic instance of bean created beans are present in the rawbeanmap dependencies are not injected unit tests are green javadoc is provided resources please see the drawing with related interface if you want to correct the diagram use the link
1
10,337
13,165,465,500
IssuesEvent
2020-08-11 06:39:30
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
NTR: effector-mediated supression of host host salicylic acid-mediated signal transduction pathway
multi-species process
NTR: effector-mediated suppression of host salicylic acid-mediated host innate immune signalling. A process mediated by a molecule secreted by a symbiont that results in the suppression of host salicylic acid-mediated host innate immune signalling. descendant of GO:0052003 suppression by symbiont of defense-related host salicylic acid-mediated signal transduction pathway and GO:0140404 | effector-mediated modulation of host innate immune response by symbiont I suspect this will always be "effector-mediatedf' so we might be able to make GO:0052003 more specific
1.0
NTR: effector-mediated supression of host host salicylic acid-mediated signal transduction pathway - NTR: effector-mediated suppression of host salicylic acid-mediated host innate immune signalling. A process mediated by a molecule secreted by a symbiont that results in the suppression of host salicylic acid-mediated host innate immune signalling. descendant of GO:0052003 suppression by symbiont of defense-related host salicylic acid-mediated signal transduction pathway and GO:0140404 | effector-mediated modulation of host innate immune response by symbiont I suspect this will always be "effector-mediatedf' so we might be able to make GO:0052003 more specific
process
ntr effector mediated supression of host host salicylic acid mediated signal transduction pathway ntr effector mediated suppression of host salicylic acid mediated host innate immune signalling a process mediated by a molecule secreted by a symbiont that results in the suppression of host salicylic acid mediated host innate immune signalling descendant of go suppression by symbiont of defense related host salicylic acid mediated signal transduction pathway and go effector mediated modulation of host innate immune response by symbiont i suspect this will always be effector mediatedf so we might be able to make go more specific
1
9,543
12,510,250,221
IssuesEvent
2020-06-02 18:18:46
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
workspace: clean: all doesn't work for deployment jobs
Pri1 devops-cicd-process/tech devops/prod doc-bug
The docs regarding cleaning a workspace don't work for deployment jobs. Either the docs are wrong or this is a bug. See [these other people](https://developercommunity.visualstudio.com/content/problem/614016/there-is-no-way-how-to-clean-workspace-in-deployme.html) also having problems. Please update appropriately. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 67504b34-d64b-02a4-2e10-ab99f3b8cfe4 * Version Independent ID: 2cf63b2e-184b-7726-3b8a-d8baffd6fcce * Content: [Jobs in Azure Pipelines and TFS - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml#agent-phase) * Content Source: [docs/pipelines/process/phases.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/phases.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
workspace: clean: all doesn't work for deployment jobs - The docs regarding cleaning a workspace don't work for deployment jobs. Either the docs are wrong or this is a bug. See [these other people](https://developercommunity.visualstudio.com/content/problem/614016/there-is-no-way-how-to-clean-workspace-in-deployme.html) also having problems. Please update appropriately. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 67504b34-d64b-02a4-2e10-ab99f3b8cfe4 * Version Independent ID: 2cf63b2e-184b-7726-3b8a-d8baffd6fcce * Content: [Jobs in Azure Pipelines and TFS - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml#agent-phase) * Content Source: [docs/pipelines/process/phases.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/phases.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
workspace clean all doesn t work for deployment jobs the docs regarding cleaning a workspace don t work for deployment jobs either the docs are wrong or this is a bug see also having problems please update appropriately document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
7,913
11,092,955,772
IssuesEvent
2019-12-15 22:31:27
shirou/gopsutil
https://api.github.com/repos/shirou/gopsutil
closed
[process][darwin] Process names are truncated to 16 characters
os:darwin os:freebsd os:openbsd package:process
**Describe the bug** Process names are truncated to 16 characters **To Reproduce** ```go p, _ := process.NewProcess(pidOfProcessWithLongName) pName, _ := p.Name() // pName is truncated ``` **Expected behavior** Process names should be returned in full. **Environment (please complete the following information):** - [x] Mac OS: ``` > sw_vers ProductName: Mac OS X ProductVersion: 10.14.3 BuildVersion: 18D109 > uname -a Darwin [redacted] 18.2.0 Darwin Kernel Version 18.2.0: Thu Dec 20 20:46:53 PST 2018; root:xnu-4903.241.1~1/RELEASE_X86_64 x86_64 ``` **Additional context** Same or similar issue was reported for Linux in issue #300 and fixed in PR #303
1.0
[process][darwin] Process names are truncated to 16 characters - **Describe the bug** Process names are truncated to 16 characters **To Reproduce** ```go p, _ := process.NewProcess(pidOfProcessWithLongName) pName, _ := p.Name() // pName is truncated ``` **Expected behavior** Process names should be returned in full. **Environment (please complete the following information):** - [x] Mac OS: ``` > sw_vers ProductName: Mac OS X ProductVersion: 10.14.3 BuildVersion: 18D109 > uname -a Darwin [redacted] 18.2.0 Darwin Kernel Version 18.2.0: Thu Dec 20 20:46:53 PST 2018; root:xnu-4903.241.1~1/RELEASE_X86_64 x86_64 ``` **Additional context** Same or similar issue was reported for Linux in issue #300 and fixed in PR #303
process
process names are truncated to characters describe the bug process names are truncated to characters to reproduce go p process newprocess pidofprocesswithlongname pname p name pname is truncated expected behavior process names should be returned in full environment please complete the following information mac os sw vers productname mac os x productversion buildversion uname a darwin darwin kernel version thu dec pst root xnu release additional context same or similar issue was reported for linux in issue and fixed in pr
1
4,705
7,544,127,913
IssuesEvent
2018-04-17 17:28:54
UnbFeelings/unb-feelings-docs
https://api.github.com/repos/UnbFeelings/unb-feelings-docs
closed
Métrica de endpoints do GQM
Processo invalid question
No GQM de vocês, existe uma métrica documentada "Percentual de endpoints documentados) [url](https://github.com/UnbFeelings/unb-feelings-docs/wiki/Processo-de-Garantia-da-Qualidade#122-garantir-a-manuteabilidade) porém na parte de plano de análise [aqui](https://github.com/UnbFeelings/unb-feelings-docs/wiki/Processo-de-Garantia-da-Qualidade#12215-percentual-de-endpoints-testados) está como percentual de endpoints testados, não consegui entender muito bem essa métrica
1.0
Métrica de endpoints do GQM - No GQM de vocês, existe uma métrica documentada "Percentual de endpoints documentados) [url](https://github.com/UnbFeelings/unb-feelings-docs/wiki/Processo-de-Garantia-da-Qualidade#122-garantir-a-manuteabilidade) porém na parte de plano de análise [aqui](https://github.com/UnbFeelings/unb-feelings-docs/wiki/Processo-de-Garantia-da-Qualidade#12215-percentual-de-endpoints-testados) está como percentual de endpoints testados, não consegui entender muito bem essa métrica
process
métrica de endpoints do gqm no gqm de vocês existe uma métrica documentada percentual de endpoints documentados porém na parte de plano de análise está como percentual de endpoints testados não consegui entender muito bem essa métrica
1
6,051
8,872,284,764
IssuesEvent
2019-01-11 15:03:06
kiwicom/orbit-components
https://api.github.com/repos/kiwicom/orbit-components
closed
Support React Component in Select label prop
Enhancement Processing
I have a `Select` component and want to add a label. The label should use component `Text` from nitro for translations. But `label` prop accepts only string. Extend `label` prop to accept also `React.Node`. Is this valid solution? If yes, I would create a PR.
1.0
Support React Component in Select label prop - I have a `Select` component and want to add a label. The label should use component `Text` from nitro for translations. But `label` prop accepts only string. Extend `label` prop to accept also `React.Node`. Is this valid solution? If yes, I would create a PR.
process
support react component in select label prop i have a select component and want to add a label the label should use component text from nitro for translations but label prop accepts only string extend label prop to accept also react node is this valid solution if yes i would create a pr
1
7,983
11,170,752,554
IssuesEvent
2019-12-28 15:11:21
bisq-network/bisq
https://api.github.com/repos/bisq-network/bisq
closed
Trade with serious bug
an:investigation in:trade-process was:dropped
<img width="1148" alt="Captura de pantalla 2019-08-25 a las 15 25 47" src="https://user-images.githubusercontent.com/52173515/63654440-97e7d000-c77a-11e9-88d7-8c152caae453.png"> Trade "aikOA" with a very weird bug. the 3 transactions aren't related: maker tx - b4b25063df8e060348d7f859e8f6dc1ea9d068c6a58072c18ac4fef7397b8fb6 taker tx - 0750216d97ac246f9d397169cc920eb1b42a7c2c9c163c33f416dac977c7e8f9 deposit tx - 7fd51618161ffc9bf97b0ede7289d3c2168bd43f9394d23bd909c5b9741f651d the btc seller received the payment and released the btc to the buyer but the payout was never made to the correct btc address. but rather to this one: 1Frhz7rdX7onjfccV8AdeKEjagxuWQd4p7 there’s no ‘depositTxId’ for the trade ‘aikOA’ when looking in the ‘trades_statistics.json’ file that the Bisq software dumps. In fact the payout to 1Frhz7rdX7onjfccV8AdeKEjagxuWQd4p7 belongs to a different trade (DLPFDYIP). in which everything matches up and had no problem.
1.0
Trade with serious bug - <img width="1148" alt="Captura de pantalla 2019-08-25 a las 15 25 47" src="https://user-images.githubusercontent.com/52173515/63654440-97e7d000-c77a-11e9-88d7-8c152caae453.png"> Trade "aikOA" with a very weird bug. the 3 transactions aren't related: maker tx - b4b25063df8e060348d7f859e8f6dc1ea9d068c6a58072c18ac4fef7397b8fb6 taker tx - 0750216d97ac246f9d397169cc920eb1b42a7c2c9c163c33f416dac977c7e8f9 deposit tx - 7fd51618161ffc9bf97b0ede7289d3c2168bd43f9394d23bd909c5b9741f651d the btc seller received the payment and released the btc to the buyer but the payout was never made to the correct btc address. but rather to this one: 1Frhz7rdX7onjfccV8AdeKEjagxuWQd4p7 there’s no ‘depositTxId’ for the trade ‘aikOA’ when looking in the ‘trades_statistics.json’ file that the Bisq software dumps. In fact the payout to 1Frhz7rdX7onjfccV8AdeKEjagxuWQd4p7 belongs to a different trade (DLPFDYIP). in which everything matches up and had no problem.
process
trade with serious bug img width alt captura de pantalla a las src trade aikoa with a very weird bug the transactions aren t related maker tx taker tx deposit tx the btc seller received the payment and released the btc to the buyer but the payout was never made to the correct btc address but rather to this one there s no deposittxid for the trade aikoa when looking in the trades statistics json file that the bisq software dumps in fact the payout to belongs to a different trade dlpfdyip in which everything matches up and had no problem
1
408,154
11,942,272,778
IssuesEvent
2020-04-02 19:59:19
guilds-plugin/Guilds
https://api.github.com/repos/guilds-plugin/Guilds
closed
[Feature Request] Guild Guards
Priority: Low Type: Feature
Possibility of having guards defending their guild from enemies and being upgradeable, this feature would be useful to people because so while you are offline there is someone to defend. Sorry if my english is bad, but, i'm italian.
1.0
[Feature Request] Guild Guards - Possibility of having guards defending their guild from enemies and being upgradeable, this feature would be useful to people because so while you are offline there is someone to defend. Sorry if my english is bad, but, i'm italian.
non_process
guild guards possibility of having guards defending their guild from enemies and being upgradeable this feature would be useful to people because so while you are offline there is someone to defend sorry if my english is bad but i m italian
0
30,640
11,842,011,206
IssuesEvent
2020-03-23 22:00:51
Mohib-hub/karate
https://api.github.com/repos/Mohib-hub/karate
opened
CVE-2019-20445 (High) detected in netty-codec-http-4.1.32.Final.jar
security vulnerability
## CVE-2019-20445 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.32.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/netty-codec-http/">http://netty.io/netty-codec-http/</a></p> <p>Path to dependency file: /tmp/ws-scm/karate/examples/gatling/build.gradle</p> <p>Path to vulnerable library: /tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212749/netty-codec-http-4.1.32.Final.jar,/tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212749/netty-codec-http-4.1.32.Final.jar</p> <p> Dependency Hierarchy: - karate-gatling-0.9.5.jar (Root Library) - gatling-charts-highcharts-3.0.2.jar - gatling-recorder-3.0.2.jar - :x: **netty-codec-http-4.1.32.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Mohib-hub/karate/commit/c8766c8277306046ef9c6f01148b98b0d2bafe02">c8766c8277306046ef9c6f01148b98b0d2bafe02</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header. 
<p>Publish Date: 2020-01-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445>CVE-2019-20445</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445</a></p> <p>Release Date: 2020-01-29</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.44</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.32.Final","isTransitiveDependency":true,"dependencyTree":"com.intuit.karate:karate-gatling:0.9.5;io.gatling.highcharts:gatling-charts-highcharts:3.0.2;io.gatling:gatling-recorder:3.0.2;io.netty:netty-codec-http:4.1.32.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.44"}],"vulnerabilityIdentifier":"CVE-2019-20445","vulnerabilityDetails":"HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding 
header.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2019-20445 (High) detected in netty-codec-http-4.1.32.Final.jar - ## CVE-2019-20445 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.32.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/netty-codec-http/">http://netty.io/netty-codec-http/</a></p> <p>Path to dependency file: /tmp/ws-scm/karate/examples/gatling/build.gradle</p> <p>Path to vulnerable library: /tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212749/netty-codec-http-4.1.32.Final.jar,/tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212749/netty-codec-http-4.1.32.Final.jar</p> <p> Dependency Hierarchy: - karate-gatling-0.9.5.jar (Root Library) - gatling-charts-highcharts-3.0.2.jar - gatling-recorder-3.0.2.jar - :x: **netty-codec-http-4.1.32.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Mohib-hub/karate/commit/c8766c8277306046ef9c6f01148b98b0d2bafe02">c8766c8277306046ef9c6f01148b98b0d2bafe02</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header. 
<p>Publish Date: 2020-01-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445>CVE-2019-20445</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445</a></p> <p>Release Date: 2020-01-29</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.44</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.32.Final","isTransitiveDependency":true,"dependencyTree":"com.intuit.karate:karate-gatling:0.9.5;io.gatling.highcharts:gatling-charts-highcharts:3.0.2;io.gatling:gatling-recorder:3.0.2;io.netty:netty-codec-http:4.1.32.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.44"}],"vulnerabilityIdentifier":"CVE-2019-20445","vulnerabilityDetails":"HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding 
header.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in netty codec http final jar cve high severity vulnerability vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file tmp ws scm karate examples gatling build gradle path to vulnerable library tmp ws ua downloadresource netty codec http final jar tmp ws ua downloadresource netty codec http final jar dependency hierarchy karate gatling jar root library gatling charts highcharts jar gatling recorder jar x netty codec http final jar vulnerable library found in head commit a href vulnerability details httpobjectdecoder java in netty before allows a content length header to be accompanied by a second content length header or by a transfer encoding header publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec http isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails httpobjectdecoder java in netty before allows a content length header to be accompanied by a second content length header or by a transfer encoding header vulnerabilityurl
0
568,921
16,990,595,407
IssuesEvent
2021-06-30 19:53:35
ita-social-projects/TeachUA
https://api.github.com/repos/ita-social-projects/TeachUA
closed
[Гуртки] First page with the 'Гуртки' list is displayed all the time after pressing 'Back' navigation button
Priority: High bug
Environment: Windows 7, Service Pack 1, Google Chrome, 90.0.4430.212 (Розробка) (64-розрядна версія). Reproducible: always Build found: the last build Steps to reproduce 1. Go to 'https://speak-ukrainian.org.ua/dev' 2. Click on 'Гутрки' tab 3. Scroll down to pagination items below the club's list 4. Choose any of the page numbers (except the 1) 5. Press 'Детальніше' button on the club's card - the page with club's information is opened 6. Press on the browser's 'Back' navigation button Actual result The first page of the club's list is displayed ![2021-05-21_12h22_01](https://user-images.githubusercontent.com/83178616/119114862-35933080-ba2f-11eb-8889-4ea3adcbf52c.png) Expected result The previously chosen page should be displayed. User story and test case links [#76 ](https://github.com/ita-social-projects/TeachUA/issues/76) [#274 ](https://github.com/ita-social-projects/TeachUA/issues/274)
1.0
[Гуртки] First page with the 'Гуртки' list is displayed all the time after pressing 'Back' navigation button - Environment: Windows 7, Service Pack 1, Google Chrome, 90.0.4430.212 (Розробка) (64-розрядна версія). Reproducible: always Build found: the last build Steps to reproduce 1. Go to 'https://speak-ukrainian.org.ua/dev' 2. Click on 'Гутрки' tab 3. Scroll down to pagination items below the club's list 4. Choose any of the page numbers (except the 1) 5. Press 'Детальніше' button on the club's card - the page with club's information is opened 6. Press on the browser's 'Back' navigation button Actual result The first page of the club's list is displayed ![2021-05-21_12h22_01](https://user-images.githubusercontent.com/83178616/119114862-35933080-ba2f-11eb-8889-4ea3adcbf52c.png) Expected result The previously chosen page should be displayed. User story and test case links [#76 ](https://github.com/ita-social-projects/TeachUA/issues/76) [#274 ](https://github.com/ita-social-projects/TeachUA/issues/274)
non_process
first page with the гуртки list is displayed all the time after pressing back navigation button environment windows service pack google chrome розробка розрядна версія reproducible always build found the last build steps to reproduce go to click on гутрки tab scroll down to pagination items below the club s list choose any of the page numbers except the press детальніше button on the club s card the page with club s information is opened press on the browser s back navigation button actual result the first page of the club s list is displayed expected result the previously chosen page should be displayed user story and test case links
0
12,723
15,093,988,206
IssuesEvent
2021-02-07 03:42:10
rdoddanavar/hpr-sim
https://api.github.com/repos/rdoddanavar/hpr-sim
closed
src/preproc: preproc_model module --> prop model
pre-processing
Proposed `preproc_model.py` that defines parsing routines specific to model input files; ex. aero model, prop model, etc. Engine model file format: http://www.thrustcurve.org/raspformat.shtml Module should create data suitable for initializing prop model
1.0
src/preproc: preproc_model module --> prop model - Proposed `preproc_model.py` that defines parsing routines specific to model input files; ex. aero model, prop model, etc. Engine model file format: http://www.thrustcurve.org/raspformat.shtml Module should create data suitable for initializing prop model
process
src preproc preproc model module prop model proposed preproc model py that defines parsing routines specific to model input files ex aero model prop model etc engine model file format module should create data suitable for initializing prop model
1
10,212
13,069,247,880
IssuesEvent
2020-07-31 06:03:18
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Processing Modeler: "Load layer into project", ignoring loaded layer name
Bug Modeller Processing Regression
**Describe the bug** When using the processing modeler algorithm "Load layer into project", there's an option called Loaded layer name. In 3.4 the loaded layer name applies to the loaded file. In 3.10 the loaded layer name is ignored, instead using the actual filename. **How to Reproduce** Creating a new model or using this: [VectorLayerName.zip](https://github.com/qgis/QGIS/files/5000416/VectorLayerName.zip) 1) Add Input: Vector layer 2) Add Algorithm: Load layer into project 2.1) Apply some loaded layer name 3) Run model from an empty project, using the dots to find a shapefile via the explorer **QGIS and OS versions** QGIS version 3.10.7-A Coruña QGIS code revision 7b4ca4c8d0 Compiled against Qt 5.11.2 Running against Qt 5.11.2 Compiled against GDAL/OGR 3.0.4 Running against GDAL/OGR 3.0.4 Compiled against GEOS 3.8.1-CAPI-1.13.3 Running against GEOS 3.8.1-CAPI-1.13.3 Compiled against SQLite 3.29.0 Running against SQLite 3.29.0 PostgreSQL Client Version 11.5 SpatiaLite Version 4.3.0 QWT Version 6.1.3 QScintilla2 Version 2.10.8 Compiled against PROJ 6.3.2 Running against PROJ Rel. 6.3.2, May 1st, 2020 OS Version Windows 10 (10.0) Active python plugins clipper; Datafordeler; DigitizingTools; geosearch_dk; GroupStats; Kortforsyningen; menu_from_project; pointsamplingtool; profiletool; qchainage; Qgis2threejs; QlrBrowser; QuickOSM; valuetool; db_manager; MetaSearch; processing **Additional context** I use this in combination with download file, where the output-file is called "OUTPUT". When downloading several files in one model i can't differ the outputs.
1.0
Processing Modeler: "Load layer into project", ignoring loaded layer name - **Describe the bug** When using the processing modeler algorithm "Load layer into project", there's an option called Loaded layer name. In 3.4 the loaded layer name applies to the loaded file. In 3.10 the loaded layer name is ignored, instead using the actual filename. **How to Reproduce** Creating a new model or using this: [VectorLayerName.zip](https://github.com/qgis/QGIS/files/5000416/VectorLayerName.zip) 1) Add Input: Vector layer 2) Add Algorithm: Load layer into project 2.1) Apply some loaded layer name 3) Run model from an empty project, using the dots to find a shapefile via the explorer **QGIS and OS versions** QGIS version 3.10.7-A Coruña QGIS code revision 7b4ca4c8d0 Compiled against Qt 5.11.2 Running against Qt 5.11.2 Compiled against GDAL/OGR 3.0.4 Running against GDAL/OGR 3.0.4 Compiled against GEOS 3.8.1-CAPI-1.13.3 Running against GEOS 3.8.1-CAPI-1.13.3 Compiled against SQLite 3.29.0 Running against SQLite 3.29.0 PostgreSQL Client Version 11.5 SpatiaLite Version 4.3.0 QWT Version 6.1.3 QScintilla2 Version 2.10.8 Compiled against PROJ 6.3.2 Running against PROJ Rel. 6.3.2, May 1st, 2020 OS Version Windows 10 (10.0) Active python plugins clipper; Datafordeler; DigitizingTools; geosearch_dk; GroupStats; Kortforsyningen; menu_from_project; pointsamplingtool; profiletool; qchainage; Qgis2threejs; QlrBrowser; QuickOSM; valuetool; db_manager; MetaSearch; processing **Additional context** I use this in combination with download file, where the output-file is called "OUTPUT". When downloading several files in one model i can't differ the outputs.
process
processing modeler load layer into project ignoring loaded layer name describe the bug when using the processing modeler algorithm load layer into project there s an option called loaded layer name in the loaded layer name applies to the loaded file in the loaded layer name is ignored instead using the actual filename how to reproduce creating a new model or using this add input vector layer add algorithm load layer into project apply some loaded layer name run model from an empty project using the dots to find a shapefile via the explorer qgis and os versions qgis version a coruña qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel may os version windows active python plugins clipper datafordeler digitizingtools geosearch dk groupstats kortforsyningen menu from project pointsamplingtool profiletool qchainage qlrbrowser quickosm valuetool db manager metasearch processing additional context i use this in combination with download file where the output file is called output when downloading several files in one model i can t differ the outputs
1
351,707
10,522,447,431
IssuesEvent
2019-09-30 08:46:15
wso2/docs-ei
https://api.github.com/repos/wso2/docs-ei
closed
Register Ballerina language for highlighting
Priority/Highest Type/Docs Type/UX ballerina
Need to highlight code and syntax for Ballerina. This code can be found here: The code for this can be found here: https://github.com/ballerina-platform/ballerina-www/blob/master/website/main-pages/ballerina.io-website-theme/js/ballerina-common.js#L79-L156
1.0
Register Ballerina language for highlighting - Need to highlight code and syntax for Ballerina. This code can be found here: The code for this can be found here: https://github.com/ballerina-platform/ballerina-www/blob/master/website/main-pages/ballerina.io-website-theme/js/ballerina-common.js#L79-L156
non_process
register ballerina language for highlighting need to highlight code and syntax for ballerina this code can be found here the code for this can be found here
0
198,601
15,713,197,687
IssuesEvent
2021-03-27 15:13:13
nim-lang/Nim
https://api.github.com/repos/nim-lang/Nim
closed
Newruntime: incomplete treatment of `owned` non-`ref`'s, including documentation...
Documentation New runtime
Although recently updated and improved, [the documentation for the newruntime destructors](https://github.com/nim-lang/Nim/blob/devel/doc/destructors.rst) still doesn't mention that the `owned` modifier can also be applied to any type and not just `ref`'s and what that implies. It is not documented that currently, non-global `proc`'s (which may be closures capturing external bindings with a heap allocated environment reference) are automatically elevated to `owned` status (unless one takes special measures to apply the `unown` template when they are created); in fact, this elevation of `proc`'s (including closures) to `owned` status is not consistently internally used as per issue #12008 The documentation should be updated to reflect the implementation, with exact specification that such elevation of status to an `owned` type does not implicitly ref count (as these types are not allocated to the heap or are primitive pointers that are outside the automatic memory management). ### Example ```nim # test newruntime `owned` anything... # compile with "--newruntime" template own[T](x: T): owned T = # is the opposite of `unown` template! (proc (): owned T = cast[ptr T](result.addr)[] = x)() proc main() = let x = own 42; x.typeof.echo # showing we have a `owned int`!= let y = unown x; y.typeof.echo # must `unown` as not last use of `x`; (unown x).echo # last use of `x`, must be unowned here in order to print # there is no `$` for `owned` types to automatically print without `unown`'ing them! main() ``` ### Current Output ``` owned int int 42 0 ``` The above behaviour is correct, just not documented fully, as without the use of `unown` on the assignment of the `owned int` (`x`), the following error is obtained: ``` Error: '=' is not available for type <owned int>; requires a copy because it's not the last read of 'x'; another read is done here: ``` This shows that `owned` of other types follows the rule of "no copying of `owned` types, only moving is possible" correctly. ### Expected Output This is as expected, but undocumented that it is possible and that the `owned` rules can be applied to any type if they are forced to be `owned`. ### Additional Information ``` $ nim -v Nim Compiler Version 0.20.99 DEVEL nightly build as of 02-August-2019 on Linux and Windows 64-bit ```
1.0
Newruntime: incomplete treatment of `owned` non-`ref`'s, including documentation... - Although recently updated and improved, [the documentation for the newruntime destructors](https://github.com/nim-lang/Nim/blob/devel/doc/destructors.rst) still doesn't mention that the `owned` modifier can also be applied to any type and not just `ref`'s and what that implies. It is not documented that currently, non-global `proc`'s (which may be closures capturing external bindings with a heap allocated environment reference) are automatically elevated to `owned` status (unless one takes special measures to apply the `unown` template when they are created); in fact, this elevation of `proc`'s (including closures) to `owned` status is not consistently internally used as per issue #12008 The documentation should be updated to reflect the implementation, with exact specification that such elevation of status to an `owned` type does not implicitly ref count (as these types are not allocated to the heap or are primitive pointers that are outside the automatic memory management). ### Example ```nim # test newruntime `owned` anything... # compile with "--newruntime" template own[T](x: T): owned T = # is the opposite of `unown` template! (proc (): owned T = cast[ptr T](result.addr)[] = x)() proc main() = let x = own 42; x.typeof.echo # showing we have a `owned int`!= let y = unown x; y.typeof.echo # must `unown` as not last use of `x`; (unown x).echo # last use of `x`, must be unowned here in order to print # there is no `$` for `owned` types to automatically print without `unown`'ing them! main() ``` ### Current Output ``` owned int int 42 0 ``` The above behaviour is correct, just not documented fully, as without the use of `unown` on the assignment of the `owned int` (`x`), the following error is obtained: ``` Error: '=' is not available for type <owned int>; requires a copy because it's not the last read of 'x'; another read is done here: ``` This shows that `owned` of other types follows the rule of "no copying of `owned` types, only moving is possible" correctly. ### Expected Output This is as expected, but undocumented that it is possible and that the `owned` rules can be applied to any type if they are forced to be `owned`. ### Additional Information ``` $ nim -v Nim Compiler Version 0.20.99 DEVEL nightly build as of 02-August-2019 on Linux and Windows 64-bit ```
non_process
newruntime incomplete treatment of owned non ref s including documentation although recently updated and improved still doesn t mention that the owned modifier can also be applied to any type and not just ref s and what that implies it is not documented that currently non global proc s which may be closures capturing external bindings with a heap allocated environment reference are automatically elevated to owned status unless one takes special measures to apply the unown template when they are created in fact this elevation of proc s including closures to owned status is not consistently internally used as per issue the documentation should be updated to reflect the implementation with exact specification that such elevation of status to an owned type does not implicitly ref count as these types are not allocated to the heap or are primitive pointers that are outside the automatic memory management example nim test newruntime owned anything compile with newruntime template own x t owned t is the opposite of unown template proc owned t cast result addr x proc main let x own x typeof echo showing we have a owned int let y unown x y typeof echo must unown as not last use of x unown x echo last use of x must be unowned here in order to print there is no for owned types to automatically print without unown ing them main current output owned int int the above behaviour is correct just not documented fully as without the use of unown on the assignment of the owned int x the following error is obtained error is not available for type requires a copy because it s not the last read of x another read is done here this shows that owned of other types follows the rule of no copying of owned types only moving is possible correctly expected output this is as expected but undocumented that it is possible and that the owned rules can be applied to any type if they are forced to be owned additional information nim v nim compiler version devel nightly build as of august on linux and windows bit
0
71,947
23,865,977,400
IssuesEvent
2022-09-07 11:03:33
matrix-org/synapse
https://api.github.com/repos/matrix-org/synapse
closed
Enable cancellation for `POST /_matrix/client/v3/keys/query`
S-Major T-Defect
`POST /_matrix/client/v3/keys/query` can take a long time. When clients retry the request, the new request gets queued behind the previous one by `E2eKeysHandler._query_devices_linearizer`, so retrying a request that timed out only makes response times worse. ``` ... 2022-05-26 14:34:57,053 - synapse.access.http.11104 - 450 - INFO - POST-2875183 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 447.546sec/-267.564sec (0.213sec, 0.015sec) (0.015sec/0.052sec/34) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:04,581 - synapse.access.http.11104 - 450 - INFO - POST-2875478 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 435.215sec/-255.214sec (0.224sec, 0.029sec) (0.023sec/0.064sec/40) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:04,591 - synapse.access.http.11104 - 450 - INFO - POST-2874965 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 467.875sec/-287.874sec (0.213sec, 0.012sec) (0.044sec/0.106sec/46) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:05,443 - synapse.access.http.11104 - 450 - INFO - POST-2875779 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 414.790sec/-234.788sec (0.207sec, 0.013sec) (0.017sec/0.062sec/35) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:12,246 - synapse.access.http.11104 - 450 - INFO - POST-2875953 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 409.941sec/-229.943sec (0.234sec, 0.016sec) (0.010sec/0.054sec/18) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:34,240 - synapse.access.http.11104 - 450 - INFO - POST-2875839 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 439.033sec/-259.028sec (0.253sec, 0.012sec) (0.017sec/0.042sec/35) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:42,721 - synapse.access.http.11104 - 450 - INFO - POST-2875818 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 448.897sec/-268.734sec (0.238sec, 0.015sec) (0.019sec/0.052sec/30) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:36:04,578 - synapse.access.http.11104 - 450 - INFO - POST-2876136 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 452.018sec/-272.030sec (0.187sec, 0.015sec) (0.013sec/0.046sec/26) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:36:04,957 - synapse.access.http.11104 - 450 - INFO - POST-2876300 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 442.008sec/-262.071sec (0.201sec, 0.012sec) (0.013sec/0.045sec/31) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:36:11,865 - synapse.access.http.11104 - 450 - INFO - POST-2877424 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 385.515sec/-205.517sec (0.182sec, 0.014sec) (0.108sec/0.087sec/26) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] ... ``` To prevent the linearizer queue from building up too much, we can enable cancellation for the endpoint. - [ ] Audit the call tree of the endpoint. There's a previous audit of `/rooms/$room_id/members` at https://github.com/matrix-org/synapse/issues/3528#issuecomment-1083481216, which may or may not have some overlap. - [x] (Optional) Relax the validation on `@cancellable` and attach it to the methods you've audited - [ ] Add the `@cancellable` flag to the endpoint - [ ] Add a test (using the machinery of #12674)
1.0
Enable cancellation for `POST /_matrix/client/v3/keys/query` - `POST /_matrix/client/v3/keys/query` can take a long time. When clients retry the request, the new request gets queued behind the previous one by `E2eKeysHandler._query_devices_linearizer`, so retrying a request that timed out only makes response times worse. ``` ... 2022-05-26 14:34:57,053 - synapse.access.http.11104 - 450 - INFO - POST-2875183 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 447.546sec/-267.564sec (0.213sec, 0.015sec) (0.015sec/0.052sec/34) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:04,581 - synapse.access.http.11104 - 450 - INFO - POST-2875478 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 435.215sec/-255.214sec (0.224sec, 0.029sec) (0.023sec/0.064sec/40) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:04,591 - synapse.access.http.11104 - 450 - INFO - POST-2874965 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 467.875sec/-287.874sec (0.213sec, 0.012sec) (0.044sec/0.106sec/46) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:05,443 - synapse.access.http.11104 - 450 - INFO - POST-2875779 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 414.790sec/-234.788sec (0.207sec, 0.013sec) (0.017sec/0.062sec/35) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:12,246 - synapse.access.http.11104 - 450 - INFO - POST-2875953 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 409.941sec/-229.943sec (0.234sec, 0.016sec) (0.010sec/0.054sec/18) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:34,240 - synapse.access.http.11104 - 450 - INFO - POST-2875839 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 439.033sec/-259.028sec (0.253sec, 0.012sec) (0.017sec/0.042sec/35) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:35:42,721 - synapse.access.http.11104 - 450 - INFO - POST-2875818 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 448.897sec/-268.734sec (0.238sec, 0.015sec) (0.019sec/0.052sec/30) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:36:04,578 - synapse.access.http.11104 - 450 - INFO - POST-2876136 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 452.018sec/-272.030sec (0.187sec, 0.015sec) (0.013sec/0.046sec/26) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:36:04,957 - synapse.access.http.11104 - 450 - INFO - POST-2876300 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 442.008sec/-262.071sec (0.201sec, 0.012sec) (0.013sec/0.045sec/31) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] 2022-05-26 14:36:11,865 - synapse.access.http.11104 - 450 - INFO - POST-2877424 - xxx.xxx.xxx.xxx - 11104 - {@xxxxxxxxx:matrix.org} Processed request: 385.515sec/-205.517sec (0.182sec, 0.014sec) (0.108sec/0.087sec/26) 0B 200! "POST /_matrix/client/v3/keys/query HTTP/1.1" "xxxxxxxxx" [0 dbevts] ... ``` To prevent the linearizer queue from building up too much, we can enable cancellation for the endpoint. - [ ] Audit the call tree of the endpoint. There's a previous audit of `/rooms/$room_id/members` at https://github.com/matrix-org/synapse/issues/3528#issuecomment-1083481216, which may or may not have some overlap. - [x] (Optional) Relax the validation on `@cancellable` and attach it to the methods you've audited - [ ] Add the `@cancellable` flag to the endpoint - [ ] Add a test (using the machinery of #12674)
non_process
enable cancellation for post matrix client keys query post matrix client keys query can take a long time when clients retry the request the new request gets queued behind the previous one by query devices linearizer so retrying a request that timed out only makes response times worse synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx synapse access http info post xxx xxx xxx xxx xxxxxxxxx matrix org processed request post matrix client keys query http xxxxxxxxx to prevent the linearizer queue from building up too much we can enable cancellation for the endpoint audit the call tree of the endpoint there s a previous audit of rooms room id members at which may or may not have some overlap optional relax the validation on cancellable and attach it to the methods you ve audited add the cancellable flag to the endpoint add a test using the machinery of
0
2,035
4,847,360,866
IssuesEvent
2016-11-10 14:46:23
Alfresco/alfresco-ng2-components
https://api.github.com/repos/Alfresco/alfresco-ng2-components
opened
Right hand side of form not displayed within form attached to start event
browser: all bug comp: activiti-processList
**activiti** ![screen shot 2016-11-10 at 14 41 38](https://cloud.githubusercontent.com/assets/13200338/20180792/d0ca80f6-a753-11e6-8c1f-986782eaf36c.png) **component** ![screen shot 2016-11-10 at 14 40 40](https://cloud.githubusercontent.com/assets/13200338/20180771/b67a24e0-a753-11e6-853f-58f2550a1f7e.png)
1.0
Right hand side of form not displayed within form attached to start event - **activiti** ![screen shot 2016-11-10 at 14 41 38](https://cloud.githubusercontent.com/assets/13200338/20180792/d0ca80f6-a753-11e6-8c1f-986782eaf36c.png) **component** ![screen shot 2016-11-10 at 14 40 40](https://cloud.githubusercontent.com/assets/13200338/20180771/b67a24e0-a753-11e6-853f-58f2550a1f7e.png)
process
right hand side of form not displayed within form attached to start event activiti component
1
237,359
7,759,105,571
IssuesEvent
2018-05-31 21:54:47
RobotLocomotion/drake
https://api.github.com/repos/RobotLocomotion/drake
opened
``//bindings/pydrake/examples`` violate ODR
configuration: bazel configuration: python priority: low
As @jwnimmer-tri pointed out on Slack, targets underneath `//bindings/pydrake/examples` presently violate ODR, similar to #8908. At present, we are getting lucky that we do not encounter the same segfault as #8908, but it may happen shortly. This should be fixed, most ideally by having all examples link to `drake_shared_library`, or even better, using granular shared libraries (#7294). In attempting to reproduce along the lines of #8908, it seems that we are *extra* lucky in that it does not cause a segfault akin to `bazel test //examples/kuka_iiwa_arm:kuka_simulation_test`: https://github.com/RobotLocomotion/drake/compare/master...EricCousineau-TRI:issue/8908-repro-wip?expand=1#diff-9454df01e25d57f36c12f3b200dc2988 For this reason, we should maybe consider this a low priority pending resolution of #7294?
1.0
``//bindings/pydrake/examples`` violate ODR - As @jwnimmer-tri pointed out on Slack, targets underneath `//bindings/pydrake/examples` presently violate ODR, similar to #8908. At present, we are getting lucky that we do not encounter the same segfault as #8908, but it may happen shortly. This should be fixed, most ideally by having all examples link to `drake_shared_library`, or even better, using granular shared libraries (#7294). In attempting to reproduce along the lines of #8908, it seems that we are *extra* lucky in that it does not cause a segfault akin to `bazel test //examples/kuka_iiwa_arm:kuka_simulation_test`: https://github.com/RobotLocomotion/drake/compare/master...EricCousineau-TRI:issue/8908-repro-wip?expand=1#diff-9454df01e25d57f36c12f3b200dc2988 For this reason, we should maybe consider this a low priority pending resolution of #7294?
non_process
bindings pydrake examples violate odr as jwnimmer tri pointed out on slack targets underneath bindings pydrake examples presently violate odr similar to at present we are getting lucky that we do not encounter the same segfault as but it may happen shortly this should be fixed most ideally by having all examples link to drake shared library or even better using granular shared libraries in attempting to reproduce along the lines of it seems that we are extra lucky in that it does not cause a segfault akin to bazel test examples kuka iiwa arm kuka simulation test for this reason we should maybe consider this a low priority pending resolution of
0
21,066
28,014,686,720
IssuesEvent
2023-03-27 21:29:38
hashgraph/hedera-json-rpc-relay
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
closed
Release 0.20.0
enhancement P1 process
### Problem 0.20.0 features are not yet deployed ### Solution manual release process - [x] Create a `release/0.20` branch off of main. Ensure github test actions run - [x] Tag as `v0.20.0-rc1` - [x] git tag - [x] Confirm new docker image version is deployed - [x] Previewnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - [x] Testnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - Let bake - [x] Tag as `v0.20.0` - [x] git tag - [x] Confirm new docker image version is deployed - [x] Write up release notes and changelist - [x] Mainnet Testing - [x] Deploy tagged version - [x] Manual testing Any bugs or missed features found should see a new ticket opened, addressed in main and cherry-picked to `release/0.20 ` with a new rc version tagged and docker image deployed ### Alternatives _No response_
1.0
Release 0.20.0 - ### Problem 0.20.0 features are not yet deployed ### Solution manual release process - [x] Create a `release/0.20` branch off of main. Ensure github test actions run - [x] Tag as `v0.20.0-rc1` - [x] git tag - [x] Confirm new docker image version is deployed - [x] Previewnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - [x] Testnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - Let bake - [x] Tag as `v0.20.0` - [x] git tag - [x] Confirm new docker image version is deployed - [x] Write up release notes and changelist - [x] Mainnet Testing - [x] Deploy tagged version - [x] Manual testing Any bugs or missed features found should see a new ticket opened, addressed in main and cherry-picked to `release/0.20 ` with a new rc version tagged and docker image deployed ### Alternatives _No response_
process
release problem features are not yet deployed solution manual release process create a release branch off of main ensure github test actions run tag as git tag confirm new docker image version is deployed previewnet testing deploy tagged version manual testing run newman tests run dapp example bootstrap and manual tests run acceptance tests run performance tests testnet testing deploy tagged version manual testing run newman tests run dapp example bootstrap and manual tests run acceptance tests run performance tests let bake tag as git tag confirm new docker image version is deployed write up release notes and changelist mainnet testing deploy tagged version manual testing any bugs or missed features found should see a new ticket opened addressed in main and cherry picked to release with a new rc version tagged and docker image deployed alternatives no response
1
21,914
30,443,499,691
IssuesEvent
2023-07-15 11:16:31
silentiumNoxe/priest_feudal_back
https://api.github.com/repos/silentiumNoxe/priest_feudal_back
opened
API processing service
enhancement processing service
Define http endpoints: * **POST** /v1/action/{enum:type} any JSON. Pass this JSON to waterpipe. Action type should be passed with a request to waterpipe. Response: Code 202. NO BODY; * Actuator with health (GET /actuator/health) Response: Code 200. {status: "OK"}
1.0
API processing service - Define http endpoints: * **POST** /v1/action/{enum:type} any JSON. Pass this JSON to waterpipe. Action type should be passed with a request to waterpipe. Response: Code 202. NO BODY; * Actuator with health (GET /actuator/health) Response: Code 200. {status: "OK"}
process
api processing service define http endpoints post action enum type any json pass this json to waterpipe action type should be passed with a request to waterpipe response code no body actuator with health get actuator health response code status ok
1
6,678
9,795,442,250
IssuesEvent
2019-06-11 03:43:40
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
GDAL Warp (Reproject) tool: make the target CRS parameter optional
Easy fix Feature Request Processing
Author Name: **Mike Taves** (Mike Taves) Original Redmine Issue: [21571](https://issues.qgis.org/issues/21571) Redmine category:processing/gdal Assignee: Giovanni Manghi --- With QGIS 3.6, From Raster > Projections > Warp (Reproject), the dialog has a few options with a few of the defaults set in. Source CRS is optional, and the first option is blank. This is appropriate: gdalwarp does not require `-s_srs`, and does not have a default. However in this dialog, Target CRS has no blank option to choose, and has a default `EPSG:4326` (??). Note that the command-line tool gdalwarp does not require `-t_srs`. Therefore it is suggested to add a blank option at the top, and make this the default. Everything else in this dialog looks sufficient, e.g. the default nearest neighbour resample method (`-r near`) is correct. A work-around is to copy the `gdalwarp ...` console call at the bottom of the dialog, edit it to remove `-t_srs EPSG:4326 `, and run this command manually.
1.0
GDAL Warp (Reproject) tool: make the target CRS parameter optional - Author Name: **Mike Taves** (Mike Taves) Original Redmine Issue: [21571](https://issues.qgis.org/issues/21571) Redmine category:processing/gdal Assignee: Giovanni Manghi --- With QGIS 3.6, From Raster > Projections > Warp (Reproject), the dialog has a few options with a few of the defaults set in. Source CRS is optional, and the first option is blank. This is appropriate: gdalwarp does not require `-s_srs`, and does not have a default. However in this dialog, Target CRS has no blank option to choose, and has a default `EPSG:4326` (??). Note that the command-line tool gdalwarp does not require `-t_srs`. Therefore it is suggested to add a blank option at the top, and make this the default. Everything else in this dialog looks sufficient, e.g. the default nearest neighbour resample method (`-r near`) is correct. A work-around is to copy the `gdalwarp ...` console call at the bottom of the dialog, edit it to remove `-t_srs EPSG:4326 `, and run this command manually.
process
gdal warp reproject tool make the target crs parameter optional author name mike taves mike taves original redmine issue redmine category processing gdal assignee giovanni manghi with qgis from raster projections warp reproject the dialog has a few options with a few of the defaults set in source crs is optional and the first option is blank this is appropriate gdalwarp does not require s srs and does not have a default however in this dialog target crs has no blank option to choose and has a default epsg note that the command line tool gdalwarp does not require t srs therefore it is suggested to add a blank option at the top and make this the default everything else in this dialog looks sufficient e g the default nearest neighbour resample method r near is correct a work around is to copy the gdalwarp console call at the bottom of the dialog edit it to remove t srs epsg and run this command manually
1
6,682
9,799,414,361
IssuesEvent
2019-06-11 14:22:24
googleapis/cloud-bigtable-client
https://api.github.com/repos/googleapis/cloud-bigtable-client
closed
Add static analysis to find bugs sooner
type: process
Static analysis tools enable finding and avoiding bugs at compile time, before they become issues at runtime, and become much harder (and hence, costlier) to find and fix. There are tools that can be run offline, such as SpotBugs (successor to FindBugs) and [ErrorProne](http://errorprone.info/), or via online services such as [Coverity](https://scan.coverity.com/) (free for open-source projects). Wikipedia has a [list static analysis tools for Java](https://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis#Java). FWIW, we've integrated Coverity scans into JanusGraph builds, though ended up not using the Travis CI integration with Coverity; here is how it was done: * [`.travis.yml`](https://github.com/JanusGraph/janusgraph/blob/master/.travis.yml) * [analysis script and Dockerfile](https://github.com/JanusGraph/janusgraph/tree/master/analysis) referenced in `.travis.yml` — note that CentOS appears to behave differently from Ubuntu for Coverity I'm happy to add Coverity support for this repo, if there's interest and support (please vote with 👍 / 👎). Thoughts?
1.0
Add static analysis to find bugs sooner - Static analysis tools enable finding and avoiding bugs at compile time, before they become issues at runtime, and become much harder (and hence, costlier) to find and fix. There are tools that can be run offline, such as SpotBugs (successor to FindBugs) and [ErrorProne](http://errorprone.info/), or via online services such as [Coverity](https://scan.coverity.com/) (free for open-source projects). Wikipedia has a [list static analysis tools for Java](https://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis#Java). FWIW, we've integrated Coverity scans into JanusGraph builds, though ended up not using the Travis CI integration with Coverity; here is how it was done: * [`.travis.yml`](https://github.com/JanusGraph/janusgraph/blob/master/.travis.yml) * [analysis script and Dockerfile](https://github.com/JanusGraph/janusgraph/tree/master/analysis) referenced in `.travis.yml` — note that CentOS appears to behave differently from Ubuntu for Coverity I'm happy to add Coverity support for this repo, if there's interest and support (please vote with 👍 / 👎). Thoughts?
process
add static analysis to find bugs sooner static analysis tools enable finding and avoiding bugs at compile time before they become issues at runtime and become much harder and hence costlier to find and fix there are tools that can be run offline such as spotbugs successor to findbugs and or via online services such as free for open source projects wikipedia has a fwiw we ve integrated coverity scans into janusgraph builds though ended up not using the travis ci integration with coverity here is how it was done referenced in travis yml — note that centos appears to behave differently from ubuntu for coverity i m happy to add coverity support for this repo if there s interest and support please vote with 👍 👎 thoughts
1
18,463
24,549,702,590
IssuesEvent
2022-10-12 11:37:45
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[PM] Status is getting displayed as 'Not provided' even when there is no data sharing permission
Bug P1 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
**AR:** Status is getting displayed as 'Not provided' even when there is no data sharing permission **ER:** Status should get displayed as 'Not applicable' , when there is no data sharing permission ![11](https://user-images.githubusercontent.com/86007179/192238890-ff2ab696-4424-4e0e-83ff-4357168bc9a0.png)
3.0
[PM] Status is getting displayed as 'Not provided' even when there is no data sharing permission - **AR:** Status is getting displayed as 'Not provided' even when there is no data sharing permission **ER:** Status should get displayed as 'Not applicable' , when there is no data sharing permission ![11](https://user-images.githubusercontent.com/86007179/192238890-ff2ab696-4424-4e0e-83ff-4357168bc9a0.png)
process
status is getting displayed as not provided even when there is no data sharing permission ar status is getting displayed as not provided even when there is no data sharing permission er status should get displayed as not applicable when there is no data sharing permission
1
6,037
8,850,107,131
IssuesEvent
2019-01-08 12:16:25
tig-nl/postnl-magento2
https://api.github.com/repos/tig-nl/postnl-magento2
closed
FirstDeliveryDate returns NULL causing issues when creating slips/labels
in process on backlog
### Submitting issues through Github ## Please follow the guide below - Put an `x` into all the boxes [ ] relevant to your *issue* (like this: `[x]`) - Use the *Preview* tab to see what your issue will actually look like. - We may ask some questions or ask you to provide addition information after you placed your request. --- ### Make sure you are using the *latest* version: https://tig.nl/postnl-magento-extensies/ Issues with outdated version will be rejected. - [x] I've **verified** and **I assure** that I'm running the latest version of the TIG PostNL Magento extension. --- ### What is the purpose of your *issue*? - [ ] Feature request (request for a new functionality) - [x] Bug report (encountered problems with the TIG PostNL Magento extension) - [ ] Extension support request (request for adding support for a new extension) - [ ] Other ### If the purpose of your issue is a *bug report* Step 1: Submit order via frontend without a correct postcode/country combination Step 2: Order is created normally Step 3: Trying to create packing slips/labels will result in the following error: Expected result: Create packing slips Actual result: Check CIFException in the detail section - 2034: Bezorgdatum, mag geen maandag zijn indien er geen zondagsorting is. The error becomes an issue when an order has been created with incorrect data and the tig_postnl_order delivery_date has been set to NULL. 
The issue has been traced back into the following model: Service/Order/FirstDeliveryDate.php 99 ``` private function getDeliveryDate(Address $address) { try { $this->endpoint->setStoreId($this->quote->getStoreId()); $this->endpoint->setParameters([ 'country' => $address->getCountryId(), 'postcode' => $address->getPostcode(), ]); $response = $this->endpoint->call(); } catch (\Exception $exception) { return null; << value causes issues } if (is_object($response) && $response->DeliveryDate) { return $response->DeliveryDate; } return null; << value causes issues } ``` Current workaround: manually set database record delivery_date from NULL to an actual date.
1.0
FirstDeliveryDate returns NULL causing issues when creating slips/labels - ### Submitting issues through Github ## Please follow the guide below - Put an `x` into all the boxes [ ] relevant to your *issue* (like this: `[x]`) - Use the *Preview* tab to see what your issue will actually look like. - We may ask some questions or ask you to provide addition information after you placed your request. --- ### Make sure you are using the *latest* version: https://tig.nl/postnl-magento-extensies/ Issues with outdated version will be rejected. - [x] I've **verified** and **I assure** that I'm running the latest version of the TIG PostNL Magento extension. --- ### What is the purpose of your *issue*? - [ ] Feature request (request for a new functionality) - [x] Bug report (encountered problems with the TIG PostNL Magento extension) - [ ] Extension support request (request for adding support for a new extension) - [ ] Other ### If the purpose of your issue is a *bug report* Step 1: Submit order via frontend without a correct postcode/country combination Step 2: Order is created normally Step 3: Trying to create packing slips/labels will result in the following error: Expected result: Create packing slips Actual result: Check CIFException in the detail section - 2034: Bezorgdatum, mag geen maandag zijn indien er geen zondagsorting is. The error becomes an issue when an order has been created with incorrect data and the tig_postnl_order delivery_date has been set to NULL. 
The issue has been traced back into the following model: Service/Order/FirstDeliveryDate.php 99 ``` private function getDeliveryDate(Address $address) { try { $this->endpoint->setStoreId($this->quote->getStoreId()); $this->endpoint->setParameters([ 'country' => $address->getCountryId(), 'postcode' => $address->getPostcode(), ]); $response = $this->endpoint->call(); } catch (\Exception $exception) { return null; << value causes issues } if (is_object($response) && $response->DeliveryDate) { return $response->DeliveryDate; } return null; << value causes issues } ``` Current workaround: manually set database record delivery_date from NULL to an actual date.
process
firstdeliverydate returns null causing issues when creating slips labels submitting issues through github please follow the guide below put an x into all the boxes relevant to your issue like this use the preview tab to see what your issue will actually look like we may ask some questions or ask you to provide addition information after you placed your request make sure you are using the latest version issues with outdated version will be rejected i ve verified and i assure that i m running the latest version of the tig postnl magento extension what is the purpose of your issue feature request request for a new functionality bug report encountered problems with the tig postnl magento extension extension support request request for adding support for a new extension other if the purpose of your issue is a bug report step submit order via frontend without a correct postcode country combination step order is created normally step trying to create packing slips labels will result in the following error expected result create packing slips actual result check cifexception in the detail section bezorgdatum mag geen maandag zijn indien er geen zondagsorting is the error becomes an issue when an order has been created with incorrect data and the tig postnl order delivery date has been set to null the issue has been traced back into the following model service order firstdeliverydate php private function getdeliverydate address address try this endpoint setstoreid this quote getstoreid this endpoint setparameters country address getcountryid postcode address getpostcode response this endpoint call catch exception exception return null value causes issues if is object response response deliverydate return response deliverydate return null value causes issues current workaround manually set database record delivery date from null to an actual date
1
20,451
27,113,326,265
IssuesEvent
2023-02-15 16:44:50
googleapis/testing-infra-docker
https://api.github.com/repos/googleapis/testing-infra-docker
closed
WARNING: JAVA - DO NOT UPDATE to maven 3.8.2
type: process priority: p3
maven 3.8.2 breaks many of the java client library builds. 3.8.3 should fix this.
1.0
WARNING: JAVA - DO NOT UPDATE to maven 3.8.2 - maven 3.8.2 breaks many of the java client library builds. 3.8.3 should fix this.
process
warning java do not update to maven maven breaks many of the java client library builds should fix this
1
2,755
5,681,063,372
IssuesEvent
2017-04-13 04:26:39
inasafe/inasafe-realtime
https://api.github.com/repos/inasafe/inasafe-realtime
closed
EQ Realtime - feedback on Indonesian disclaimer on the report page
bug ready realtime processor
@adelebearcrozier, Anjar and Nugi have supplied feedback on the Indonesian disclaimer on the report page for EQ Realtime. See original ticket at https://github.com/inasafe/inasafe/issues/2698 for further discussion.
1.0
EQ Realtime - feedback on Indonesian disclaimer on the report page - @adelebearcrozier, Anjar and Nugi have supplied feedback on the Indonesian disclaimer on the report page for EQ Realtime. See original ticket at https://github.com/inasafe/inasafe/issues/2698 for further discussion.
process
eq realtime feedback on indonesian disclaimer on the report page adelebearcrozier anjar and nugi have supplied feedback on the indonesian disclaimer on the report page for eq realtime see original ticket at for further discussion
1
10,210
13,068,153,212
IssuesEvent
2020-07-31 02:42:15
nion-software/nionswift
https://api.github.com/repos/nion-software/nionswift
opened
Processing asking for new data shapes should allow user to enter percentages
f - inspector f - processing type - enhancement
Resize, reshape, re-bin, etc.
1.0
Processing asking for new data shapes should allow user to enter percentages - Resize, reshape, re-bin, etc.
process
processing asking for new data shapes should allow user to enter percentages resize reshape re bin etc
1
63,385
12,310,938,954
IssuesEvent
2020-05-12 11:31:10
GTNewHorizons/GT-New-Horizons-Modpack
https://api.github.com/repos/GTNewHorizons/GT-New-Horizons-Modpack
closed
Storage Interface on Super/QuantumChechts does not work propperly
AE2 issue Need Code changes bugMinor reminder
#### Which modpack version are you using? 2.0.3.1Dev Storage Interface on Super/QuantumChechts does not work propperly
1.0
Storage Interface on Super/QuantumChechts does not work propperly - #### Which modpack version are you using? 2.0.3.1Dev Storage Interface on Super/QuantumChechts does not work propperly
non_process
storage interface on super quantumchechts does not work propperly which modpack version are you using storage interface on super quantumchechts does not work propperly
0
10,869
13,640,422,981
IssuesEvent
2020-09-25 12:43:53
timberio/vector
https://api.github.com/repos/timberio/vector
closed
New `strip_whitespace` remap function
domain: mapping domain: processing type: feature
The `strip_whitespace` remap function strips leading and trailing whitespace ## Example Given this event: ```js { "message": "\t\tThis string has whitespace around it " } ``` And this remap instruction set: ``` .message = strip_whitespace(.message) ``` Would result in: ```js { "message": "This string has whitespace around it" } ``` ## Requirements - [ ] Strips leading _and_ trailing whitespace. - [ ] Splits on all whitespace as defined by [Unicode whitespace character](https://en.wikipedia.org/wiki/Unicode_character_property#Whitespace).
1.0
New `strip_whitespace` remap function - The `strip_whitespace` remap function strips leading and trailing whitespace ## Example Given this event: ```js { "message": "\t\tThis string has whitespace around it " } ``` And this remap instruction set: ``` .message = strip_whitespace(.message) ``` Would result in: ```js { "message": "This string has whitespace around it" } ``` ## Requirements - [ ] Strips leading _and_ trailing whitespace. - [ ] Splits on all whitespace as defined by [Unicode whitespace character](https://en.wikipedia.org/wiki/Unicode_character_property#Whitespace).
process
new strip whitespace remap function the strip whitespace remap function strips leading and trailing whitespace example given this event js message t tthis string has whitespace around it and this remap instruction set message strip whitespace message would result in js message this string has whitespace around it requirements strips leading and trailing whitespace splits on all whitespace as defined by
1
68
2,523,409,328
IssuesEvent
2015-01-20 10:16:16
Graylog2/graylog2-server
https://api.github.com/repos/Graylog2/graylog2-server
closed
Grok support for extractors
processing
When it comes to parsing complex (read:crappy) log formats Jordan Sissel hit a homerun with the concepts and ideas behind Grok. I have tried plain regex'es and drools but neither can match the ease, speed, and maintainability of grok patterns. As a bonus it would enable a lot of transparency between Logstash and Graylog2. Parsing crappy syslog like this : ``` srvesbmanxxxx api-log: 2014-01-15 14:46:33,382 INFO [cloud.api.ApiServer] (catalina-exec-13:null) (userId=x accountId=x sessionId=3A0E966E2xxxxxx) 10.x.x.x -- GET command=queryAsyncJobResult&jobId=b74d9ce1-0423-4a3e-b460-217787cf0681&response=json&sessionkey=0WfyV4trvb5ng6XrRLCAxHx3KOA%3D&_=1389793597907 200 { "queryasyncjobresultresponse" : {"accountid":"xxxxx-xxxx-11e3-bbe0-235130fb5cd9","userid":"xxxxx-60d0-xxxx-bbe0-235130fb5cd9","cmd":"org.apache.cloudstack.api.command.admin.storage.PreparePrimaryStorageForMaintenanceCmd","jobstatus":1,"jobprocstatus":0,"jobresultcode":0,"jobresulttype":"object","jobresult":{"storagepool":{"id":"xxxxxx-f3ca-3ed7-9ae2-6d79a56d3e90","zoneid":"xxxxxx-99cc-4ab6-a009-04a15d3ccd0a","zonename":"DCE-POC1","podid":"xxxxxxx-b4f7-4994-bd2f-9391e8fbd6d5","podname":"RACK5-DCE_POC1","name":"PR_NP0","ipaddress":"xxx.xxx.xxx.xxx","path":"/ds0002_nfs_fc","created":"2014-01-15T14:46:19+0100","type":"NetworkFilesystem","clusterid":"xxxxx-b113-4dbc-8301-1fd8ce1f5a60","clustername":"xxx.xxx.3.250/DCE_Zone_1/DCE_POC1","disksizetotal":1099511627776,"disksizeallocated":0,"tags":"PR_NP_0","state":"Maintenance","scope":"CLUSTER","jobid":"xxxxx-0423-4a3e-b460-217787cf0681","jobstatus":0}},"created":"2014-01-15T14:46:30+0100","jobid":"xxxx-0423-4a3e-b460-217787cf0681"} } ``` took 2 minutes using something like this : ```` %{HOSTNAME:host} %{NOTSPACE:log-type} %{NOTSPACE:date} %{NOTSPACE:time} %{WORD:level} %{NOTSPACE:path} %{NOTSPACE:thread} %{NOTSPACE:userid} %{NOTSPACE:accountid} %{NOTSPACE:sessionid} %{IP:ip} -- %{WORD:method} %{NOTSPACE:command} %{INT:status_code} %{GREEDYDATA:request} ```
1.0
Grok support for extractors - When it comes to parsing complex (read:crappy) log formats Jordan Sissel hit a homerun with the concepts and ideas behind Grok. I have tried plain regex'es and drools but neither can match the ease, speed, and maintainability of grok patterns. As a bonus it would enable a lot of transparency between Logstash and Graylog2. Parsing crappy syslog like this : ``` srvesbmanxxxx api-log: 2014-01-15 14:46:33,382 INFO [cloud.api.ApiServer] (catalina-exec-13:null) (userId=x accountId=x sessionId=3A0E966E2xxxxxx) 10.x.x.x -- GET command=queryAsyncJobResult&jobId=b74d9ce1-0423-4a3e-b460-217787cf0681&response=json&sessionkey=0WfyV4trvb5ng6XrRLCAxHx3KOA%3D&_=1389793597907 200 { "queryasyncjobresultresponse" : {"accountid":"xxxxx-xxxx-11e3-bbe0-235130fb5cd9","userid":"xxxxx-60d0-xxxx-bbe0-235130fb5cd9","cmd":"org.apache.cloudstack.api.command.admin.storage.PreparePrimaryStorageForMaintenanceCmd","jobstatus":1,"jobprocstatus":0,"jobresultcode":0,"jobresulttype":"object","jobresult":{"storagepool":{"id":"xxxxxx-f3ca-3ed7-9ae2-6d79a56d3e90","zoneid":"xxxxxx-99cc-4ab6-a009-04a15d3ccd0a","zonename":"DCE-POC1","podid":"xxxxxxx-b4f7-4994-bd2f-9391e8fbd6d5","podname":"RACK5-DCE_POC1","name":"PR_NP0","ipaddress":"xxx.xxx.xxx.xxx","path":"/ds0002_nfs_fc","created":"2014-01-15T14:46:19+0100","type":"NetworkFilesystem","clusterid":"xxxxx-b113-4dbc-8301-1fd8ce1f5a60","clustername":"xxx.xxx.3.250/DCE_Zone_1/DCE_POC1","disksizetotal":1099511627776,"disksizeallocated":0,"tags":"PR_NP_0","state":"Maintenance","scope":"CLUSTER","jobid":"xxxxx-0423-4a3e-b460-217787cf0681","jobstatus":0}},"created":"2014-01-15T14:46:30+0100","jobid":"xxxx-0423-4a3e-b460-217787cf0681"} } ``` took 2 minutes using something like this : ```` %{HOSTNAME:host} %{NOTSPACE:log-type} %{NOTSPACE:date} %{NOTSPACE:time} %{WORD:level} %{NOTSPACE:path} %{NOTSPACE:thread} %{NOTSPACE:userid} %{NOTSPACE:accountid} %{NOTSPACE:sessionid} %{IP:ip} -- %{WORD:method} %{NOTSPACE:command} %{INT:status_code} 
%{GREEDYDATA:request} ```
process
grok support for extractors when it comes to parsing complex read crappy log formats jordan sissel hit a homerun with the concepts and ideas behind grok i have tried plain regex es and drools but neither can match the ease speed and maintainability of grok patterns as a bonus it would enable a lot of transparency between logstash and parsing crappy syslog like this srvesbmanxxxx api log info catalina exec null userid x accountid x sessionid x x x get command queryasyncjobresult jobid response json sessionkey queryasyncjobresultresponse accountid xxxxx xxxx userid xxxxx xxxx cmd org apache cloudstack api command admin storage prepareprimarystorageformaintenancecmd jobstatus jobprocstatus jobresultcode jobresulttype object jobresult storagepool id xxxxxx zoneid xxxxxx zonename dce podid xxxxxxx podname dce name pr ipaddress xxx xxx xxx xxx path nfs fc created type networkfilesystem clusterid xxxxx clustername xxx xxx dce zone dce disksizetotal disksizeallocated tags pr np state maintenance scope cluster jobid xxxxx jobstatus created jobid xxxx took minutes using something like this hostname host notspace log type notspace date notspace time word level notspace path notspace thread notspace userid notspace accountid notspace sessionid ip ip word method notspace command int status code greedydata request
1
20,748
27,453,767,583
IssuesEvent
2023-03-02 19:30:19
pfmc-assessments/canary_2023
https://api.github.com/repos/pfmc-assessments/canary_2023
opened
Compil data - Foreign Catches
Data obtaining Data processing
The Rogers_foreign_catch (2003) pdf in the google drive contains the estimates of canary for foreign fleets. Need to add within our landings script.
1.0
Compil data - Foreign Catches - The Rogers_foreign_catch (2003) pdf in the google drive contains the estimates of canary for foreign fleets. Need to add within our landings script.
process
compil data foreign catches the rogers foreign catch pdf in the google drive contains the estimates of canary for foreign fleets need to add within our landings script
1
378
2,823,564,930
IssuesEvent
2015-05-21 09:36:31
austundag/testing
https://api.github.com/repos/austundag/testing
closed
Patient header allergies refresh issues needs to be investigated/fixed
enhancement in process
When user adds/cancels a new/old allergy the patient header allergies should be updated.
1.0
Patient header allergies refresh issues needs to be investigated/fixed - When user adds/cancels a new/old allergy the patient header allergies should be updated.
process
patient header allergies refresh issues needs to be investigated fixed when user adds cancels a new old allergy the patient header allergies should be updated
1
136,325
19,760,729,956
IssuesEvent
2022-01-16 11:19:33
TeamHavit/Havit-iOS
https://api.github.com/repos/TeamHavit/Havit-iOS
closed
[FEAT] Implement EmptyView layout and branching logic
🗂 수연 🟣 Category 🖍 Design
## 💡 Issue <!-- Please describe the issue. --> Builds the EmptyView UI to display on the empty screen when the category view has no categories. ## 📝 todo <!-- Write down the tasks to be done. --> - [x] Build and lay out the EmptyView UI - [x] Keep it hidden by default, then set its hidden property to false when there are no categories
1.0
[FEAT] Implement EmptyView layout and branching logic - ## 💡 Issue <!-- Please describe the issue. --> Builds the EmptyView UI to display on the empty screen when the category view has no categories. ## 📝 todo <!-- Write down the tasks to be done. --> - [x] Build and lay out the EmptyView UI - [x] Keep it hidden by default, then set its hidden property to false when there are no categories
non_process
emptyview layout and branching logic implementation 💡 issue builds the emptyview ui to display on the empty screen when the category view has no categories 📝 todo build and lay out the emptyview ui keep it hidden by default then set its hidden property to false when there are no categories
0
17,439
23,265,835,868
IssuesEvent
2022-08-04 17:16:54
MPMG-DCC-UFMG/C01
https://api.github.com/repos/MPMG-DCC-UFMG/C01
opened
Transparência - Collector details / Extract links and download files
[1] Requisito [0] Desenvolvimento [2] Média Prioridade [3] Processamento Dinâmico
## Expected Behavior The `Extrair links` (extract links) and `Baixar arquivos` (download files) settings are expected to also apply to crawls that use dynamic processing. ## Current Behavior When configuring a dynamic collector with this tool, the extracted links are basically those that can be obtained through Scrapy's processing. To extract "dynamic" links, although the step mechanism offers the options to `clicar` (click) or `abrir nova aba` (open a new tab) for specified xpaths, there is no option in the collector details to explore all available links, or to do so according to filters. This can be unintuitive for a user who wants to explore links broadly. ## Steps to reproduce the error Not applicable. ## System - MP or local: both - Specific branch: master - Different system: no ## Screenshots Not applicable.
1.0
Transparência - Collector details / Extract links and download files - ## Expected Behavior The `Extrair links` (extract links) and `Baixar arquivos` (download files) settings are expected to also apply to crawls that use dynamic processing. ## Current Behavior When configuring a dynamic collector with this tool, the extracted links are basically those that can be obtained through Scrapy's processing. To extract "dynamic" links, although the step mechanism offers the options to `clicar` (click) or `abrir nova aba` (open a new tab) for specified xpaths, there is no option in the collector details to explore all available links, or to do so according to filters. This can be unintuitive for a user who wants to explore links broadly. ## Steps to reproduce the error Not applicable. ## System - MP or local: both - Specific branch: master - Different system: no ## Screenshots Not applicable.
process
transparência collector details extract links and download files expected behavior the extrair links and baixar arquivos settings are expected to also apply to crawls that use dynamic processing current behavior when configuring a dynamic collector with this tool the extracted links are basically those that can be obtained through scrapy s processing to extract dynamic links although the step mechanism offers the options to clicar click or abrir nova aba open a new tab for specified xpaths there is no option in the collector details to explore all available links or to do so according to filters this can be unintuitive for a user who wants to explore links broadly steps to reproduce the error not applicable system mp or local both specific branch master different system no screenshots not applicable
1
16,735
21,899,891,479
IssuesEvent
2022-05-20 12:26:10
camunda/zeebe-process-test
https://api.github.com/repos/camunda/zeebe-process-test
opened
Zeebe Test engine should start on a different port
kind/feature team/process-automation
**Description** When I started a local docker-compose for dev, I cannot start the tests and get an error message: `Port 26500 already in use`. It's is very inconvenient for a process automation developer to stop the runtime, run the tests, start the runtime, do the integration test, develop next feature... The Zeebe engine started by the tests should use a port like 26499, best would to make this configurable.
1.0
Zeebe Test engine should start on a different port - **Description** When I started a local docker-compose for dev, I cannot start the tests and get an error message: `Port 26500 already in use`. It's is very inconvenient for a process automation developer to stop the runtime, run the tests, start the runtime, do the integration test, develop next feature... The Zeebe engine started by the tests should use a port like 26499, best would to make this configurable.
process
zeebe test engine should start on a different port description when i started a local docker compose for dev i cannot start the tests and get an error message port already in use it s is very inconvenient for a process automation developer to stop the runtime run the tests start the runtime do the integration test develop next feature the zeebe engine started by the tests should use a port like best would to make this configurable
1
30,762
4,215,013,409
IssuesEvent
2016-06-30 01:14:55
apapadimoulis/what-bugs
https://api.github.com/repos/apapadimoulis/what-bugs
closed
Cannot reply with highlighted quote to necro topic
weird but by design
Go to a necro topic. Highlight something. Click REPLY. Get "OMFG WHY U NECRO" popup, and click "Look, asshole, I want to necro this thread" Composes appears, but no quoted reply.
1.0
Cannot reply with highlighted quote to necro topic - Go to a necro topic. Highlight something. Click REPLY. Get "OMFG WHY U NECRO" popup, and click "Look, asshole, I want to necro this thread" Composes appears, but no quoted reply.
non_process
cannot reply with highlighted quote to necro topic go to a necro topic highlight something click reply get omfg why u necro popup and click look asshole i want to necro this thread composes appears but no quoted reply
0
5,811
8,648,530,597
IssuesEvent
2018-11-26 16:48:47
googlegenomics/gcp-variant-transforms
https://api.github.com/repos/googlegenomics/gcp-variant-transforms
opened
Setup CPU profiling and optimize code
P2 process
We have not really spent too much time optimizing our code, but it seems like there may be opportunities to gain performance. For instance, a small change in PR #417 resulted in 50% speedup in one of the PTransforms (~10% overall speedup; or more depending on the size of the data). However, rather than trying 'random' things, it's better to run a CPU profiler on our code to determine the places where it makes sense to optimize (e.g. I have found that sanitizing the BQ fields also takes a nontrivial amount of time). Beam has some flags to setup profiling (see [thread](https://lists.apache.org/thread.html/71db23d40176a3e31f944bf2b8c22b670d5276327568c3023b89089a@%3Cdev.beam.apache.org%3E)). We can also consider using [Cython](https://cython.org/) once we know the areas to optimize.
1.0
Setup CPU profiling and optimize code - We have not really spent too much time optimizing our code, but it seems like there may be opportunities to gain performance. For instance, a small change in PR #417 resulted in 50% speedup in one of the PTransforms (~10% overall speedup; or more depending on the size of the data). However, rather than trying 'random' things, it's better to run a CPU profiler on our code to determine the places where it makes sense to optimize (e.g. I have found that sanitizing the BQ fields also takes a nontrivial amount of time). Beam has some flags to setup profiling (see [thread](https://lists.apache.org/thread.html/71db23d40176a3e31f944bf2b8c22b670d5276327568c3023b89089a@%3Cdev.beam.apache.org%3E)). We can also consider using [Cython](https://cython.org/) once we know the areas to optimize.
process
setup cpu profiling and optimize code we have not really spent too much time optimizing our code but it seems like there may be opportunities to gain performance for instance a small change in pr resulted in speedup in one of the ptransforms overall speedup or more depending on the size of the data however rather than trying random things it s better to run a cpu profiler on our code to determine the places where it makes sense to optimize e g i have found that sanitizing the bq fields also takes a nontrivial amount of time beam has some flags to setup profiling see we can also consider using once we know the areas to optimize
1
227
2,495,730,037
IssuesEvent
2015-01-06 14:16:00
firebug/firebug.next
https://api.github.com/repos/firebug/firebug.next
closed
Computed side panel is empty
bug inspector platform test-needed
Firebug.next seems to produce an empty computed properties panel when run on Nightly and DevEd. ![screen shot 2015-01-02 at 12 29 40 pm](https://cloud.githubusercontent.com/assets/1813816/5594675/1a5d571a-927b-11e4-846d-ff8df445131e.png) Changing themes does not seem to help. If however, I run Nightly without firebug then devtools does produce results.
1.0
Computed side panel is empty - Firebug.next seems to produce an empty computed properties panel when run on Nightly and DevEd. ![screen shot 2015-01-02 at 12 29 40 pm](https://cloud.githubusercontent.com/assets/1813816/5594675/1a5d571a-927b-11e4-846d-ff8df445131e.png) Changing themes does not seem to help. If however, I run Nightly without firebug then devtools does produce results.
non_process
computed side panel is empty firebug next seems to produce an empty computed properties panel when run on nightly and deved changing themes does not seem to help if however i run nightly without firebug then devtools does produce results
0
13,002
15,361,166,485
IssuesEvent
2021-03-01 17:48:09
ORNL-AMO/AMO-Tools-Desktop
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
opened
small psychometric calc
Calculator Process Cooling Quick Fix
dropdown has "gas dew point" as an option. change to just "dew point" or "air dew point"
1.0
small psychometric calc - dropdown has "gas dew point" as an option. change to just "dew point" or "air dew point"
process
small psychometric calc dropdown has gas dew point as an option change to just dew point or air dew point
1
6,824
9,967,829,164
IssuesEvent
2019-07-08 14:23:27
AnalyticalGraphicsInc/cesium
https://api.github.com/repos/AnalyticalGraphicsInc/cesium
opened
Make bloom post process check for selected feature
category - post-processing good first issue type - enhancement
Right now you can apply certain post process effects to just one feature. The bloom shader doesn't do any check to allow you to apply it to just to the selected feature as you would expect. This is potentially pretty easy, you just need to take the check from another post process like the black and white one: https://github.com/AnalyticalGraphicsInc/cesium/blob/master/Source/Shaders/PostProcessStages/BlackAndWhite.glsl and apply it in the bloom shader: https://github.com/AnalyticalGraphicsInc/cesium/blob/master/Source/Shaders/PostProcessStages/BloomComposite.glsl The [forum thread](https://groups.google.com/d/msg/cesium-dev/GAr3d-TgQq4/ux-cxNxlDwAJ) here has an example you could use for testing. Note that you'd need to test the selected feature on a model, since entities aren't yet supported there (see https://github.com/AnalyticalGraphicsInc/cesium/issues/6705).
1.0
Make bloom post process check for selected feature - Right now you can apply certain post process effects to just one feature. The bloom shader doesn't do any check to allow you to apply it to just to the selected feature as you would expect. This is potentially pretty easy, you just need to take the check from another post process like the black and white one: https://github.com/AnalyticalGraphicsInc/cesium/blob/master/Source/Shaders/PostProcessStages/BlackAndWhite.glsl and apply it in the bloom shader: https://github.com/AnalyticalGraphicsInc/cesium/blob/master/Source/Shaders/PostProcessStages/BloomComposite.glsl The [forum thread](https://groups.google.com/d/msg/cesium-dev/GAr3d-TgQq4/ux-cxNxlDwAJ) here has an example you could use for testing. Note that you'd need to test the selected feature on a model, since entities aren't yet supported there (see https://github.com/AnalyticalGraphicsInc/cesium/issues/6705).
process
make bloom post process check for selected feature right now you can apply certain post process effects to just one feature the bloom shader doesn t do any check to allow you to apply it to just to the selected feature as you would expect this is potentially pretty easy you just need to take the check from another post process like the black and white one and apply it in the bloom shader the here has an example you could use for testing note that you d need to test the selected feature on a model since entities aren t yet supported there see
1
31,313
14,930,942,682
IssuesEvent
2021-01-25 04:27:56
iterative/dvc
https://api.github.com/repos/iterative/dvc
opened
-R collects stage from all of the repo
optimize p1-important performance
# Bug Report We are building `repo.graph` and using `path` to search for stages in the graph when `-R` instead of just reading files from the `path` directory. ### Context https://groups.google.com/a/iterative.ai/g/support/c/H_c36GuAsPM/m/6mLNdPIRAgAJ
True
-R collects stage from all of the repo - # Bug Report We are building `repo.graph` and using `path` to search for stages in the graph when `-R` instead of just reading files from the `path` directory. ### Context https://groups.google.com/a/iterative.ai/g/support/c/H_c36GuAsPM/m/6mLNdPIRAgAJ
non_process
r collects stage from all of the repo bug report we are building repo graph and using path to search for stages in the graph when r instead of just reading files from the path directory context
0
13,013
15,369,907,214
IssuesEvent
2021-03-02 08:04:56
prisma/prisma
https://api.github.com/repos/prisma/prisma
closed
prisma migrate not working with basic example `Reason: [libs/sql-schema-describer/src/getters.rs:42:14] called `Result::unwrap()` on an `Err` value: "Getting non_unique from Resultrow ResultRow { columns: [\"index_name\", \"non_unique\", \"column_name\", \"seq_in_index\", \"table_name\"], values: [Text(Some(\"PRIMARY\")), Text(Some(\"0\")), Text(Some(\"id\")), Integer(Some(1)), Text(Some(\"_prisma_migrations\"))] } as bool failed"`
bug/0-needs-info kind/bug process/candidate team/migrations
Hi Prisma Team! Prisma Migrate just crashed. ## Versions | Name | Version | |-------------|--------------------| | Platform | darwin | | Node | v14.15.0 | | Prisma CLI | 2.15.0 | | Binary | e51dc3b5a9ee790a07104bec1c9477d51740fe54| ## Error ``` Error: Error in migration engine. Reason: [libs/sql-schema-describer/src/getters.rs:42:14] called `Result::unwrap()` on an `Err` value: "Getting non_unique from Resultrow ResultRow { columns: [\"index_name\", \"non_unique\", \"column_name\", \"seq_in_index\", \"table_name\"], values: [Text(Some(\"PRIMARY\")), Text(Some(\"0\")), Text(Some(\"id\")), Integer(Some(1)), Text(Some(\"_prisma_migrations\"))] } as bool failed" Please create an issue in the migrate repo with your `schema.prisma` and the prisma command you tried to use 🙏: https://github.com/prisma/prisma/issues/new ```
1.0
prisma migrate not working with basic example `Reason: [libs/sql-schema-describer/src/getters.rs:42:14] called `Result::unwrap()` on an `Err` value: "Getting non_unique from Resultrow ResultRow { columns: [\"index_name\", \"non_unique\", \"column_name\", \"seq_in_index\", \"table_name\"], values: [Text(Some(\"PRIMARY\")), Text(Some(\"0\")), Text(Some(\"id\")), Integer(Some(1)), Text(Some(\"_prisma_migrations\"))] } as bool failed"` - Hi Prisma Team! Prisma Migrate just crashed. ## Versions | Name | Version | |-------------|--------------------| | Platform | darwin | | Node | v14.15.0 | | Prisma CLI | 2.15.0 | | Binary | e51dc3b5a9ee790a07104bec1c9477d51740fe54| ## Error ``` Error: Error in migration engine. Reason: [libs/sql-schema-describer/src/getters.rs:42:14] called `Result::unwrap()` on an `Err` value: "Getting non_unique from Resultrow ResultRow { columns: [\"index_name\", \"non_unique\", \"column_name\", \"seq_in_index\", \"table_name\"], values: [Text(Some(\"PRIMARY\")), Text(Some(\"0\")), Text(Some(\"id\")), Integer(Some(1)), Text(Some(\"_prisma_migrations\"))] } as bool failed" Please create an issue in the migrate repo with your `schema.prisma` and the prisma command you tried to use 🙏: https://github.com/prisma/prisma/issues/new ```
process
prisma migrate not working with basic example reason called result unwrap on an err value getting non unique from resultrow resultrow columns values as bool failed hi prisma team prisma migrate just crashed versions name version platform darwin node prisma cli binary error error error in migration engine reason called result unwrap on an err value getting non unique from resultrow resultrow columns values as bool failed please create an issue in the migrate repo with your schema prisma and the prisma command you tried to use 🙏
1
181,864
21,664,457,514
IssuesEvent
2022-05-07 01:24:52
phunware/react-select
https://api.github.com/repos/phunware/react-select
opened
CVE-2022-29167 (High) detected in hawk-3.1.3.tgz
security vulnerability
## CVE-2022-29167 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hawk-3.1.3.tgz</b></p></summary> <p>HTTP Hawk Authentication Scheme</p> <p>Library home page: <a href="https://registry.npmjs.org/hawk/-/hawk-3.1.3.tgz">https://registry.npmjs.org/hawk/-/hawk-3.1.3.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/hawk/package.json</p> <p> Dependency Hierarchy: - coveralls-2.13.3.tgz (Root Library) - request-2.79.0.tgz - :x: **hawk-3.1.3.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/phunware/react-select/commit/7b7ee4fda1530a8aba251e15f46bd683a40393d8">7b7ee4fda1530a8aba251e15f46bd683a40393d8</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Hawk is an HTTP authentication scheme providing mechanisms for making authenticated HTTP requests with partial cryptographic verification of the request and response, covering the HTTP method, request URI, host, and optionally the request payload. Hawk used a regular expression to parse `Host` HTTP header (`Hawk.utils.parseHost()`), which was subject to regular expression DoS attack - meaning each added character in the attacker's input increases the computation time exponentially. `parseHost()` was patched in `9.0.1` to use built-in `URL` class to parse hostname instead. `Hawk.authenticate()` accepts `options` argument. If that contains `host` and `port`, those would be used instead of a call to `utils.parseHost()`. 
<p>Publish Date: 2022-05-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-29167>CVE-2022-29167</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/mozilla/hawk/security/advisories/GHSA-44pw-h2cw-w3vq">https://github.com/mozilla/hawk/security/advisories/GHSA-44pw-h2cw-w3vq</a></p> <p>Release Date: 2022-05-05</p> <p>Fix Resolution: hawk - 9.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-29167 (High) detected in hawk-3.1.3.tgz - ## CVE-2022-29167 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hawk-3.1.3.tgz</b></p></summary> <p>HTTP Hawk Authentication Scheme</p> <p>Library home page: <a href="https://registry.npmjs.org/hawk/-/hawk-3.1.3.tgz">https://registry.npmjs.org/hawk/-/hawk-3.1.3.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/hawk/package.json</p> <p> Dependency Hierarchy: - coveralls-2.13.3.tgz (Root Library) - request-2.79.0.tgz - :x: **hawk-3.1.3.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/phunware/react-select/commit/7b7ee4fda1530a8aba251e15f46bd683a40393d8">7b7ee4fda1530a8aba251e15f46bd683a40393d8</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Hawk is an HTTP authentication scheme providing mechanisms for making authenticated HTTP requests with partial cryptographic verification of the request and response, covering the HTTP method, request URI, host, and optionally the request payload. Hawk used a regular expression to parse `Host` HTTP header (`Hawk.utils.parseHost()`), which was subject to regular expression DoS attack - meaning each added character in the attacker's input increases the computation time exponentially. `parseHost()` was patched in `9.0.1` to use built-in `URL` class to parse hostname instead. `Hawk.authenticate()` accepts `options` argument. If that contains `host` and `port`, those would be used instead of a call to `utils.parseHost()`. 
<p>Publish Date: 2022-05-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-29167>CVE-2022-29167</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/mozilla/hawk/security/advisories/GHSA-44pw-h2cw-w3vq">https://github.com/mozilla/hawk/security/advisories/GHSA-44pw-h2cw-w3vq</a></p> <p>Release Date: 2022-05-05</p> <p>Fix Resolution: hawk - 9.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in hawk tgz cve high severity vulnerability vulnerable library hawk tgz http hawk authentication scheme library home page a href path to dependency file package json path to vulnerable library node modules hawk package json dependency hierarchy coveralls tgz root library request tgz x hawk tgz vulnerable library found in head commit a href vulnerability details hawk is an http authentication scheme providing mechanisms for making authenticated http requests with partial cryptographic verification of the request and response covering the http method request uri host and optionally the request payload hawk used a regular expression to parse host http header hawk utils parsehost which was subject to regular expression dos attack meaning each added character in the attacker s input increases the computation time exponentially parsehost was patched in to use built in url class to parse hostname instead hawk authenticate accepts options argument if that contains host and port those would be used instead of a call to utils parsehost publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution hawk step up your open source security game with whitesource
0
303,460
26,209,260,106
IssuesEvent
2023-01-04 03:51:34
phetsims/number-play
https://api.github.com/repos/phetsims/number-play
closed
CT cannot read properties of undefined
type:automated-testing
``` number-play : fuzz : built https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000 Query: fuzz&memoryLimit=1000 Uncaught TypeError: Cannot read properties of undefined (reading 'x') TypeError: Cannot read properties of undefined (reading 'x') at hi.distanceSquared (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:868:114701) at hi.distance (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:868:114579) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:868:1672611 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:22645 at vt 
(https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:5375) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:22620 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:21586 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:28028 at Ee (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:21557) at Ue (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:22582) id: Bayes Chrome Snapshot 
from 11/23/2021, 3:23:09 AM ---------------------------------- number-play : multitouch-fuzz : unbuilt https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/number-play_en.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22multitouch-fuzz%22%2C%22unbuilt%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674208434%7D&brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=false Query: brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=false Uncaught TypeError: Cannot read properties of undefined (reading 'x') TypeError: Cannot read properties of undefined (reading 'x') at Vector2.distanceSquared (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:84:35) at Vector2.distance (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:61:31) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/number-play/js/common/model/OnesPlayArea.js:234:54 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3738:18 at arrayMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:660:23) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3737:24 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3554:27 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:4920:15 at baseMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3553:7) at baseOrderBy (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3736:20) id: Bayes Chrome Snapshot from 11/23/2021, 3:23:09 AM ---------------------------------- 
number-play : pan-and-zoom-fuzz : unbuilt https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/number-play_en.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22pan-and-zoom-fuzz%22%2C%22unbuilt%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637665151189%7D&brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=true Query: brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=true Uncaught TypeError: Cannot read properties of undefined (reading 'x') TypeError: Cannot read properties of undefined (reading 'x') at Vector2.distanceSquared (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:84:35) at Vector2.distance (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:61:31) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/number-play/js/common/model/OnesPlayArea.js:234:54 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3738:18 at arrayMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:660:23) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3737:24 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3554:27 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:4920:15 at baseMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3553:7) at baseOrderBy (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3736:20) id: Bayes Chrome Snapshot from 11/23/2021, 3:23:09 AM ```
1.0
CT cannot read properties of undefined - ``` number-play : fuzz : built https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000 Query: fuzz&memoryLimit=1000 Uncaught TypeError: Cannot read properties of undefined (reading 'x') TypeError: Cannot read properties of undefined (reading 'x') at hi.distanceSquared (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:868:114701) at hi.distance (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:868:114579) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:868:1672611 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:22645 at vt 
(https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:5375) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:22620 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:21586 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:28028 at Ee (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:21557) at Ue (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/build/phet/number-play_en_phet.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22fuzz%22%2C%22built%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674628908%7D&fuzz&memoryLimit=1000:855:22582) id: Bayes Chrome Snapshot 
from 11/23/2021, 3:23:09 AM ---------------------------------- number-play : multitouch-fuzz : unbuilt https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/number-play_en.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22multitouch-fuzz%22%2C%22unbuilt%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637674208434%7D&brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=false Query: brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=false Uncaught TypeError: Cannot read properties of undefined (reading 'x') TypeError: Cannot read properties of undefined (reading 'x') at Vector2.distanceSquared (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:84:35) at Vector2.distance (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:61:31) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/number-play/js/common/model/OnesPlayArea.js:234:54 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3738:18 at arrayMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:660:23) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3737:24 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3554:27 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:4920:15 at baseMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3553:7) at baseOrderBy (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3736:20) id: Bayes Chrome Snapshot from 11/23/2021, 3:23:09 AM ---------------------------------- 
number-play : pan-and-zoom-fuzz : unbuilt https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/number-play/number-play_en.html?continuousTest=%7B%22test%22%3A%5B%22number-play%22%2C%22pan-and-zoom-fuzz%22%2C%22unbuilt%22%5D%2C%22snapshotName%22%3A%22snapshot-1637662989114%22%2C%22timestamp%22%3A1637665151189%7D&brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=true Query: brand=phet&ea&fuzz&fuzzPointers=2&memoryLimit=1000&supportsPanAndZoom=true Uncaught TypeError: Cannot read properties of undefined (reading 'x') TypeError: Cannot read properties of undefined (reading 'x') at Vector2.distanceSquared (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:84:35) at Vector2.distance (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/dot/js/Vector2.js:61:31) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/chipper/dist/number-play/js/common/model/OnesPlayArea.js:234:54 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3738:18 at arrayMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:660:23) at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3737:24 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3554:27 at https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:4920:15 at baseMap (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3553:7) at baseOrderBy (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1637662989114/sherpa/lib/lodash-4.17.4.js:3736:20) id: Bayes Chrome Snapshot from 11/23/2021, 3:23:09 AM ```
non_process
ct cannot read properties of undefined number play fuzz built query fuzz memorylimit uncaught typeerror cannot read properties of undefined reading x typeerror cannot read properties of undefined reading x at hi distancesquared at hi distance at at at vt at at at at ee at ue id bayes chrome snapshot from am number play multitouch fuzz unbuilt query brand phet ea fuzz fuzzpointers memorylimit supportspanandzoom false uncaught typeerror cannot read properties of undefined reading x typeerror cannot read properties of undefined reading x at distancesquared at distance at at at arraymap at at at at basemap at baseorderby id bayes chrome snapshot from am number play pan and zoom fuzz unbuilt query brand phet ea fuzz fuzzpointers memorylimit supportspanandzoom true uncaught typeerror cannot read properties of undefined reading x typeerror cannot read properties of undefined reading x at distancesquared at distance at at at arraymap at at at at basemap at baseorderby id bayes chrome snapshot from am
0
19,919
26,380,461,608
IssuesEvent
2023-01-12 08:11:48
zammad/zammad
https://api.github.com/repos/zammad/zammad
closed
Error while processing S/MIME signed emails when the sender name is different than CN
bug verified mail processing smime
### Used Zammad Version 5.4.x (git) ### Environment - Installation method: any - Operating system: MacOS 13.1 - Database + version: PostgreSQL 10.21 - Elasticsearch version: any - Browser + version: any ### Actual behaviour When an S/MIME signed email is being processed, the following error is logged and no ticket is created/updated: ```log "ERROR: Can't process email, you will find it for bug reporting under /opt/zammad/tmp/unprocessable_mail/1f45ea9904b0f58c530a473d76b0b913.eml, please create an issue at https://github.com/zammad/zammad/issues" "ERROR: #<NoMethodError: undefined method `[]' for nil:NilClass>" /opt/zammad/app/models/channel/email_parser.rb:138:in `rescue in process': #<NoMethodError: undefined method `[]' for nil:NilClass> (RuntimeError) /opt/zammad/lib/secure_mailing/smime/incoming.rb:234:in `block in sender_is_signer?' /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `map' /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `sender_is_signer?' /opt/zammad/lib/secure_mailing/smime/incoming.rb:124:in `verify_signature' /opt/zammad/lib/secure_mailing/smime/incoming.rb:23:in `process' /opt/zammad/lib/secure_mailing/backend/handler.rb:6:in `process' /opt/zammad/lib/secure_mailing.rb:8:in `block in incoming' /opt/zammad/lib/secure_mailing.rb:7:in `each' /opt/zammad/lib/secure_mailing.rb:7:in `incoming' /opt/zammad/app/models/channel/filter/secure_mailing.rb:6:in `run' /opt/zammad/app/models/channel/email_parser.rb:162:in `block in _process' /opt/zammad/app/models/channel/email_parser.rb:159:in `each' /opt/zammad/app/models/channel/email_parser.rb:159:in `_process' /opt/zammad/app/models/channel/email_parser.rb:123:in `block in process' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:97:in `block in timeout' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `block in catch' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' 
/Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:112:in `timeout' /opt/zammad/app/models/channel/email_parser.rb:122:in `process' /opt/zammad/app/models/channel/driver/mail_stdin.rb:30:in `initialize' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `new' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `<main>' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `eval' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `perform' /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/command.rb:27:in `run' /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/invocation.rb:127:in `invoke_command' /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor.rb:392:in `dispatch' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command/base.rb:69:in `perform' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command.rb:48:in `invoke' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands.rb:18:in `<main>' /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' bin/rails:5:in `<main>' from /opt/zammad/app/models/channel/email_parser.rb:120:in `process' from /opt/zammad/app/models/channel/driver/mail_stdin.rb:30:in `initialize' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `new' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `<main>' from 
/Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `eval' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/command.rb:27:in `run' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/invocation.rb:127:in `invoke_command' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor.rb:392:in `dispatch' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command/base.rb:69:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command.rb:48:in `invoke' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands.rb:18:in `<main>' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from bin/rails:5:in `<main>' /opt/zammad/lib/secure_mailing/smime/incoming.rb:234:in `block in sender_is_signer?': undefined method `[]' for nil:NilClass (NoMethodError) from /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `map' from /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `sender_is_signer?' 
from /opt/zammad/lib/secure_mailing/smime/incoming.rb:124:in `verify_signature' from /opt/zammad/lib/secure_mailing/smime/incoming.rb:23:in `process' from /opt/zammad/lib/secure_mailing/backend/handler.rb:6:in `process' from /opt/zammad/lib/secure_mailing.rb:8:in `block in incoming' from /opt/zammad/lib/secure_mailing.rb:7:in `each' from /opt/zammad/lib/secure_mailing.rb:7:in `incoming' from /opt/zammad/app/models/channel/filter/secure_mailing.rb:6:in `run' from /opt/zammad/app/models/channel/email_parser.rb:162:in `block in _process' from /opt/zammad/app/models/channel/email_parser.rb:159:in `each' from /opt/zammad/app/models/channel/email_parser.rb:159:in `_process' from /opt/zammad/app/models/channel/email_parser.rb:123:in `block in process' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:97:in `block in timeout' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `block in catch' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:112:in `timeout' from /opt/zammad/app/models/channel/email_parser.rb:122:in `process' from /opt/zammad/app/models/channel/driver/mail_stdin.rb:30:in `initialize' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `new' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `<main>' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `eval' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/command.rb:27:in `run' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/invocation.rb:127:in `invoke_command' from 
/Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor.rb:392:in `dispatch' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command/base.rb:69:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command.rb:48:in `invoke' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands.rb:18:in `<main>' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from bin/rails:5:in `<main>' ``` ### Expected behaviour S/MIME signed emails should be processed without errors and a ticket is created/updated. Any issues with different than expected sender name should be shown above the existing article. ### Steps to reproduce the behaviour 1. Turn on S/MIME integration via **System > Integrations > S/MIME**. 2. In the same screen, click on the **Add Certificate** button. 3. Upload [ca.crt.txt](https://github.com/zammad/zammad/files/10390678/ca.crt.txt) certificate and click on the **Add** button. 4. Import [recipient-sign.eml.txt](https://github.com/zammad/zammad/files/10390679/recipient-sign.eml.txt) email message via Rails console: ```sh cat recipient-sign.eml.txt | rails r Channel::Driver::MailStdin.new ``` 5. Observe the reported error. The issue is reproducible whenever the S/MIME certificate used for signing a message has a **X509v3 Authority Key Identifier** (`authorityKeyIdentifier`) extension that points to a Certificate Authority (CA), and that same CA certificate is part of the Zammad's certificate store. 
An improper search over the Common Name (CN) field of all certificates in the chain assumes it will have an `emailAddress=...` value, but this is not the case for CA certificates: [`lib/secure_mailing/smime/incoming.rb@230`:](https://github.com/zammad/zammad/blob/5fce6548d103f17585b52bba5451561a60d1c19c/lib/secure_mailing/smime/incoming.rb#L231) ```ruby def sender_is_signer? signers = @verify_sign_p7enc.certificates.map do |cert| email = cert.subject.to_s.match(%r{emailAddress=(?<address>[^/]+)}) email[:address] end ``` Instead, a proper search over certificate's `subjectAltName` would be preferred, similar to how it was done for the [`SMIMECertificate` model](https://github.com/zammad/zammad/blob/develop/app/models/smime_certificate.rb#L119). ### Support Ticket _No response_ ### I'm sure this is a bug and no feature request or a general question. yes
1.0
Error while processing S/MIME signed emails when the sender name is different than CN - ### Used Zammad Version 5.4.x (git) ### Environment - Installation method: any - Operating system: MacOS 13.1 - Database + version: PostgreSQL 10.21 - Elasticsearch version: any - Browser + version: any ### Actual behaviour When an S/MIME signed email is being processed, the following error is logged and no ticket is created/updated: ```log "ERROR: Can't process email, you will find it for bug reporting under /opt/zammad/tmp/unprocessable_mail/1f45ea9904b0f58c530a473d76b0b913.eml, please create an issue at https://github.com/zammad/zammad/issues" "ERROR: #<NoMethodError: undefined method `[]' for nil:NilClass>" /opt/zammad/app/models/channel/email_parser.rb:138:in `rescue in process': #<NoMethodError: undefined method `[]' for nil:NilClass> (RuntimeError) /opt/zammad/lib/secure_mailing/smime/incoming.rb:234:in `block in sender_is_signer?' /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `map' /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `sender_is_signer?' 
/opt/zammad/lib/secure_mailing/smime/incoming.rb:124:in `verify_signature' /opt/zammad/lib/secure_mailing/smime/incoming.rb:23:in `process' /opt/zammad/lib/secure_mailing/backend/handler.rb:6:in `process' /opt/zammad/lib/secure_mailing.rb:8:in `block in incoming' /opt/zammad/lib/secure_mailing.rb:7:in `each' /opt/zammad/lib/secure_mailing.rb:7:in `incoming' /opt/zammad/app/models/channel/filter/secure_mailing.rb:6:in `run' /opt/zammad/app/models/channel/email_parser.rb:162:in `block in _process' /opt/zammad/app/models/channel/email_parser.rb:159:in `each' /opt/zammad/app/models/channel/email_parser.rb:159:in `_process' /opt/zammad/app/models/channel/email_parser.rb:123:in `block in process' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:97:in `block in timeout' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `block in catch' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:112:in `timeout' /opt/zammad/app/models/channel/email_parser.rb:122:in `process' /opt/zammad/app/models/channel/driver/mail_stdin.rb:30:in `initialize' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `new' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `<main>' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `eval' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `perform' /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/command.rb:27:in `run' /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/invocation.rb:127:in `invoke_command' /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor.rb:392:in `dispatch' 
/Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command/base.rb:69:in `perform' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command.rb:48:in `invoke' /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands.rb:18:in `<main>' /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' bin/rails:5:in `<main>' from /opt/zammad/app/models/channel/email_parser.rb:120:in `process' from /opt/zammad/app/models/channel/driver/mail_stdin.rb:30:in `initialize' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `new' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `<main>' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `eval' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/command.rb:27:in `run' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/invocation.rb:127:in `invoke_command' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor.rb:392:in `dispatch' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command/base.rb:69:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command.rb:48:in `invoke' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands.rb:18:in `<main>' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from 
bin/rails:5:in `<main>' /opt/zammad/lib/secure_mailing/smime/incoming.rb:234:in `block in sender_is_signer?': undefined method `[]' for nil:NilClass (NoMethodError) from /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `map' from /opt/zammad/lib/secure_mailing/smime/incoming.rb:232:in `sender_is_signer?' from /opt/zammad/lib/secure_mailing/smime/incoming.rb:124:in `verify_signature' from /opt/zammad/lib/secure_mailing/smime/incoming.rb:23:in `process' from /opt/zammad/lib/secure_mailing/backend/handler.rb:6:in `process' from /opt/zammad/lib/secure_mailing.rb:8:in `block in incoming' from /opt/zammad/lib/secure_mailing.rb:7:in `each' from /opt/zammad/lib/secure_mailing.rb:7:in `incoming' from /opt/zammad/app/models/channel/filter/secure_mailing.rb:6:in `run' from /opt/zammad/app/models/channel/email_parser.rb:162:in `block in _process' from /opt/zammad/app/models/channel/email_parser.rb:159:in `each' from /opt/zammad/app/models/channel/email_parser.rb:159:in `_process' from /opt/zammad/app/models/channel/email_parser.rb:123:in `block in process' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:97:in `block in timeout' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `block in catch' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:35:in `catch' from /Users/user/.rvm/rubies/ruby-3.0.4/lib/ruby/3.0.0/timeout.rb:112:in `timeout' from /opt/zammad/app/models/channel/email_parser.rb:122:in `process' from /opt/zammad/app/models/channel/driver/mail_stdin.rb:30:in `initialize' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `new' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `<main>' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `eval' from 
/Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands/runner/runner_command.rb:45:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/command.rb:27:in `run' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor/invocation.rb:127:in `invoke_command' from /Users/user/.rvm/gems/ruby-3.0.4/gems/thor-1.2.1/lib/thor.rb:392:in `dispatch' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command/base.rb:69:in `perform' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/command.rb:48:in `invoke' from /Users/user/.rvm/gems/ruby-3.0.4/gems/railties-6.1.7/lib/rails/commands.rb:18:in `<main>' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from /Users/user/.rvm/gems/ruby-3.0.4/gems/bootsnap-1.15.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in `require' from bin/rails:5:in `<main>' ``` ### Expected behaviour S/MIME signed emails should be processed without errors and a ticket is created/updated. Any issues with different than expected sender name should be shown above the existing article. ### Steps to reproduce the behaviour 1. Turn on S/MIME integration via **System > Integrations > S/MIME**. 2. In the same screen, click on the **Add Certificate** button. 3. Upload [ca.crt.txt](https://github.com/zammad/zammad/files/10390678/ca.crt.txt) certificate and click on the **Add** button. 4. Import [recipient-sign.eml.txt](https://github.com/zammad/zammad/files/10390679/recipient-sign.eml.txt) email message via Rails console: ```sh cat recipient-sign.eml.txt | rails r Channel::Driver::MailStdin.new ``` 5. Observe the reported error. 
The issue is reproducible whenever the S/MIME certificate used for signing a message has a **X509v3 Authority Key Identifier** (`authorityKeyIdentifier`) extension that points to a Certificate Authority (CA), and that same CA certificate is part of the Zammad's certificate store. An improper search over the Common Name (CN) field of all certificates in the chain assumes it will have an `emailAddress=...` value, but this is not the case for CA certificates: [`lib/secure_mailing/smime/incoming.rb@230`:](https://github.com/zammad/zammad/blob/5fce6548d103f17585b52bba5451561a60d1c19c/lib/secure_mailing/smime/incoming.rb#L231) ```ruby def sender_is_signer? signers = @verify_sign_p7enc.certificates.map do |cert| email = cert.subject.to_s.match(%r{emailAddress=(?<address>[^/]+)}) email[:address] end ``` Instead, a proper search over certificate's `subjectAltName` would be preferred, similar to how it was done for the [`SMIMECertificate` model](https://github.com/zammad/zammad/blob/develop/app/models/smime_certificate.rb#L119). ### Support Ticket _No response_ ### I'm sure this is a bug and no feature request or a general question. yes
process
error while processing s mime signed emails when the sender name is different than cn used zammad version x git environment installation method any operating system macos database version postgresql elasticsearch version any browser version any actual behaviour when an s mime signed email is being processed the following error is logged and no ticket is created updated log error can t process email you will find it for bug reporting under opt zammad tmp unprocessable mail eml please create an issue at error opt zammad app models channel email parser rb in rescue in process runtimeerror opt zammad lib secure mailing smime incoming rb in block in sender is signer opt zammad lib secure mailing smime incoming rb in map opt zammad lib secure mailing smime incoming rb in sender is signer opt zammad lib secure mailing smime incoming rb in verify signature opt zammad lib secure mailing smime incoming rb in process opt zammad lib secure mailing backend handler rb in process opt zammad lib secure mailing rb in block in incoming opt zammad lib secure mailing rb in each opt zammad lib secure mailing rb in incoming opt zammad app models channel filter secure mailing rb in run opt zammad app models channel email parser rb in block in process opt zammad app models channel email parser rb in each opt zammad app models channel email parser rb in process opt zammad app models channel email parser rb in block in process users user rvm rubies ruby lib ruby timeout rb in block in timeout users user rvm rubies ruby lib ruby timeout rb in block in catch users user rvm rubies ruby lib ruby timeout rb in catch users user rvm rubies ruby lib ruby timeout rb in catch users user rvm rubies ruby lib ruby timeout rb in timeout opt zammad app models channel email parser rb in process opt zammad app models channel driver mail stdin rb in initialize users user rvm gems ruby gems railties lib rails commands runner runner command rb in new users user rvm gems ruby gems railties lib rails commands 
runner runner command rb in users user rvm gems ruby gems railties lib rails commands runner runner command rb in eval users user rvm gems ruby gems railties lib rails commands runner runner command rb in perform users user rvm gems ruby gems thor lib thor command rb in run users user rvm gems ruby gems thor lib thor invocation rb in invoke command users user rvm gems ruby gems thor lib thor rb in dispatch users user rvm gems ruby gems railties lib rails command base rb in perform users user rvm gems ruby gems railties lib rails command rb in invoke users user rvm gems ruby gems railties lib rails commands rb in users user rvm gems ruby gems bootsnap lib bootsnap load path cache core ext kernel require rb in require users user rvm gems ruby gems bootsnap lib bootsnap load path cache core ext kernel require rb in require bin rails in from opt zammad app models channel email parser rb in process from opt zammad app models channel driver mail stdin rb in initialize from users user rvm gems ruby gems railties lib rails commands runner runner command rb in new from users user rvm gems ruby gems railties lib rails commands runner runner command rb in from users user rvm gems ruby gems railties lib rails commands runner runner command rb in eval from users user rvm gems ruby gems railties lib rails commands runner runner command rb in perform from users user rvm gems ruby gems thor lib thor command rb in run from users user rvm gems ruby gems thor lib thor invocation rb in invoke command from users user rvm gems ruby gems thor lib thor rb in dispatch from users user rvm gems ruby gems railties lib rails command base rb in perform from users user rvm gems ruby gems railties lib rails command rb in invoke from users user rvm gems ruby gems railties lib rails commands rb in from users user rvm gems ruby gems bootsnap lib bootsnap load path cache core ext kernel require rb in require from users user rvm gems ruby gems bootsnap lib bootsnap load path cache core ext kernel 
require rb in require from bin rails in opt zammad lib secure mailing smime incoming rb in block in sender is signer undefined method for nil nilclass nomethoderror from opt zammad lib secure mailing smime incoming rb in map from opt zammad lib secure mailing smime incoming rb in sender is signer from opt zammad lib secure mailing smime incoming rb in verify signature from opt zammad lib secure mailing smime incoming rb in process from opt zammad lib secure mailing backend handler rb in process from opt zammad lib secure mailing rb in block in incoming from opt zammad lib secure mailing rb in each from opt zammad lib secure mailing rb in incoming from opt zammad app models channel filter secure mailing rb in run from opt zammad app models channel email parser rb in block in process from opt zammad app models channel email parser rb in each from opt zammad app models channel email parser rb in process from opt zammad app models channel email parser rb in block in process from users user rvm rubies ruby lib ruby timeout rb in block in timeout from users user rvm rubies ruby lib ruby timeout rb in block in catch from users user rvm rubies ruby lib ruby timeout rb in catch from users user rvm rubies ruby lib ruby timeout rb in catch from users user rvm rubies ruby lib ruby timeout rb in timeout from opt zammad app models channel email parser rb in process from opt zammad app models channel driver mail stdin rb in initialize from users user rvm gems ruby gems railties lib rails commands runner runner command rb in new from users user rvm gems ruby gems railties lib rails commands runner runner command rb in from users user rvm gems ruby gems railties lib rails commands runner runner command rb in eval from users user rvm gems ruby gems railties lib rails commands runner runner command rb in perform from users user rvm gems ruby gems thor lib thor command rb in run from users user rvm gems ruby gems thor lib thor invocation rb in invoke command from users user rvm gems 
ruby gems thor lib thor rb in dispatch from users user rvm gems ruby gems railties lib rails command base rb in perform from users user rvm gems ruby gems railties lib rails command rb in invoke from users user rvm gems ruby gems railties lib rails commands rb in from users user rvm gems ruby gems bootsnap lib bootsnap load path cache core ext kernel require rb in require from users user rvm gems ruby gems bootsnap lib bootsnap load path cache core ext kernel require rb in require from bin rails in expected behaviour s mime signed emails should be processed without errors and a ticket is created updated any issues with different than expected sender name should be shown above the existing article steps to reproduce the behaviour turn on s mime integration via system integrations s mime in the same screen click on the add certificate button upload certificate and click on the add button import email message via rails console sh cat recipient sign eml txt rails r channel driver mailstdin new observe the reported error the issue is reproducible whenever the s mime certificate used for signing a message has a authority key identifier authoritykeyidentifier extension that points to a certificate authority ca and that same ca certificate is part of the zammad s certificate store an improper search over the common name cn field of all certificates in the chain assumes it will have an emailaddress value but this is not the case for ca certificates ruby def sender is signer signers verify sign certificates map do cert email cert subject to s match r emailaddress email end instead a proper search over certificate s subjectaltname would be preferred similar to how it was done for the support ticket no response i m sure this is a bug and no feature request or a general question yes
1
279
6,001,197,266
IssuesEvent
2017-06-05 08:25:58
datacite/datacite
https://api.github.com/repos/datacite/datacite
opened
Announce planned outages
data center member reliability
As a data center manager (or member), I want to know when there is planned outages in order to alert my users so they don't get mad at me.
True
Announce planned outages - As a data center manager (or member), I want to know when there is planned outages in order to alert my users so they don't get mad at me.
non_process
announce planned outages as a data center manager or member i want to know when there is planned outages in order to alert my users so they don t get mad at me
0
224,229
24,769,725,982
IssuesEvent
2022-10-23 01:17:07
ncorejava/moment
https://api.github.com/repos/ncorejava/moment
opened
CVE-2022-37598 (High) detected in uglify-js-3.13.0.tgz
security vulnerability
## CVE-2022-37598 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>uglify-js-3.13.0.tgz</b></p></summary> <p>JavaScript parser, mangler/compressor and beautifier toolkit</p> <p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-3.13.0.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-3.13.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/uglify-js/package.json</p> <p> Dependency Hierarchy: - :x: **uglify-js-3.13.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Prototype pollution vulnerability in function DEFNODE in ast.js in mishoo UglifyJS 3.13.2 via the name variable in ast.js. <p>Publish Date: 2022-10-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37598>CVE-2022-37598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-10-20</p> <p>Fix Resolution: 3.13.10</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-37598 (High) detected in uglify-js-3.13.0.tgz - ## CVE-2022-37598 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>uglify-js-3.13.0.tgz</b></p></summary> <p>JavaScript parser, mangler/compressor and beautifier toolkit</p> <p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-3.13.0.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-3.13.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/uglify-js/package.json</p> <p> Dependency Hierarchy: - :x: **uglify-js-3.13.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Prototype pollution vulnerability in function DEFNODE in ast.js in mishoo UglifyJS 3.13.2 via the name variable in ast.js. <p>Publish Date: 2022-10-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37598>CVE-2022-37598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-10-20</p> <p>Fix Resolution: 3.13.10</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in uglify js tgz cve high severity vulnerability vulnerable library uglify js tgz javascript parser mangler compressor and beautifier toolkit library home page a href path to dependency file package json path to vulnerable library node modules uglify js package json dependency hierarchy x uglify js tgz vulnerable library found in base branch master vulnerability details prototype pollution vulnerability in function defnode in ast js in mishoo uglifyjs via the name variable in ast js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution step up your open source security game with mend
0
521,247
15,106,256,336
IssuesEvent
2021-02-08 14:05:29
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
www.reddit.com - site is not usable
browser-fenix engine-gecko priority-critical
<!-- @browser: Firefox Mobile 87.0 --> <!-- @ua_header: Mozilla/5.0 (Android 10; Mobile; rv:87.0) Gecko/87.0 Firefox/87.0 --> <!-- @reported_with: android-components-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/66763 --> <!-- @extra_labels: browser-fenix --> **URL**: https://www.reddit.com/r/MakeupAddiction/submit **Browser / Version**: Firefox Mobile 87.0 **Operating System**: Android 10 **Tested Another Browser**: Yes Chrome **Problem type**: Site is not usable **Description**: Buttons or links not working **Steps to Reproduce**: Can't upload images to reddit. Doesn't work on chrome either. Only happens on mobile. <details> <summary>View the screenshot</summary> <img alt="Screenshot" src="https://webcompat.com/uploads/2021/2/d63a67bc-0c2d-4af6-a695-6efb56be5fdf.jpeg"> </details> <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20210203093146</li><li>channel: nightly</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2021/2/377c4d32-9139-4325-bd89-f24b900c9c5e) _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
www.reddit.com - site is not usable - <!-- @browser: Firefox Mobile 87.0 --> <!-- @ua_header: Mozilla/5.0 (Android 10; Mobile; rv:87.0) Gecko/87.0 Firefox/87.0 --> <!-- @reported_with: android-components-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/66763 --> <!-- @extra_labels: browser-fenix --> **URL**: https://www.reddit.com/r/MakeupAddiction/submit **Browser / Version**: Firefox Mobile 87.0 **Operating System**: Android 10 **Tested Another Browser**: Yes Chrome **Problem type**: Site is not usable **Description**: Buttons or links not working **Steps to Reproduce**: Can't upload images to reddit. Doesn't work on chrome either. Only happens on mobile. <details> <summary>View the screenshot</summary> <img alt="Screenshot" src="https://webcompat.com/uploads/2021/2/d63a67bc-0c2d-4af6-a695-6efb56be5fdf.jpeg"> </details> <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20210203093146</li><li>channel: nightly</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2021/2/377c4d32-9139-4325-bd89-f24b900c9c5e) _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
site is not usable url browser version firefox mobile operating system android tested another browser yes chrome problem type site is not usable description buttons or links not working steps to reproduce can t upload images to reddit doesn t work on chrome either only happens on mobile view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel nightly hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️
0
20,626
27,298,796,496
IssuesEvent
2023-02-23 23:02:10
TUM-Dev/NavigaTUM
https://api.github.com/repos/TUM-Dev/NavigaTUM
closed
[Entry] [2930.EG.001]: Koordinate bearbeiten
entry webform delete-after-processing
Hallo, ich möchte diese Koordinate zum Roomfinder hinzufügen: ```yaml "2930.EG.001": { lat: 48.88517446990414, lon: 12.571829694610074 } ```
1.0
[Entry] [2930.EG.001]: Koordinate bearbeiten - Hallo, ich möchte diese Koordinate zum Roomfinder hinzufügen: ```yaml "2930.EG.001": { lat: 48.88517446990414, lon: 12.571829694610074 } ```
process
koordinate bearbeiten hallo ich möchte diese koordinate zum roomfinder hinzufügen yaml eg lat lon
1
16,905
22,217,491,812
IssuesEvent
2022-06-08 04:17:18
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Bazel traverses generated symlink and fails to build
more data needed type: support / not a bug (process) team-ExternalDeps
<!-- ATTENTION! Please read and follow: - if this is a _question_ about how to build / test / query / deploy using Bazel, or a _discussion starter_, send it to bazel-discuss@googlegroups.com - if this is a _bug_ or _feature request_, fill the form below as best as you can. --> ### Description of the problem / feature request: Occasionally when I try to build with bazel I get the error ``` [kbatra@dev-1 test-repo] bazel build -c opt //... INFO: Scaled resource values: 128 cores, 131072 MB Starting local Bazel server and connecting to it... ERROR: error loading package 'bazel-test-repo/external/bazel_tools/src/main/protobuf': Label '//third_party/grpc/bazel:cc_grpc_library.bzl' is invalid because 'third_party/grpc/bazel' is not a package; perhaps you meant to put the colon here: '//:third_party/grpc/bazel/cc_grpc_library.bzl'? INFO: Elapsed time: 131.090s INFO: 0 processes. FAILED: Build did NOT complete successfully (125 packages loaded) currently loading: bazel-test-repo/external/bazel_tools/src/main/native ... (5 packages) ``` Bazel seems to enter into the symlink that it creates (`bazel-test-repo`) and tries to build the files in `bazel-test-repo/external/bazel_tools/src/main/protobuf`. When I add `bazel-test-repo` to the `.bazelignore` the issue seems to go away. ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. The issue is nondeterministic, but I can get this issue by building at the root of my project with `bazel build //...` ### What operating system are you running Bazel on? CentOS Linux release 7.7.1908 (Core) ### What's the output of `bazel info release`? release 4.1.0
1.0
Bazel traverses generated symlink and fails to build - <!-- ATTENTION! Please read and follow: - if this is a _question_ about how to build / test / query / deploy using Bazel, or a _discussion starter_, send it to bazel-discuss@googlegroups.com - if this is a _bug_ or _feature request_, fill the form below as best as you can. --> ### Description of the problem / feature request: Occasionally when I try to build with bazel I get the error ``` [kbatra@dev-1 test-repo] bazel build -c opt //... INFO: Scaled resource values: 128 cores, 131072 MB Starting local Bazel server and connecting to it... ERROR: error loading package 'bazel-test-repo/external/bazel_tools/src/main/protobuf': Label '//third_party/grpc/bazel:cc_grpc_library.bzl' is invalid because 'third_party/grpc/bazel' is not a package; perhaps you meant to put the colon here: '//:third_party/grpc/bazel/cc_grpc_library.bzl'? INFO: Elapsed time: 131.090s INFO: 0 processes. FAILED: Build did NOT complete successfully (125 packages loaded) currently loading: bazel-test-repo/external/bazel_tools/src/main/native ... (5 packages) ``` Bazel seems to enter into the symlink that it creates (`bazel-test-repo`) and tries to build the files in `bazel-test-repo/external/bazel_tools/src/main/protobuf`. When I add `bazel-test-repo` to the `.bazelignore` the issue seems to go away. ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. The issue is nondeterministic, but I can get this issue by building at the root of my project with `bazel build //...` ### What operating system are you running Bazel on? CentOS Linux release 7.7.1908 (Core) ### What's the output of `bazel info release`? release 4.1.0
process
bazel traverses generated symlink and fails to build attention please read and follow if this is a question about how to build test query deploy using bazel or a discussion starter send it to bazel discuss googlegroups com if this is a bug or feature request fill the form below as best as you can description of the problem feature request occasionally when i try to build with bazel i get the error bazel build c opt info scaled resource values cores mb starting local bazel server and connecting to it error error loading package bazel test repo external bazel tools src main protobuf label third party grpc bazel cc grpc library bzl is invalid because third party grpc bazel is not a package perhaps you meant to put the colon here third party grpc bazel cc grpc library bzl info elapsed time info processes failed build did not complete successfully packages loaded currently loading bazel test repo external bazel tools src main native packages bazel seems to enter into the symlink that it creates bazel test repo and tries to build the files in bazel test repo external bazel tools src main protobuf when i add bazel test repo to the bazelignore the issue seems to go away bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible the issue is nondeterministic but i can get this issue by building at the root of my project with bazel build what operating system are you running bazel on centos linux release core what s the output of bazel info release release
1
59,711
24,853,876,407
IssuesEvent
2022-10-26 23:06:21
microsoft/vscode-cpptools
https://api.github.com/repos/microsoft/vscode-cpptools
closed
Default C_Cpp.codeAnalysis.clangTidy.headerFilter doesn't work on Windows in 1.13.2
bug Language Service fixed (release pending) quick fix regression insiders
There's a regression causing the headerFilter to be capitalized.
1.0
Default C_Cpp.codeAnalysis.clangTidy.headerFilter doesn't work on Windows in 1.13.2 - There's a regression causing the headerFilter to be capitalized.
non_process
default c cpp codeanalysis clangtidy headerfilter doesn t work on windows in there s a regression causing the headerfilter to be capitalized
0
49,018
10,314,929,752
IssuesEvent
2019-08-30 05:51:39
Mtaethefarmer/My-Interactive-Story
https://api.github.com/repos/Mtaethefarmer/My-Interactive-Story
closed
As a player I should be able to adjust settings of the camera to match my desired motion senstivity
code design
- [x] Expose player variables to the options menu ![image](https://app.gitkraken.com/api/glo/boards/5d4b6ff111fc31000f9624ea/attachments/5d68b7830b4324000f1bbd84)
1.0
As a player I should be able to adjust settings of the camera to match my desired motion senstivity - - [x] Expose player variables to the options menu ![image](https://app.gitkraken.com/api/glo/boards/5d4b6ff111fc31000f9624ea/attachments/5d68b7830b4324000f1bbd84)
non_process
as a player i should be able to adjust settings of the camera to match my desired motion senstivity expose player variables to the options menu
0
111,771
17,033,496,594
IssuesEvent
2021-07-05 01:26:04
attesch/hackazon
https://api.github.com/repos/attesch/hackazon
opened
CVE-2019-11358 (Medium) detected in jquery-1.9.1.min.js
security vulnerability
## CVE-2019-11358 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.9.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.min.js</a></p> <p>Path to vulnerable library: hackazon/vendor/phpunit/php-code-coverage/PHP/CodeCoverage/Report/HTML/Renderer/Template/js/jquery.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.9.1.min.js** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype. <p>Publish Date: 2019-04-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358>CVE-2019-11358</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p> <p>Release Date: 2019-04-20</p> <p>Fix Resolution: 3.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-11358 (Medium) detected in jquery-1.9.1.min.js - ## CVE-2019-11358 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.9.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.min.js</a></p> <p>Path to vulnerable library: hackazon/vendor/phpunit/php-code-coverage/PHP/CodeCoverage/Report/HTML/Renderer/Template/js/jquery.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.9.1.min.js** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype. <p>Publish Date: 2019-04-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358>CVE-2019-11358</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p> <p>Release Date: 2019-04-20</p> <p>Fix Resolution: 3.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to vulnerable library hackazon vendor phpunit php code coverage php codecoverage report html renderer template js jquery min js dependency hierarchy x jquery min js vulnerable library vulnerability details jquery before as used in drupal backdrop cms and other products mishandles jquery extend true because of object prototype pollution if an unsanitized source object contained an enumerable proto property it could extend the native object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
51,968
12,831,704,885
IssuesEvent
2020-07-07 06:06:16
jorgicio/jorgicio-gentoo-overlay
https://api.github.com/repos/jorgicio/jorgicio-gentoo-overlay
closed
x11-misc/optimus-manager: can't work in Gentoo
ebuild fail missing file
I'm using lightdm and i3. optimus-manager can't switch gpu. ``` ➜ optimus-manager --status ERROR: a GPU setup was initiated but Xorg post-start hook did not run. Log at /var/log/optimus-manager/switch/switch-20200624T234730.log If your login manager is GDM, make sure to follow those instructions: https://github.com/Askannz/optimus-manager#important--gnome-and-gdm-users If your display manager is neither GDM, SDDM nor LightDM, or if you don't use one, read the wiki: https://github.com/Askannz/optimus-manager/wiki/FAQ,-common-issues,-troubleshooting Cannot execute command because of previous errors. ``` And there is something wrong with `/etc/lightdm.conf.d/20-optimus-manager.conf`: ``` ➜ ag display-setup-script= /etc/lightdm.conf.d /etc/lightdm.conf.d/20-optimus-manager.conf 4:display-setup-script=/sbin/prime-offload ``` prime-offload is in `/usr/bin/prime-offload`, instead of `/sbin/prime-offload`. But even after set the right path in `/etc/lightdm.conf.d/20-optimus-manager.conf`, the error still same with before.
1.0
x11-misc/optimus-manager: can't work in Gentoo - I'm using lightdm and i3. optimus-manager can't switch gpu. ``` ➜ optimus-manager --status ERROR: a GPU setup was initiated but Xorg post-start hook did not run. Log at /var/log/optimus-manager/switch/switch-20200624T234730.log If your login manager is GDM, make sure to follow those instructions: https://github.com/Askannz/optimus-manager#important--gnome-and-gdm-users If your display manager is neither GDM, SDDM nor LightDM, or if you don't use one, read the wiki: https://github.com/Askannz/optimus-manager/wiki/FAQ,-common-issues,-troubleshooting Cannot execute command because of previous errors. ``` And there is something wrong with `/etc/lightdm.conf.d/20-optimus-manager.conf`: ``` ➜ ag display-setup-script= /etc/lightdm.conf.d /etc/lightdm.conf.d/20-optimus-manager.conf 4:display-setup-script=/sbin/prime-offload ``` prime-offload is in `/usr/bin/prime-offload`, instead of `/sbin/prime-offload`. But even after set the right path in `/etc/lightdm.conf.d/20-optimus-manager.conf`, the error still same with before.
non_process
misc optimus manager can t work in gentoo i m using lightdm and optimus manager can t switch gpu ➜ optimus manager status error a gpu setup was initiated but xorg post start hook did not run log at var log optimus manager switch switch log if your login manager is gdm make sure to follow those instructions if your display manager is neither gdm sddm nor lightdm or if you don t use one read the wiki cannot execute command because of previous errors and there is something wrong with etc lightdm conf d optimus manager conf ➜ ag display setup script etc lightdm conf d etc lightdm conf d optimus manager conf display setup script sbin prime offload prime offload is in usr bin prime offload instead of sbin prime offload but even after set the right path in etc lightdm conf d optimus manager conf the error still same with before
0
232,167
25,565,380,569
IssuesEvent
2022-11-30 13:57:23
hygieia/hygieia-whitesource-collector
https://api.github.com/repos/hygieia/hygieia-whitesource-collector
closed
CVE-2020-36188 (High) detected in jackson-databind-2.8.11.3.jar - autoclosed
wontfix security vulnerability
## CVE-2020-36188 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.11.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.11.3/jackson-databind-2.8.11.3.jar</p> <p> Dependency Hierarchy: - core-3.15.42.jar (Root Library) - spring-boot-starter-web-1.5.22.RELEASE.jar - :x: **jackson-databind-2.8.11.3.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/hygieia/hygieia-whitesource-collector/commit/4b5ed1d2f3030d721692ff4f980e8d2467fde19b">4b5ed1d2f3030d721692ff4f980e8d2467fde19b</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.JNDIConnectionSource. 
<p>Publish Date: 2021-01-06 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-36188>CVE-2020-36188</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2021-01-06</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-36188 (High) detected in jackson-databind-2.8.11.3.jar - autoclosed - ## CVE-2020-36188 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.11.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.11.3/jackson-databind-2.8.11.3.jar</p> <p> Dependency Hierarchy: - core-3.15.42.jar (Root Library) - spring-boot-starter-web-1.5.22.RELEASE.jar - :x: **jackson-databind-2.8.11.3.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/hygieia/hygieia-whitesource-collector/commit/4b5ed1d2f3030d721692ff4f980e8d2467fde19b">4b5ed1d2f3030d721692ff4f980e8d2467fde19b</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.JNDIConnectionSource. 
<p>Publish Date: 2021-01-06 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-36188>CVE-2020-36188</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2021-01-06</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy core jar root library spring boot starter web release jar x jackson databind jar vulnerable library found in head commit a href found in base branch main vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com newrelic agent deps ch qos logback core db jndiconnectionsource publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with mend
0
14,042
16,849,534,958
IssuesEvent
2021-06-20 08:00:36
log2timeline/plaso
https://api.github.com/repos/log2timeline/plaso
closed
Change preprocessor to handle localized time zone on Windows
enhancement preprocessing
Preprocessor currently unable to handle localized time zones for Windows ``` [INFO] [PreProcess] Set attribute: time_zone_str to West-Europa (standaardtijd) [INFO] Parser filter expression changed to: winxp [INFO] Setting timezone to: West-Europa (standaardtijd) [WARNING] Unable to automatically configure timezone falling back to preferred timezone value: UTC ``` * [x] Preprocess available time zones on Windows https://github.com/log2timeline/plaso/pull/2672 * Determine localized time zone names * [x] Parse time zone offset from TZI value https://github.com/log2timeline/plaso/pull/3734 * [x] ~~Parse DST shift date and times from TZI value~~ - appears to not be necessary * [x] Map system time zone name to (localized) time zone - https://github.com/log2timeline/plaso/pull/3751 * for Windows XP looks like name is in Std value not in MUI_Std value, and can be mapped to the Windows Registry key name as the "normalized name" Also make changes to support MUI form e.g. in LNK description field: @shell32.dll,-34577
1.0
Change preprocessor to handle localized time zone on Windows - Preprocessor currently unable to handle localized time zones for Windows ``` [INFO] [PreProcess] Set attribute: time_zone_str to West-Europa (standaardtijd) [INFO] Parser filter expression changed to: winxp [INFO] Setting timezone to: West-Europa (standaardtijd) [WARNING] Unable to automatically configure timezone falling back to preferred timezone value: UTC ``` * [x] Preprocess available time zones on Windows https://github.com/log2timeline/plaso/pull/2672 * Determine localized time zone names * [x] Parse time zone offset from TZI value https://github.com/log2timeline/plaso/pull/3734 * [x] ~~Parse DST shift date and times from TZI value~~ - appears to not be necessary * [x] Map system time zone name to (localized) time zone - https://github.com/log2timeline/plaso/pull/3751 * for Windows XP looks like name is in Std value not in MUI_Std value, and can be mapped to the Windows Registry key name as the "normalized name" Also make changes to support MUI form e.g. in LNK description field: @shell32.dll,-34577
process
change preprocessor to handle localized time zone on windows preprocessor currently unable to handle localized time zones for windows set attribute time zone str to west europa standaardtijd parser filter expression changed to winxp setting timezone to west europa standaardtijd unable to automatically configure timezone falling back to preferred timezone value utc preprocess available time zones on windows determine localized time zone names parse time zone offset from tzi value parse dst shift date and times from tzi value appears to not be necessary map system time zone name to localized time zone for windows xp looks like name is in std value not in mui std value and can be mapped to the windows registry key name as the normalized name also make changes to support mui form e g in lnk description field dll
1
174,738
14,497,570,786
IssuesEvent
2020-12-11 14:24:34
oncleben31/cookiecutter-homeassistant-custom-component
https://api.github.com/repos/oncleben31/cookiecutter-homeassistant-custom-component
closed
Synchronize with latest changes in blueprint.
documentation enhancement
2020.11.16 is synchronized with blueprint commit from Oct 2, 2020 Add a way in the documentation to show with which version the cookiecutter is synchronized.
1.0
Synchronize with latest changes in blueprint. - 2020.11.16 is synchronized with blueprint commit from Oct 2, 2020 Add a way in the documentation to show with which version the cookiecutter is synchronized.
non_process
synchronize with latest changes in blueprint is synchronized with blueprint commit from oct add a way in the documentation to show with which version the cookiecutter is synchronized
0
12,599
14,996,574,167
IssuesEvent
2021-01-29 15:46:07
ORNL-AMO/AMO-Tools-Desktop
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
closed
Several PH Calcs Fuel Use
Calculator Process Heating bug
I think the Fuel Use calculation is a problem for other calculators too. (not the gross loss, that seems to be working) This seems to be related to multiple losses. I'm guessing it is not using the right operating hours for the second loss? Is a problem for: Wall, Opening, Gas Leak, Atmo Charge Materials is okay and it is not applicable to Flue Gas
1.0
Several PH Calcs Fuel Use - I think the Fuel Use calculation is a problem for other calculators too. (not the gross loss, that seems to be working) This seems to be related to multiple losses. I'm guessing it is not using the right operating hours for the second loss? Is a problem for: Wall, Opening, Gas Leak, Atmo Charge Materials is okay and it is not applicable to Flue Gas
process
several ph calcs fuel use i think the fuel use calculation is a problem for other calculators too not the gross loss that seems to be working this seems to be related to multiple losses i m guessing it is not using the right operating hours for the second loss is a problem for wall opening gas leak atmo charge materials is okay and it is not applicable to flue gas
1
5,495
8,362,898,528
IssuesEvent
2018-10-03 18:10:26
cityofaustin/techstack
https://api.github.com/repos/cityofaustin/techstack
closed
Process Page Content
Content type: Process Page Department: Animal Center Department: EMS Foster Application Team: Content
Finalize Process Page Content that I believe is close to done so it can be added to the Alpha site. (still being added via yaml files). Let me know if you need the structure of the Process data model to do this. - [ ] Foster Content - [ ] EMS Content
1.0
Process Page Content - Finalize Process Page Content that I believe is close to done so it can be added to the Alpha site. (still being added via yaml files). Let me know if you need the structure of the Process data model to do this. - [ ] Foster Content - [ ] EMS Content
process
process page content finalize process page content that i believe is close to done so it can be added to the alpha site still being added via yaml files let me know if you need the structure of the process data model to do this foster content ems content
1
658,649
21,899,264,527
IssuesEvent
2022-05-20 11:49:41
mozilla/addons-frontend
https://api.github.com/repos/mozilla/addons-frontend
closed
Addon detail and /blocked-addon/ pages return a 404 when accessed with a `guid` in format `{6eeb4879-...}`
priority: p2 type: regression type: prod_bug
### Describe the problem and steps to reproduce it: Open the following detail page - `https://addons.allizom.org/en-US/firefox/{6eeb4879-9c55-4c50-bc21-ad9af693e6ae}/` ### What happened? A 404 is received ### What did you expect to happen? The page should redirect to this addon - https://addons.allizom.org/en-US/firefox/addon/release-11-05/ ### Anything else we should know? This issue also affects the /blocked-addon/ pages, where requests are using the `guid` by default, so there is no workaround there: `https://addons.allizom.org/en-US/firefox/blocked-addon/{f95e62ba-170b-444c-81ca-46520cad582e}/`; here is the link to the block page from [admin](https://addons-internal.stage.mozaws.net/en-US/admin/models/blocklist/block/343/change/) where the add-on is fully blocked. Also, here is another example to show that block pages are opened when the `guid` doesn't use the format causing the issues: https://addons.allizom.org/en-US/firefox/blocked-addon/adio-guid@1user.com/ This issue reproduces in all AMO environments.
1.0
Addon detail and /blocked-addon/ pages return a 404 when accessed with a `guid` in format `{6eeb4879-...}` - ### Describe the problem and steps to reproduce it: Open the following detail page - `https://addons.allizom.org/en-US/firefox/{6eeb4879-9c55-4c50-bc21-ad9af693e6ae}/` ### What happened? A 404 is received ### What did you expect to happen? The page should redirect to this addon - https://addons.allizom.org/en-US/firefox/addon/release-11-05/ ### Anything else we should know? This issue also affects the /blocked-addon/ pages, where requests are using the `guid` by default, so there is no workaround there: `https://addons.allizom.org/en-US/firefox/blocked-addon/{f95e62ba-170b-444c-81ca-46520cad582e}/`; here is the link to the block page from [admin](https://addons-internal.stage.mozaws.net/en-US/admin/models/blocklist/block/343/change/) where the add-on is fully blocked. Also, here is another example to show that block pages are opened when the `guid` doesn't use the format causing the issues: https://addons.allizom.org/en-US/firefox/blocked-addon/adio-guid@1user.com/ This issue reproduces in all AMO environments.
non_process
addon detail and blocked addon pages return a when accessed with a guid in format describe the problem and steps to reproduce it open the following detail page what happened a is received what did you expect to happen the page should redirect to this addon anything else we should know this issue also affects the blocked addon pages where requests are using the guid by default so there is no workaround there here is the link to the block page from where the add on is fully blocked also here is another example to show that block pages are opened when the guid doesn t use the format causing the issues this issue reproduces in all amo environments
0
484,184
13,935,789,353
IssuesEvent
2020-10-22 12:03:29
StargateMC/IssueTracker
https://api.github.com/repos/StargateMC/IssueTracker
closed
Improvements to NPC Spawn logic
1.12.2 Coding High Priority feature
I'd suggest the following changes to spawn logic: - [ ] Only allowing NPCs to spawn within 25 blocks of sea level on a world, and only when there is nothing above them. This will prevent NPCs spawning inside buildings, under trees or in caves and provide guarrranteed safe areas - eg: high in the mountains or underground Vote with a 👍 or 👎 emoji. Comment below for more suggestions
1.0
Improvements to NPC Spawn logic - I'd suggest the following changes to spawn logic: - [ ] Only allowing NPCs to spawn within 25 blocks of sea level on a world, and only when there is nothing above them. This will prevent NPCs spawning inside buildings, under trees or in caves and provide guarrranteed safe areas - eg: high in the mountains or underground Vote with a 👍 or 👎 emoji. Comment below for more suggestions
non_process
improvements to npc spawn logic i d suggest the following changes to spawn logic only allowing npcs to spawn within blocks of sea level on a world and only when there is nothing above them this will prevent npcs spawning inside buildings under trees or in caves and provide guarrranteed safe areas eg high in the mountains or underground vote with a 👍 or 👎 emoji comment below for more suggestions
0
22,183
30,733,295,630
IssuesEvent
2023-07-28 04:59:54
hashgraph/hedera-json-rpc-relay
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
closed
Release 0.28
enhancement P1 process
### Problem 0.28.0 features are not yet deployed ### Solution manual release process - [x] Review and update dependencies - [x] @hashgraph/hedera-local - [x] @hashgraph/sdk - [x] server/tests/acceptance/index.spec.ts - [x] Create a release/0.28 branch off of main. Ensure github test actions run. Merge Against `release/0.28` branch - [x] Bump Snapshot to `0.29.0-SNAPSHOT` branch off of main. Merge against Main. - [x] Tag as ```v0.28.0-rc1``` - [x] git tag - [x] Confirm new docker image version is deployed - [x] `v0.28.0-rc2` - [x] Tag Image - [x] Confirm docker image version is deployed - [x] Previewnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - [x] Testnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - [x] Let Bake - [x] Tag as ```v0.28.0``` - [x] git tag - [x] Confirm new docker image version is deployed - [x] Write up release notes and changelist - [x] Mainnet Testing - [x] Deploy tagged version - [x] Manual testing Any bugs or missed features found should see a new ticket opened, addressed in main and cherry-picked to ```release/0.28``` with a new rc version tagged and docker image deployed ### Alternatives _No response_
1.0
Release 0.28 - ### Problem 0.28.0 features are not yet deployed ### Solution manual release process - [x] Review and update dependencies - [x] @hashgraph/hedera-local - [x] @hashgraph/sdk - [x] server/tests/acceptance/index.spec.ts - [x] Create a release/0.28 branch off of main. Ensure github test actions run. Merge Against `release/0.28` branch - [x] Bump Snapshot to `0.29.0-SNAPSHOT` branch off of main. Merge against Main. - [x] Tag as ```v0.28.0-rc1``` - [x] git tag - [x] Confirm new docker image version is deployed - [x] `v0.28.0-rc2` - [x] Tag Image - [x] Confirm docker image version is deployed - [x] Previewnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - [x] Testnet Testing - [x] Deploy tagged version - [x] Manual testing - [x] Run newman tests - [x] Run Dapp Example Bootstrap and manual tests - [x] Run acceptance tests - [x] Run performance tests - [x] Let Bake - [x] Tag as ```v0.28.0``` - [x] git tag - [x] Confirm new docker image version is deployed - [x] Write up release notes and changelist - [x] Mainnet Testing - [x] Deploy tagged version - [x] Manual testing Any bugs or missed features found should see a new ticket opened, addressed in main and cherry-picked to ```release/0.28``` with a new rc version tagged and docker image deployed ### Alternatives _No response_
process
release problem features are not yet deployed solution manual release process review and update dependencies hashgraph hedera local hashgraph sdk server tests acceptance index spec ts create a release branch off of main ensure github test actions run merge against release branch bump snapshot to snapshot branch off of main merge against main tag as git tag confirm new docker image version is deployed tag image confirm docker image version is deployed previewnet testing deploy tagged version manual testing run newman tests run dapp example bootstrap and manual tests run acceptance tests run performance tests testnet testing deploy tagged version manual testing run newman tests run dapp example bootstrap and manual tests run acceptance tests run performance tests let bake tag as git tag confirm new docker image version is deployed write up release notes and changelist mainnet testing deploy tagged version manual testing any bugs or missed features found should see a new ticket opened addressed in main and cherry picked to release with a new rc version tagged and docker image deployed alternatives no response
1
110,510
13,911,169,793
IssuesEvent
2020-10-20 17:01:20
RestaSoft/example-iti91m
https://api.github.com/repos/RestaSoft/example-iti91m
opened
d - Maquetar secciones para visualizar información
Design FrontEnd
Estructuras la manera de cómo se visualizará la información del modal
1.0
d - Maquetar secciones para visualizar información - Estructuras la manera de cómo se visualizará la información del modal
non_process
d maquetar secciones para visualizar información estructuras la manera de cómo se visualizará la información del modal
0
309,770
26,678,250,792
IssuesEvent
2023-01-26 15:46:03
ntop/ntopng
https://api.github.com/repos/ntop/ntopng
closed
Flow Analysis Glitches
Bug Ready to Test
There are a few issues to fix - The two columns on the left have the same value - The two right-most columns should be right-aligned - Protocol filtering) clicking on a hostname is not working. Example http://localhost:3000/lua/flows_stats.lua?application=DNS.Cybersec - Breakdown sort is useless and not working ![image](https://user-images.githubusercontent.com/4493366/214315757-0fb02286-8f48-4380-83af-ceee9517b878.png)
1.0
Flow Analysis Glitches - There are a few issues to fix - The two columns on the left have the same value - The two right-most columns should be right-aligned - Protocol filtering) clicking on a hostname is not working. Example http://localhost:3000/lua/flows_stats.lua?application=DNS.Cybersec - Breakdown sort is useless and not working ![image](https://user-images.githubusercontent.com/4493366/214315757-0fb02286-8f48-4380-83af-ceee9517b878.png)
non_process
flow analysis glitches there are a few issues to fix the two columns on the left have the same value the two right most columns should be right aligned protocol filtering clicking on a hostname is not working example breakdown sort is useless and not working
0
14,457
17,533,241,918
IssuesEvent
2021-08-12 01:47:53
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Automatic reporting of errors
Feedback stale Processing Feature Request
Author Name: **Paolo Cavallini** (@pcav) Original Redmine Issue: [5370](https://issues.qgis.org/issues/5370) Redmine category:processing/gui --- It would be nice if in case of an error, the user could automatically send the log (by clicking on an OK button) to a mailing list, so that errors could be compared, and automatically archived.
1.0
Automatic reporting of errors - Author Name: **Paolo Cavallini** (@pcav) Original Redmine Issue: [5370](https://issues.qgis.org/issues/5370) Redmine category:processing/gui --- It would be nice if in case of an error, the user could automatically send the log (by clicking on an OK button) to a mailing list, so that errors could be compared, and automatically archived.
process
automatic reporting of errors author name paolo cavallini pcav original redmine issue redmine category processing gui it would be nice if in case of an error the user could automatically send the log by clicking on an ok button to a mailing list so that errors could be compared and automatically archived
1
7,825
11,007,369,319
IssuesEvent
2019-12-04 08:20:05
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
Evasion or tolerance responses - merges
multi-species process term merge
Merge: GO:0052060 evasion or tolerance by symbiont of host-produced nitric oxide 1 EXP GO:0052163 modulation by symbiont of defense-related host nitric oxide production 2 EXP GO:0052567 response to defense-related host reactive oxygen species production 1 EXP GO:0052059 evasion or tolerance by symbiont of host-produced reactive oxygen species 0 into GO:0052164 modulation by symbiont of defense-related host reactive oxygen species production 1 EXP GO:0052566 response to host phytoalexin production GO:0052061 evasion or tolerance by symbiont of host-produced phytoalexins merge into GO:0052165 modulation by symbiont of host phytoalexin production 0 annotations Move GO:0052565 response to defense-related host nitric oxide production 3 EXP under 'GO:0071500 cellular response to nitrosative stress & change label to 'defense response to host innate immune response nitric oxide production'
1.0
Evasion or tolerance responses - merges - Merge: GO:0052060 evasion or tolerance by symbiont of host-produced nitric oxide 1 EXP GO:0052163 modulation by symbiont of defense-related host nitric oxide production 2 EXP GO:0052567 response to defense-related host reactive oxygen species production 1 EXP GO:0052059 evasion or tolerance by symbiont of host-produced reactive oxygen species 0 into GO:0052164 modulation by symbiont of defense-related host reactive oxygen species production 1 EXP GO:0052566 response to host phytoalexin production GO:0052061 evasion or tolerance by symbiont of host-produced phytoalexins merge into GO:0052165 modulation by symbiont of host phytoalexin production 0 annotations Move GO:0052565 response to defense-related host nitric oxide production 3 EXP under 'GO:0071500 cellular response to nitrosative stress & change label to 'defense response to host innate immune response nitric oxide production'
process
evasion or tolerance responses merges merge go evasion or tolerance by symbiont of host produced nitric oxide exp go modulation by symbiont of defense related host nitric oxide production exp go response to defense related host reactive oxygen species production exp go evasion or tolerance by symbiont of host produced reactive oxygen species into go modulation by symbiont of defense related host reactive oxygen species production exp go response to host phytoalexin production go evasion or tolerance by symbiont of host produced phytoalexins merge into go modulation by symbiont of host phytoalexin production annotations move go response to defense related host nitric oxide production exp under go cellular response to nitrosative stress change label to defense response to host innate immune response nitric oxide production
1
7,993
11,185,781,924
IssuesEvent
2020-01-01 06:03:39
arunkumar9t2/scabbard
https://api.github.com/repos/arunkumar9t2/scabbard
opened
Render Component Tree
module:gradle-plugin module:processor
In addition to current graphs, we could provide option to render component tree like Uber does with RIBS. Example: ![](http://1fykyq3mdn5r21tpna3wkdyi-wpengine.netdna-ssl.com/wp-content/uploads/2017/08/image5.png) Steps: - [ ] Add a separate processor - [ ] `ComponentPath` has parent and child information, use that create a custom tree data structure - [ ] Create a graph with `rankdir = TB`
1.0
Render Component Tree - In addition to current graphs, we could provide option to render component tree like Uber does with RIBS. Example: ![](http://1fykyq3mdn5r21tpna3wkdyi-wpengine.netdna-ssl.com/wp-content/uploads/2017/08/image5.png) Steps: - [ ] Add a separate processor - [ ] `ComponentPath` has parent and child information, use that create a custom tree data structure - [ ] Create a graph with `rankdir = TB`
process
render component tree in addition to current graphs we could provide option to render component tree like uber does with ribs example steps add a separate processor componentpath has parent and child information use that create a custom tree data structure create a graph with rankdir tb
1
18,613
24,579,293,104
IssuesEvent
2022-10-13 14:31:31
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[Consent API] [Android] Data sharing image > UI Issue in the data sharing image screenshot in the mobile app
Bug P2 Android Process: Fixed Process: Tested QA Process: Tested dev
**Pre-condition -** The study should be created by enabling data sharing permission **Steps:** 1. Sign up or sign in to the mobile app 2. Enroll to the study [eg: on the data sharing permission screen select 'provided'] 3. Withdrawn from the study 4. Rejoin the same study [ on the data sharing permission screen select 'Not provided'] 4. Click on the Study resources screen 5. Click on Data sharing image and observe **AR:** Data sharing image > UI Issue in the data sharing image screenshot in the mobile app ![ma](https://user-images.githubusercontent.com/86007179/180783300-1a98c7ee-41a6-414f-afa3-19622016b9ad.png)
3.0
[Consent API] [Android] Data sharing image > UI Issue in the data sharing image screenshot in the mobile app - **Pre-condition -** The study should be created by enabling data sharing permission **Steps:** 1. Sign up or sign in to the mobile app 2. Enroll to the study [eg: on the data sharing permission screen select 'provided'] 3. Withdrawn from the study 4. Rejoin the same study [ on the data sharing permission screen select 'Not provided'] 4. Click on the Study resources screen 5. Click on Data sharing image and observe **AR:** Data sharing image > UI Issue in the data sharing image screenshot in the mobile app ![ma](https://user-images.githubusercontent.com/86007179/180783300-1a98c7ee-41a6-414f-afa3-19622016b9ad.png)
process
data sharing image ui issue in the data sharing image screenshot in the mobile app pre condition the study should be created by enabling data sharing permission steps sign up or sign in to the mobile app enroll to the study withdrawn from the study rejoin the same study click on the study resources screen click on data sharing image and observe ar data sharing image ui issue in the data sharing image screenshot in the mobile app
1
20,749
11,500,634,809
IssuesEvent
2020-02-12 15:52:34
cityofaustin/atd-mobility-project-database
https://api.github.com/repos/cityofaustin/atd-mobility-project-database
closed
Research current ATD project tracking workflows
Epic Project: Mobility Project Database Service: PM Type: Research Workgroup: AMD
- Perform user interviews - Review documents/tools - Add into [ATD MPD Learnings Synthesis document](https://docs.google.com/document/d/1swkNIccJtT5IzdQz_zi4H2LgzhQd3c4iC22Y8SiGOQk/edit?usp=sharing) Notes and documents in [Google Drive](https://drive.google.com/drive/folders/1NYqaRT7IczU3d-vx9PFZKjtvpvxP58f5).
1.0
Research current ATD project tracking workflows - - Perform user interviews - Review documents/tools - Add into [ATD MPD Learnings Synthesis document](https://docs.google.com/document/d/1swkNIccJtT5IzdQz_zi4H2LgzhQd3c4iC22Y8SiGOQk/edit?usp=sharing) Notes and documents in [Google Drive](https://drive.google.com/drive/folders/1NYqaRT7IczU3d-vx9PFZKjtvpvxP58f5).
non_process
research current atd project tracking workflows perform user interviews review documents tools add into notes and documents in
0
20,890
27,714,667,516
IssuesEvent
2023-03-14 16:12:06
OliverKillane/Imperial-Computing-Notes
https://api.github.com/repos/OliverKillane/Imperial-Computing-Notes
opened
Data Processing Lock Manager
60029 - Data Processing Systems Content Missing
RW Lock differs from lectures. - Waiting confirmation from lecturer on correctness of lecture content [here](https://edstem.org/us/courses/29415/discussion/2756894) - Need to fix and improve current Lock Manager implementation in `transactions/code`
1.0
Data Processing Lock Manager - RW Lock differs from lectures. - Waiting confirmation from lecturer on correctness of lecture content [here](https://edstem.org/us/courses/29415/discussion/2756894) - Need to fix and improve current Lock Manager implementation in `transactions/code`
process
data processing lock manager rw lock differs from lectures waiting confirmation from lecturer on correctness of lecture content need to fix and improve current lock manager implementation in transactions code
1
8,014
11,205,216,290
IssuesEvent
2020-01-05 12:44:42
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
gdal2xyz (QGIS 3.4.13; Yesterday downloaded and installed; Windows)- The csv. file is not saved to the PC.
Bug Feedback Processing
Gdal2xyz does not save and create csv. file to selected location on PC. On the console, however, the whole process is going well.
1.0
gdal2xyz (QGIS 3.4.13; Yesterday downloaded and installed; Windows)- The csv. file is not saved to the PC. - Gdal2xyz does not save and create csv. file to selected location on PC. On the console, however, the whole process is going well.
process
qgis yesterday downloaded and installed windows the csv file is not saved to the pc does not save and create csv file to selected location on pc on the console however the whole process is going well
1
14,909
18,295,629,603
IssuesEvent
2021-10-05 20:10:31
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
encoding issue when using GRASS processing tool on macOS
Processing Bug MacOS
I tried this on two different Macs with two different OS versions: Mojave and Catalina. Als tried this with two different versions of QGIS: 3.18.0-Zürich and 3.16.4-Hannover When I run any shapefile or geopackage or any GIS bases file with the projection = epsg:3857 any GRASS processing tool fails to run due to Unicode error ``` Traceback (most recent call last): File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 423, in processAlgorithm getattr(self, fullName)(parameters, context, feedback) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 490, in processInputs self.loadVectorLayerFromParameter( File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 877, in loadVectorLayerFromParameter self.loadVectorLayer(name, layer, external=external, feedback=feedback) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 914, in loadVectorLayer self.setSessionProjectionFromLayer(layer) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 1044, in setSessionProjectionFromLayer self.setSessionProjection(layer.crs()) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 1051, in setSessionProjection file_name = Grass7Utils.exportCrsWktToFile(crs) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Utils.py", line 95, in exportCrsWktToFile f.write(wkt) UnicodeEncodeError: 'ascii' codec can't encode character '\xb0' in position 849: ordinal not in range(128) ``` which is the degree symbol in the usage part of the WKT As a note when I use the same shapefile or geopackage on the same machine using QGIS 3.8.0-Zanziba the grass processing tools runs without any issue, on both Mac's. all installers were from dmg's downloaded from QGIS website.
1.0
encoding issue when using GRASS processing tool on macOS - I tried this on two different Macs with two different OS versions: Mojave and Catalina. Als tried this with two different versions of QGIS: 3.18.0-Zürich and 3.16.4-Hannover When I run any shapefile or geopackage or any GIS bases file with the projection = epsg:3857 any GRASS processing tool fails to run due to Unicode error ``` Traceback (most recent call last): File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 423, in processAlgorithm getattr(self, fullName)(parameters, context, feedback) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 490, in processInputs self.loadVectorLayerFromParameter( File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 877, in loadVectorLayerFromParameter self.loadVectorLayer(name, layer, external=external, feedback=feedback) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 914, in loadVectorLayer self.setSessionProjectionFromLayer(layer) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 1044, in setSessionProjectionFromLayer self.setSessionProjection(layer.crs()) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Algorithm.py", line 1051, in setSessionProjection file_name = Grass7Utils.exportCrsWktToFile(crs) File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/grass7/Grass7Utils.py", line 95, in exportCrsWktToFile f.write(wkt) UnicodeEncodeError: 'ascii' codec can't encode character '\xb0' in position 849: ordinal not in range(128) ``` which is the degree symbol in the usage part of the WKT As a note when I use the same shapefile or geopackage on the same machine using QGIS 3.8.0-Zanziba the grass processing tools runs without any issue, on both Mac's. all installers were from dmg's downloaded from QGIS website.
process
encoding issue when using grass processing tool on macos i tried this on two different macs with two different os versions mojave and catalina als tried this with two different versions of qgis zürich and hannover when i run any shapefile or geopackage or any gis bases file with the projection epsg any grass processing tool fails to run due to unicode error traceback most recent call last file applications qgis app contents macos resources python plugins processing algs py line in processalgorithm getattr self fullname parameters context feedback file applications qgis app contents macos resources python plugins processing algs py line in processinputs self loadvectorlayerfromparameter file applications qgis app contents macos resources python plugins processing algs py line in loadvectorlayerfromparameter self loadvectorlayer name layer external external feedback feedback file applications qgis app contents macos resources python plugins processing algs py line in loadvectorlayer self setsessionprojectionfromlayer layer file applications qgis app contents macos resources python plugins processing algs py line in setsessionprojectionfromlayer self setsessionprojection layer crs file applications qgis app contents macos resources python plugins processing algs py line in setsessionprojection file name exportcrswkttofile crs file applications qgis app contents macos resources python plugins processing algs py line in exportcrswkttofile f write wkt unicodeencodeerror ascii codec can t encode character in position ordinal not in range which is the degree symbol in the usage part of the wkt as a note when i use the same shapefile or geopackage on the same machine using qgis zanziba the grass processing tools runs without any issue on both mac s all installers were from dmg s downloaded from qgis website
1
70,491
18,158,231,089
IssuesEvent
2021-09-27 06:19:41
RSA-Bots/PandaWrapper
https://api.github.com/repos/RSA-Bots/PandaWrapper
closed
Not all callbacks are asynchronous
bug sev: med TODO ver: build
Issue: Not all callbacks are asynchronous Version: v0.1.0-beta.2 Description: When implementing a `SlashCommand` the callback is `async`-able. However, when implementing `SelectMenuCommand`, the callback is not `async`-able.
1.0
Not all callbacks are asynchronous - Issue: Not all callbacks are asynchronous Version: v0.1.0-beta.2 Description: When implementing a `SlashCommand` the callback is `async`-able. However, when implementing `SelectMenuCommand`, the callback is not `async`-able.
non_process
not all callbacks are asynchronous issue not all callbacks are asynchronous version beta description when implementing a slashcommand the callback is async able however when implementing selectmenucommand the callback is not async able
0
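The record above describes a callback API where one entry point (`SlashCommand`) accepts `async` callbacks while another (`SelectMenuCommand`) does not. A common way to make both styles interchangeable is a dispatcher that detects coroutine callbacks and awaits them. This is an illustrative Python sketch, not PandaWrapper's actual TypeScript API; the `dispatch` name and the callback signatures are assumptions.

```python
import asyncio
import inspect

async def dispatch(callback, payload):
    """Invoke a handler that may be either a plain function or a coroutine
    function, so both registration styles behave identically."""
    if inspect.iscoroutinefunction(callback):
        return await callback(payload)  # async-able handler: await it
    return callback(payload)            # plain handler: call directly

def sync_handler(payload):
    return f"sync:{payload}"

async def async_handler(payload):
    await asyncio.sleep(0)  # stand-in for awaited work
    return f"async:{payload}"

async def main():
    return [
        await dispatch(sync_handler, "menu"),
        await dispatch(async_handler, "menu"),
    ]

results = asyncio.run(main())
```

With a shape like this, the `SelectMenuCommand`-style handlers would not need a separate non-async code path.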
5,627
8,481,884,456
IssuesEvent
2018-10-25 16:55:34
easy-software-ufal/annotations_repos
https://api.github.com/repos/easy-software-ufal/annotations_repos
opened
dotnet/BenchmarkDotNet [Params] with arrays as params throws System.Reflection.TargetInvocationException
ADP C# test wrong processing
Issue: `https://github.com/dotnet/BenchmarkDotNet/issues/712` PR: `https://github.com/dotnet/BenchmarkDotNet/commit/e66bb0fcab515d545239159a9766b2c43d3d36a3`
1.0
dotnet/BenchmarkDotNet [Params] with arrays as params throws System.Reflection.TargetInvocationException - Issue: `https://github.com/dotnet/BenchmarkDotNet/issues/712` PR: `https://github.com/dotnet/BenchmarkDotNet/commit/e66bb0fcab515d545239159a9766b2c43d3d36a3`
process
dotnet benchmarkdotnet with arrays as params throws system reflection targetinvocationexception issue pr
1
7,220
10,349,188,065
IssuesEvent
2019-09-04 21:42:19
GetTerminus/terminus-ui
https://api.github.com/repos/GetTerminus/terminus-ui
opened
Docs: Add more conventions to the development readme
Focus: community Goal: Process Improvement Type: chore
- Add new documentation to the development readme: - [ ] member ordering - [ ] underscore convention - [ ] setters & getters - [ ] decorators ### Member ordering Currently most files have members in (an) order. We should formalize that pattern for consistency. 1. private properties 1. public properties 1. getters - getter before setter before private property 1. view references 1. inputs 1. outputs 1. constructor 1. public methods 1. protected/static methods 1. private methods ### Underscores Prefixing private methods and properties is a common pattern. We've opted to rely on TypeScript's member access rather than the underscore. We should make it clear for contributors that we don't follow that pattern. Caveat: The one place we do use the underscore is to prefix the private property that is used with a getter and setter. eg ```typescript public set active(v: boolean) { this._foo = v; } public get active(): boolean { return this._foo; } private _foo = false; ``` ### Setters & Getters Our setters always come before the getter; which is in turn before the private property. (See above code example) ### Decorators Decorators go on their own line above the item they are decorating. ```typescript @Decorator() public foo; ```
1.0
Docs: Add more conventions to the development readme - - Add new documentation to the development readme: - [ ] member ordering - [ ] underscore convention - [ ] setters & getters - [ ] decorators ### Member ordering Currently most files have members in (an) order. We should formalize that pattern for consistency. 1. private properties 1. public properties 1. getters - getter before setter before private property 1. view references 1. inputs 1. outputs 1. constructor 1. public methods 1. protected/static methods 1. private methods ### Underscores Prefixing private methods and properties is a common pattern. We've opted to rely on TypeScript's member access rather than the underscore. We should make it clear for contributors that we don't follow that pattern. Caveat: The one place we do use the underscore is to prefix the private property that is used with a getter and setter. eg ```typescript public set active(v: boolean) { this._foo = v; } public get active(): boolean { return this._foo; } private _foo = false; ``` ### Setters & Getters Our setters always come before the getter; which is in turn before the private property. (See above code example) ### Decorators Decorators go on their own line above the item they are decorating. ```typescript @Decorator() public foo; ```
process
docs add more conventions to the development readme add new documentation to the development readme member ordering underscore convention setters getters decorators member ordering currently most files have members in an order we should formalize that pattern for consistency private properties public properties getters getter before setter before private property view references inputs outputs constructor public methods protected static methods private methods underscores prefixing private methods and properties is a common pattern we ve opted to rely on typescript s member access rather than the underscore we should make it clear for contributors that we don t follow that pattern caveat the one place we do use the underscore is to prefix the private property that is used with a getter and setter eg typescript public set active v boolean this foo v public get active boolean return this foo private foo false setters getters our setters always come before the getter which is in turn before the private property see above code example decorators decorators go on their own line above the item they are decorating typescript decorator public foo
1
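The getter/setter-plus-underscore convention documented in the record above maps onto other languages as well. A rough Python analogue is sketched below; note that Python's `@property` forces the getter to come first, the reverse of the setter-first ordering the record describes, and the class and field names here are invented for illustration.

```python
class Toggle:
    """Public property backed by an underscore-prefixed private field,
    mirroring the _foo convention from the record above."""

    def __init__(self):
        self._active = False  # private backing field, underscore-prefixed

    @property
    def active(self):
        return self._active

    @active.setter
    def active(self, value):
        self._active = bool(value)

toggle = Toggle()
toggle.active = 1  # setter coerces the assigned value to bool
```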
17,676
23,511,013,375
IssuesEvent
2022-08-18 16:32:01
brucemiller/LaTeXML
https://api.github.com/repos/brucemiller/LaTeXML
closed
images with caption may have truncated alt text
enhancement postprocessing accessibility
When referring to images, the HTML stylesheet tries to lift the alternative text from the caption via: https://github.com/brucemiller/LaTeXML/blob/58cbf1814f555e9741e9c2bea2cb5b35e87a4685/lib/LaTeXML/resources/XSLT/LaTeXML-misc-xhtml.xsl#L175 However selecting `text()` only picks the first text node, which is most often only a small piece of the caption. The stylesheet should rather extract the text content recursively (ideally following something like https://www.w3.org/TR/accname-1.1/#mapping_additional_nd_te).
1.0
images with caption may have truncated alt text - When referring to images, the HTML stylesheet tries to lift the alternative text from the caption via: https://github.com/brucemiller/LaTeXML/blob/58cbf1814f555e9741e9c2bea2cb5b35e87a4685/lib/LaTeXML/resources/XSLT/LaTeXML-misc-xhtml.xsl#L175 However selecting `text()` only picks the first text node, which is most often only a small piece of the caption. The stylesheet should rather extract the text content recursively (ideally following something like https://www.w3.org/TR/accname-1.1/#mapping_additional_nd_te).
process
images with caption may have truncated alt text when referring to images the html stylesheet tries to lift the alternative text from the caption via however selecting text only picks the first text node which is most often only a small piece of the caption the stylesheet should rather extract the text content recursively ideally following something like
1
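The `text()`-only selection problem in the record above is easy to reproduce outside XSLT. In Python's ElementTree, reading `caption.text` is the analogue of selecting only the first text node, while `itertext()` walks the element recursively and recovers the full caption. The caption markup here is made up for illustration, not LaTeXML's actual output.

```python
import xml.etree.ElementTree as ET

caption = ET.fromstring(
    "<caption>Figure 1: <em>mean</em> error over <b>ten</b> runs</caption>"
)

# Analogue of selecting text(): only the first text node survives.
first_text_only = caption.text or ""

# Recursive extraction keeps the whole caption for the alt text.
full_text = "".join(caption.itertext())
```

The first variant yields just `"Figure 1: "`, which is exactly the truncated alt text the record reports.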
9,998
13,042,264,383
IssuesEvent
2020-07-28 22:05:03
allinurl/goaccess
https://api.github.com/repos/allinurl/goaccess
closed
Sed Errors with Relative Date
log-processing other
I am trying to create an html report with a relative date on Mac OS. I am able to generate the report fine with a selected start date, but not with the relative date command. Here's the command I am using and the errors it gives me: ``` sed -n '/'$(date '+%d/%b/%Y' -d 'yesterday')'/,$ p' /private/var/log/apache2/access_log | goaccess -o /Users/username/Documents/report.html date: illegal time format usage: date [-jnRu] [-d dst] [-r seconds] [-t west] [-v[+|-]val[ymwdHMS]] ... [-f fmt date | [[[mm]dd]HH]MM[[cc]yy][.ss]] [+format] sed: first RE may not be empty ``` Appreciate any help you can offer.
1.0
Sed Errors with Relative Date - I am trying to create an html report with a relative date on Mac OS. I am able to generate the report fine with a selected start date, but not with the relative date command. Here's the command I am using and the errors it gives me: ``` sed -n '/'$(date '+%d/%b/%Y' -d 'yesterday')'/,$ p' /private/var/log/apache2/access_log | goaccess -o /Users/username/Documents/report.html date: illegal time format usage: date [-jnRu] [-d dst] [-r seconds] [-t west] [-v[+|-]val[ymwdHMS]] ... [-f fmt date | [[[mm]dd]HH]MM[[cc]yy][.ss]] [+format] sed: first RE may not be empty ``` Appreciate any help you can offer.
process
sed errors with relative date i am trying to create an html report with a relative date on mac os i am able to generate the report fine with a selected start date but not with the relative date command here s the command i am using and the errors it gives me sed n date d b y d yesterday p private var log access log goaccess o users username documents report html date illegal time format usage date val dd hh mm yy sed first re may not be empty appreciate any help you can offer
1
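The error in the record above comes from mixing GNU `date -d 'yesterday'` syntax with the macOS/BSD `date`, which spells relative dates as `date -v-1d`. One portable alternative is to compute the timestamp prefix before handing it to `sed`, as in this hedged Python sketch; it assumes the log uses the common `%d/%b/%Y` day prefix of Apache access-log timestamps.

```python
from datetime import date, timedelta

# Explicit month names avoid any locale dependence in the abbreviation.
MONTHS = ("Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec")

def apache_day(day):
    """Format a date the way Apache access-log timestamps begin,
    e.g. 28/Jul/2020."""
    return f"{day.day:02d}/{MONTHS[day.month - 1]}/{day.year}"

yesterday = apache_day(date.today() - timedelta(days=1))
# The result feeds the same pipeline as in the record, e.g.:
#   sed -n "/<yesterday, slashes escaped>/,$ p" access_log | goaccess -o report.html
```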
33,149
9,037,478,076
IssuesEvent
2019-02-09 11:03:47
athrane/pineapple
https://api.github.com/repos/athrane/pineapple
closed
Remove fixed version from example modules
default configuration maven build pineapple-core pineapple-example-modules
For some historic reason, the pineapple-examples-modules project have been built with a fixed version number = 1.0.0. This gives problems when deploying to Bintray, see issue #308. The solution seems to be to remove the fixed versioning and version the example modules along with the version used for the other components in the project.
1.0
Remove fixed version from example modules - For some historic reason, the pineapple-examples-modules project have been built with a fixed version number = 1.0.0. This gives problems when deploying to Bintray, see issue #308. The solution seems to be to remove the fixed versioning and version the example modules along with the version used for the other components in the project.
non_process
remove fixed version from example modules for some historic reason the pineapple examples modules project have been built with a fixed version number this gives problems when deploying to bintray see issue the solution seems to be to remove the fixed versioning and version the example modules along with the version used for the other components in the project
0
75,250
7,466,852,426
IssuesEvent
2018-04-02 12:56:54
edenlabllc/ehealth.api
https://api.github.com/repos/edenlabllc/ehealth.api
closed
add declaration_limit to admin portal configuration
FE FE/nhs.admin priority/medium status/test
as NHS admin i want configure ehealth parameters so that they are always actual
1.0
add declaration_limit to admin portal configuration - as NHS admin i want configure ehealth parameters so that they are always actual
non_process
add declaration limit to admin portal configuration as nhs admin i want configure ehealth parameters so that they are always actual
0
12,263
3,264,412,166
IssuesEvent
2015-10-22 11:36:24
algolia/instantsearch.js
https://api.github.com/repos/algolia/instantsearch.js
closed
Unit test React.js components and the library
in progress test
# TODO: - [x] setup the testing stack - [x] write Template test - [x] write lib/InstantSearch.js test - [ ] write components/RefinementList.js (clic event use case) - [x] write widget test
1.0
Unit test React.js components and the library - # TODO: - [x] setup the testing stack - [x] write Template test - [x] write lib/InstantSearch.js test - [ ] write components/RefinementList.js (clic event use case) - [x] write widget test
non_process
unit test react js components and the library todo setup the testing stack write template test write lib instantsearch js test write components refinementlist js clic event use case write widget test
0
313,930
9,577,526,536
IssuesEvent
2019-05-07 11:54:12
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
www.google.com - see bug description
browser-focus-geckoview engine-gecko priority-critical
<!-- @browser: Firefox Mobile 68.0 --> <!-- @ua_header: Mozilla/5.0 (Android 8.0.0; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 --> <!-- @reported_with: --> <!-- @extra_labels: browser-focus-geckoview --> **URL**: https://www.google.com/search?q=pizza hut **Browser / Version**: Firefox Mobile 68.0 **Operating System**: Android 8.0.0 **Tested Another Browser**: No **Problem type**: Something else **Description**: my custom autocomplete is not working **Steps to Reproduce**: <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
www.google.com - see bug description - <!-- @browser: Firefox Mobile 68.0 --> <!-- @ua_header: Mozilla/5.0 (Android 8.0.0; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 --> <!-- @reported_with: --> <!-- @extra_labels: browser-focus-geckoview --> **URL**: https://www.google.com/search?q=pizza hut **Browser / Version**: Firefox Mobile 68.0 **Operating System**: Android 8.0.0 **Tested Another Browser**: No **Problem type**: Something else **Description**: my custom autocomplete is not working **Steps to Reproduce**: <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
see bug description url hut browser version firefox mobile operating system android tested another browser no problem type something else description my custom autocomplete is not working steps to reproduce browser configuration none from with ❤️
0
6,329
9,369,109,310
IssuesEvent
2019-04-03 10:14:49
decidim/decidim
https://api.github.com/repos/decidim/decidim
opened
Filter spaces by scope and area
space: assemblies space: processes type: feature
#### Feature Filter spaces by scope and area. Applied to: - Processes - Assemblies #### Related to https://meta.decidim.org/processes/roadmap/f/122/proposals/
1.0
Filter spaces by scope and area - #### Feature Filter spaces by scope and area. Applied to: - Processes - Assemblies #### Related to https://meta.decidim.org/processes/roadmap/f/122/proposals/
process
filter spaces by scope and area feature filter spaces by scope and area applied to processes assemblies related to
1
6,058
8,881,001,281
IssuesEvent
2019-01-14 08:46:18
zotero/zotero
https://api.github.com/repos/zotero/zotero
closed
Word integration error with document stored in OneDrive
Word Processor Integration
This is bizarre, but we've had various reports that having a document in OneDrive causes integration errors. E.g., https://forums.zotero.org/discussion/73678/error-id-944178723 We should either try to fix this or, if possible, display a warning when a document is saved in OneDrive (if we can tell from the plugin).
1.0
Word integration error with document stored in OneDrive - This is bizarre, but we've had various reports that having a document in OneDrive causes integration errors. E.g., https://forums.zotero.org/discussion/73678/error-id-944178723 We should either try to fix this or, if possible, display a warning when a document is saved in OneDrive (if we can tell from the plugin).
process
word integration error with document stored in onedrive this is bizarre but we ve had various reports that having a document in onedrive causes integration errors e g we should either try to fix this or if possible display a warning when a document is saved in onedrive if we can tell from the plugin
1
277,524
24,081,238,402
IssuesEvent
2022-09-19 06:49:18
cbnu-sequence/proreviewer-web-front
https://api.github.com/repos/cbnu-sequence/proreviewer-web-front
closed
[Test] Planning to unit test the useInput custom hook.
test
## 🎯 Description Planning to unit test the useInput custom hook. ## 💭 Other The test files are currently managed in a single test directory at the root path; rather than that, as mentioned earlier, creating a test directory inside each relevant directory and managing tests there seems like it would be a bit more efficient for unit testing.
1.0
[Test] Planning to unit test the useInput custom hook. - ## 🎯 Description Planning to unit test the useInput custom hook. ## 💭 Other The test files are currently managed in a single test directory at the root path; rather than that, as mentioned earlier, creating a test directory inside each relevant directory and managing tests there seems like it would be a bit more efficient for unit testing.
non_process
planning to unit test the useinput custom hook 🎯 description planning to unit test the useinput custom hook 💭 other the test files are currently managed in a single test directory at the root path rather than that as mentioned earlier creating a test directory inside each relevant directory and managing tests there seems like it would be a bit more efficient for unit testing
0
110,372
23,921,853,116
IssuesEvent
2022-09-09 17:44:13
unoplatform/uno
https://api.github.com/repos/unoplatform/uno
closed
Avoid IO in generators, when possible
kind/bug area/code-generation project/core-tools
### Current behavior Xaml generator currently does direct disk IO, e.g: https://github.com/unoplatform/uno/blob/4666c66d75ea47c9156ea79c5d1e83465fdf570d/src/SourceGenerators/Uno.UI.SourceGenerators/XamlGenerator/XamlFileParser.cs#L98 (there might be other places where IO happens) ### Expected behavior Generators shouldn't do direct IO, at least when Roslyn generators are used. For Uno generators, this might be possible with https://github.com/unoplatform/Uno.SourceGeneration/pull/151 ### How to reproduce it (as minimally and precisely as possible) _No response_ ### Workaround _No response_ ### Works on UWP/WinUI _No response_ ### Environment _No response_ ### NuGet package version(s) _No response_ ### Affected platforms _No response_ ### IDE _No response_ ### IDE version _No response_ ### Relevant plugins _No response_ ### Anything else we need to know? _No response_
1.0
Avoid IO in generators, when possible - ### Current behavior Xaml generator currently does direct disk IO, e.g: https://github.com/unoplatform/uno/blob/4666c66d75ea47c9156ea79c5d1e83465fdf570d/src/SourceGenerators/Uno.UI.SourceGenerators/XamlGenerator/XamlFileParser.cs#L98 (there might be other places where IO happens) ### Expected behavior Generators shouldn't do direct IO, at least when Roslyn generators are used. For Uno generators, this might be possible with https://github.com/unoplatform/Uno.SourceGeneration/pull/151 ### How to reproduce it (as minimally and precisely as possible) _No response_ ### Workaround _No response_ ### Works on UWP/WinUI _No response_ ### Environment _No response_ ### NuGet package version(s) _No response_ ### Affected platforms _No response_ ### IDE _No response_ ### IDE version _No response_ ### Relevant plugins _No response_ ### Anything else we need to know? _No response_
non_process
avoid io in generators when possible current behavior xaml generator currently does direct disk io e g there might be other places where io happens expected behavior generators shouldn t do direct io at least when roslyn generators are used for uno generators this might be possible with how to reproduce it as minimally and precisely as possible no response workaround no response works on uwp winui no response environment no response nuget package version s no response affected platforms no response ide no response ide version no response relevant plugins no response anything else we need to know no response
0