Dataset schema (column, dtype, value range):

column         dtype          range
Unnamed: 0     int64          0 .. 832k
id             float64        2.49B .. 32.1B
type           stringclasses  1 value
created_at     stringlengths  19 .. 19
repo           stringlengths  7 .. 112
repo_url       stringlengths  36 .. 141
action         stringclasses  3 values
title          stringlengths  1 .. 744
labels         stringlengths  4 .. 574
body           stringlengths  9 .. 211k
index          stringclasses  10 values
text_combine   stringlengths  96 .. 211k
label          stringclasses  2 values
text           stringlengths  96 .. 188k
binary_label   int64          0 .. 1

Unnamed: 0: 140,355
id: 21,083,592,797
type: IssuesEvent
created_at: 2022-04-03 09:16:11
repo: bounswe/bounswe2022group3
repo_url: https://api.github.com/repos/bounswe/bounswe2022group3
action: opened
title: Creating a Issue Template
labels: documentation design
body:
### Task: A template which is needed to be comprehensive about issue like what an issue is, and why we do care about it to handle, what things are used, and where the result of it is put, and if necessary what additional notes are. ### Format (make a table, list, etc.): A template ### Required tools (if applicable) or researches: None ### Place to put the documentation: The side bar on the Wiki page ### Additional notes: None **Deadline:** 03.04.2022
index: 1.0
text_combine:
Creating a Issue Template - ### Task: A template which is needed to be comprehensive about issue like what an issue is, and why we do care about it to handle, what things are used, and where the result of it is put, and if necessary what additional notes are. ### Format (make a table, list, etc.): A template ### Required tools (if applicable) or researches: None ### Place to put the documentation: The side bar on the Wiki page ### Additional notes: None **Deadline:** 03.04.2022
label: non_process
text:
creating a issue template task a template which is needed to be comprehensive about issue like what an issue is and why we do care about it to handle what things are used and where the result of it is put and if necessary what additional notes are format make a table list etc a template required tools if applicable or researches none place to put the documentation the side bar on the wiki page additional notes none deadline
binary_label: 0

Unnamed: 0: 6,644
id: 9,761,606,899
type: IssuesEvent
created_at: 2019-06-05 09:11:05
repo: qgis/QGIS
repo_url: https://api.github.com/repos/qgis/QGIS
action: closed
title: Problem with ParameterFile in batch mode
labels: Bug Processing
body:
Hello, I use QGis 3.6.3 in windows I've a script that use a ParameterFile to select a folder (QgsProcessingParameterFile.Folder option in the self.addParameter function) It works well in normal mode, but crashed in batch mode. Here is the error message displayed AttributeError: 'NoneType' object has no attribute 'setParameterValue' Traceback (most recent call last): File "C:/PROGRA~1/QGIS3~1.6/apps/qgis/./python/plugins\processing\gui\BatchPanel.py", line 183, in load wrapper.setParameterValue(value, context) AttributeError: 'NoneType' object has no attribute 'setParameterValue' It worked with the previous versions of Qgis (3.6.2 or 3.61) Thanks
index: 1.0
text_combine:
Problem with ParameterFile in batch mode - Hello, I use QGis 3.6.3 in windows I've a script that use a ParameterFile to select a folder (QgsProcessingParameterFile.Folder option in the self.addParameter function) It works well in normal mode, but crashed in batch mode. Here is the error message displayed AttributeError: 'NoneType' object has no attribute 'setParameterValue' Traceback (most recent call last): File "C:/PROGRA~1/QGIS3~1.6/apps/qgis/./python/plugins\processing\gui\BatchPanel.py", line 183, in load wrapper.setParameterValue(value, context) AttributeError: 'NoneType' object has no attribute 'setParameterValue' It worked with the previous versions of Qgis (3.6.2 or 3.61) Thanks
label: process
text:
problem with parameterfile in batch mode hello i use qgis in windows i ve a script that use a parameterfile to select a folder qgsprocessingparameterfile folder option in the self addparameter function it works well in normal mode but crashed in batch mode here is the error message displayed attributeerror nonetype object has no attribute setparametervalue traceback most recent call last file c progra apps qgis python plugins processing gui batchpanel py line in load wrapper setparametervalue value context attributeerror nonetype object has no attribute setparametervalue it worked with the previous versions of qgis or thanks
binary_label: 1

Unnamed: 0: 239,849
id: 26,232,286,799
type: IssuesEvent
created_at: 2023-01-05 02:04:00
repo: TheKingTermux/alice
repo_url: https://api.github.com/repos/TheKingTermux/alice
action: opened
title: Prototype Pollution in JSON5 via Parse Method
labels: Security Auto Create Issues
body:
### Description Dependabot cannot update to the required version as there is already an existing pull request for the latest version There is already an existing pull request for the latest version: 2.2.3 Prototype Pollution in JSON5 via Parse Method #95 Open Opened 2 minutes ago on json5 (npm) · package-lock.json Dependabot cannot update to the required version as there is already an existing pull request for the latest version There is already an existing pull request for the latest version: 2.2.3 The parse method of the JSON5 library before and including version 2.2.1 does not restrict parsing of keys named __proto__, allowing specially crafted strings to pollute the prototype of the resulting object. This vulnerability pollutes the prototype of the object returned by JSON5.parse and not the global Object prototype, which is the commonly understood definition of Prototype Pollution. However, polluting the prototype of a single object can have significant security impact for an application if the object is later used in trusted operations. Impact This vulnerability could allow an attacker to set arbitrary and unexpected keys on the object returned from JSON5.parse. The actual impact will depend on how applications utilize the returned object and how they filter unwanted keys, but could include denial of service, cross-site scripting, elevation of privilege, and in extreme cases, remote code execution. Mitigation This vulnerability is patched in json5 v2.2.2 and later. A patch has also been backported for json5 v1 in versions v1.0.2 and later. Details Suppose a developer wants to allow users and admins to perform some risky operation, but they want to restrict what non-admins can do. 
To accomplish this, they accept a JSON blob from the user, parse it using JSON5.parse, confirm that the provided data does not set some sensitive keys, and then performs the risky operation using the validated data: ```js const JSON5 = require('json5'); const doSomethingDangerous = (props) => { if (props.isAdmin) { console.log('Doing dangerous thing as admin.'); } else { console.log('Doing dangerous thing as user.'); } }; const secCheckKeysSet = (obj, searchKeys) => { let searchKeyFound = false; Object.keys(obj).forEach((key) => { if (searchKeys.indexOf(key) > -1) { searchKeyFound = true; } }); return searchKeyFound; }; const props = JSON5.parse('{\"foo\": \"bar\"}'); if (!secCheckKeysSet(props, ['isAdmin', 'isMod'])) { doSomethingDangerous(props); // \"Doing dangerous thing as user.\" } else { throw new Error('Forbidden...'); } ``` If the user attempts to set the isAdmin key, their request will be rejected: ```js const props = JSON5.parse('{\"foo\": \"bar\", \"isAdmin\": true}'); if (!secCheckKeysSet(props, ['isAdmin', 'isMod'])) { doSomethingDangerous(props); } else { throw new Error('Forbidden...'); // Error: Forbidden… } ``` However, users can instead set the __proto__ key to {\"isAdmin\": true}. 
JSON5 will parse this key and will set the isAdmin key on the prototype of the returned object, allowing the user to bypass the security check and run their request as an admin: ```js const props = JSON5.parse('{\"foo\": \"bar\", \"__proto__\": {\"isAdmin\": true}}'); if (!secCheckKeysSet(props, ['isAdmin', 'isMod'])) { doSomethingDangerous(props); // \"Doing dangerous thing as admin.\" } else { throw new Error('Forbidden...'); } ``` ### Severity Check - [ ] Low - [ ] Moderate - [X] High - [ ] Critical ### Severity Number 7.1 / 10 ### CVSS base metrics - Attack vector Network - Attack complexity High - Privileges required Low - User interaction None - Scope Unchanged - Confidentiality High - Integrity Low - Availability High - CVSS:3.1/AV:N/AC:H/PR:L/UI:N/S:U/C:H/I:L/A:H - Weaknesses CWE-1321 - CVE ID CVE-2022-46175 - GHSA ID GHSA-9c47-m6qq-7p4h ### Information Package json5 (npm) >= 2.0.0, < 2.2.2 ### References - GHSA-9c47-m6qq-7p4h - https://nvd.nist.gov/vuln/detail/CVE-2022-46175 - json5/json5#199 - json5/json5#295 - json5/json5#298
index: True
text_combine:
Prototype Pollution in JSON5 via Parse Method - ### Description Dependabot cannot update to the required version as there is already an existing pull request for the latest version There is already an existing pull request for the latest version: 2.2.3 Prototype Pollution in JSON5 via Parse Method #95 Open Opened 2 minutes ago on json5 (npm) · package-lock.json Dependabot cannot update to the required version as there is already an existing pull request for the latest version There is already an existing pull request for the latest version: 2.2.3 The parse method of the JSON5 library before and including version 2.2.1 does not restrict parsing of keys named __proto__, allowing specially crafted strings to pollute the prototype of the resulting object. This vulnerability pollutes the prototype of the object returned by JSON5.parse and not the global Object prototype, which is the commonly understood definition of Prototype Pollution. However, polluting the prototype of a single object can have significant security impact for an application if the object is later used in trusted operations. Impact This vulnerability could allow an attacker to set arbitrary and unexpected keys on the object returned from JSON5.parse. The actual impact will depend on how applications utilize the returned object and how they filter unwanted keys, but could include denial of service, cross-site scripting, elevation of privilege, and in extreme cases, remote code execution. Mitigation This vulnerability is patched in json5 v2.2.2 and later. A patch has also been backported for json5 v1 in versions v1.0.2 and later. Details Suppose a developer wants to allow users and admins to perform some risky operation, but they want to restrict what non-admins can do. 
To accomplish this, they accept a JSON blob from the user, parse it using JSON5.parse, confirm that the provided data does not set some sensitive keys, and then performs the risky operation using the validated data: ```js const JSON5 = require('json5'); const doSomethingDangerous = (props) => { if (props.isAdmin) { console.log('Doing dangerous thing as admin.'); } else { console.log('Doing dangerous thing as user.'); } }; const secCheckKeysSet = (obj, searchKeys) => { let searchKeyFound = false; Object.keys(obj).forEach((key) => { if (searchKeys.indexOf(key) > -1) { searchKeyFound = true; } }); return searchKeyFound; }; const props = JSON5.parse('{\"foo\": \"bar\"}'); if (!secCheckKeysSet(props, ['isAdmin', 'isMod'])) { doSomethingDangerous(props); // \"Doing dangerous thing as user.\" } else { throw new Error('Forbidden...'); } ``` If the user attempts to set the isAdmin key, their request will be rejected: ```js const props = JSON5.parse('{\"foo\": \"bar\", \"isAdmin\": true}'); if (!secCheckKeysSet(props, ['isAdmin', 'isMod'])) { doSomethingDangerous(props); } else { throw new Error('Forbidden...'); // Error: Forbidden… } ``` However, users can instead set the __proto__ key to {\"isAdmin\": true}. 
JSON5 will parse this key and will set the isAdmin key on the prototype of the returned object, allowing the user to bypass the security check and run their request as an admin: ```js const props = JSON5.parse('{\"foo\": \"bar\", \"__proto__\": {\"isAdmin\": true}}'); if (!secCheckKeysSet(props, ['isAdmin', 'isMod'])) { doSomethingDangerous(props); // \"Doing dangerous thing as admin.\" } else { throw new Error('Forbidden...'); } ``` ### Severity Check - [ ] Low - [ ] Moderate - [X] High - [ ] Critical ### Severity Number 7.1 / 10 ### CVSS base metrics - Attack vector Network - Attack complexity High - Privileges required Low - User interaction None - Scope Unchanged - Confidentiality High - Integrity Low - Availability High - CVSS:3.1/AV:N/AC:H/PR:L/UI:N/S:U/C:H/I:L/A:H - Weaknesses CWE-1321 - CVE ID CVE-2022-46175 - GHSA ID GHSA-9c47-m6qq-7p4h ### Information Package json5 (npm) >= 2.0.0, < 2.2.2 ### References - GHSA-9c47-m6qq-7p4h - https://nvd.nist.gov/vuln/detail/CVE-2022-46175 - json5/json5#199 - json5/json5#295 - json5/json5#298
label: non_process
text:
prototype pollution in via parse method description dependabot cannot update to the required version as there is already an existing pull request for the latest version there is already an existing pull request for the latest version prototype pollution in via parse method open opened minutes ago on npm · package lock json dependabot cannot update to the required version as there is already an existing pull request for the latest version there is already an existing pull request for the latest version the parse method of the library before and including version does not restrict parsing of keys named proto allowing specially crafted strings to pollute the prototype of the resulting object this vulnerability pollutes the prototype of the object returned by parse and not the global object prototype which is the commonly understood definition of prototype pollution however polluting the prototype of a single object can have significant security impact for an application if the object is later used in trusted operations impact this vulnerability could allow an attacker to set arbitrary and unexpected keys on the object returned from parse the actual impact will depend on how applications utilize the returned object and how they filter unwanted keys but could include denial of service cross site scripting elevation of privilege and in extreme cases remote code execution mitigation this vulnerability is patched in and later a patch has also been backported for in versions and later details suppose a developer wants to allow users and admins to perform some risky operation but they want to restrict what non admins can do to accomplish this they accept a json blob from the user parse it using parse confirm that the provided data does not set some sensitive keys and then performs the risky operation using the validated data js const require const dosomethingdangerous props if props isadmin console log doing dangerous thing as admin else console log doing dangerous thing as 
user const seccheckkeysset obj searchkeys let searchkeyfound false object keys obj foreach key if searchkeys indexof key searchkeyfound true return searchkeyfound const props parse foo bar if seccheckkeysset props dosomethingdangerous props doing dangerous thing as user else throw new error forbidden if the user attempts to set the isadmin key their request will be rejected js const props parse foo bar isadmin true if seccheckkeysset props dosomethingdangerous props else throw new error forbidden error forbidden… however users can instead set the proto key to isadmin true will parse this key and will set the isadmin key on the prototype of the returned object allowing the user to bypass the security check and run their request as an admin js const props parse foo bar proto isadmin true if seccheckkeysset props dosomethingdangerous props doing dangerous thing as admin else throw new error forbidden severity check low moderate high critical severity number cvss base metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged confidentiality high integrity low availability high cvss av n ac h pr l ui n s u c h i l a h weaknesses cwe cve id cve ghsa id ghsa information package npm references ghsa
binary_label: 0
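The guard bypass described in the JSON5 advisory above can be reproduced in plain Node.js without json5 itself: assigning to a computed `"__proto__"` key replaces the object's prototype rather than creating an own property, so an `Object.keys`-based check never sees the injected key. A minimal sketch (no dependencies; json5 <= 2.2.1 built parsed objects via this kind of assignment, while `JSON.parse` is unaffected because it defines own properties directly):

```javascript
// Assignment to "__proto__" goes through the Object.prototype setter and
// swaps the object's prototype -- the mechanism json5 <= 2.2.1 exposed.
const polluted = {};
polluted["__proto__"] = { isAdmin: true };

console.log(Object.keys(polluted)); // [] -- an own-key guard sees nothing
console.log(polluted.isAdmin);      // true -- inherited via the prototype

// JSON.parse is not vulnerable: it defines "__proto__" as an ordinary
// own property instead of assigning through the setter.
const safe = JSON.parse('{"__proto__": {"isAdmin": true}}');
console.log(safe.isAdmin);          // undefined
console.log(Object.keys(safe));     // [ '__proto__' ]
```

This is exactly why the advisory's `secCheckKeysSet` helper fails: it iterates own keys, and the polluted `isAdmin` lives on the prototype.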

Unnamed: 0: 21,639
id: 30,055,256,313
type: IssuesEvent
created_at: 2023-06-28 06:09:43
repo: h4sh5/npm-auto-scanner
repo_url: https://api.github.com/repos/h4sh5/npm-auto-scanner
action: opened
title: @emivespa/iih 0.1.0 has 1 guarddog issues
labels: npm-silent-process-execution
body:
```{"npm-silent-process-execution":[{"code":" const iiProcess = spawn('ii', ['-s', `${cli.flags.host}`, '-n', `${cli.flags.nick}`], {\n detached: true,\n stdio: 'ignore',\n });","location":"package/dist/cli.js:58","message":"This package is silently executing another executable"}]}```
index: 1.0
text_combine:
@emivespa/iih 0.1.0 has 1 guarddog issues - ```{"npm-silent-process-execution":[{"code":" const iiProcess = spawn('ii', ['-s', `${cli.flags.host}`, '-n', `${cli.flags.nick}`], {\n detached: true,\n stdio: 'ignore',\n });","location":"package/dist/cli.js:58","message":"This package is silently executing another executable"}]}```
label: process
text:
emivespa iih has guarddog issues npm silent process execution n detached true n stdio ignore n location package dist cli js message this package is silently executing another executable
binary_label: 1

Unnamed: 0: 12,205
id: 14,742,715,012
type: IssuesEvent
created_at: 2021-01-07 12:46:34
repo: kdjstudios/SABillingGitlab
repo_url: https://api.github.com/repos/kdjstudios/SABillingGitlab
action: closed
title: Site 068 Portland: Unable to process Credit Card Transactions in SA Billing
labels: anc-process anp-important ant-bug
body:
In GitLab by @kdjstudios on May 17, 2019, 13:00 **Submitted by:** <jeffrey.casey@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-05-17-88178/conversation **Server:** Internal **Client/Site:** Portland **Account:** **Issue:** When attempting to process and save Credit Card transactions SA Billing returns “Something went wrong” message. What is the status of SAB? When can we expect functionality to be returned?
index: 1.0
text_combine:
Site 068 Portland: Unable to process Credit Card Transactions in SA Billing - In GitLab by @kdjstudios on May 17, 2019, 13:00 **Submitted by:** <jeffrey.casey@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-05-17-88178/conversation **Server:** Internal **Client/Site:** Portland **Account:** **Issue:** When attempting to process and save Credit Card transactions SA Billing returns “Something went wrong” message. What is the status of SAB? When can we expect functionality to be returned?
label: process
text:
site portland unable to process credit card transactions in sa billing in gitlab by kdjstudios on may submitted by helpdesk server internal client site portland account issue when attempting to process and save credit card transactions sa billing returns “something went wrong” message what is the status of sab when can we expect functionality to be returned
binary_label: 1

Unnamed: 0: 15,341
id: 19,488,412,338
type: IssuesEvent
created_at: 2021-12-26 21:23:20
repo: MasterPlayer/adxl345-sv
repo_url: https://api.github.com/repos/MasterPlayer/adxl345-sv
action: opened
title: SingleTap Interrupt processing in hw
labels: enhancement hardware process
body:
add ability of component for processing in hardware SingleTap interrupts
index: 1.0
text_combine:
SingleTap Interrupt processing in hw - add ability of component for processing in hardware SingleTap interrupts
label: process
text:
singletap interrupt processing in hw add ability of component for processing in hardware singletap interrupts
binary_label: 1

Unnamed: 0: 20,346
id: 27,002,727,003
type: IssuesEvent
created_at: 2023-02-10 09:13:32
repo: bitfocus/companion-module-requests
repo_url: https://api.github.com/repos/bitfocus/companion-module-requests
action: opened
title: Audac - M2 - zonematrix processor
labels: NOT YET PROCESSED
body:
- [x] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested** The name of the device, hardware, or software you would like to control: Audac - M2 - zonematrix processor What you would like to be able to make it do from Companion: Would be great to have at least some basic control like output volume and the mute function. I would be great to have variables to know the current levels. Direct links or attachments to the ethernet control protocol or API: https://acpromedia.com/assets/uploads/product_downloads/c6eea4cd7ec8b24b74df351cf5e6f1cb.pdf
index: 1.0
text_combine:
Audac - M2 - zonematrix processor - - [x] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested** The name of the device, hardware, or software you would like to control: Audac - M2 - zonematrix processor What you would like to be able to make it do from Companion: Would be great to have at least some basic control like output volume and the mute function. I would be great to have variables to know the current levels. Direct links or attachments to the ethernet control protocol or API: https://acpromedia.com/assets/uploads/product_downloads/c6eea4cd7ec8b24b74df351cf5e6f1cb.pdf
label: process
text:
audac zonematrix processor i have researched the list of existing companion modules and requests and have determined this has not yet been requested the name of the device hardware or software you would like to control audac zonematrix processor what you would like to be able to make it do from companion would be great to have at least some basic control like output volume and the mute function i would be great to have variables to know the current levels direct links or attachments to the ethernet control protocol or api
binary_label: 1

Unnamed: 0: 24,146
id: 16,945,108,565
type: IssuesEvent
created_at: 2021-06-28 05:12:31
repo: Unidata/MetPy
repo_url: https://api.github.com/repos/Unidata/MetPy
action: closed
title: Upload image test results on AppVeyor
labels: Area: Infrastructure Type: Enhancement
body:
Looks like AppVeyor supports uploading build artifacts, so we should do so. Matplotlib's config: ```yaml artifacts: - path: dist\* name: packages - path: result_images\* name: result_images type: zip on_finish: on_failure: - python tools/visualize_tests.py --no-browser - echo zipping images after a failure... - 7z a result_images.zip result_images\ | grep -v "Compressing" - appveyor PushArtifact result_images.zip ```
index: 1.0
text_combine:
Upload image test results on AppVeyor - Looks like AppVeyor supports uploading build artifacts, so we should do so. Matplotlib's config: ```yaml artifacts: - path: dist\* name: packages - path: result_images\* name: result_images type: zip on_finish: on_failure: - python tools/visualize_tests.py --no-browser - echo zipping images after a failure... - 7z a result_images.zip result_images\ | grep -v "Compressing" - appveyor PushArtifact result_images.zip ```
label: non_process
text:
upload image test results on appveyor looks like appveyor supports uploading build artifacts so we should do so matplotlib s config yaml artifacts path dist name packages path result images name result images type zip on finish on failure python tools visualize tests py no browser echo zipping images after a failure a result images zip result images grep v compressing appveyor pushartifact result images zip
binary_label: 0

Unnamed: 0: 82,950
id: 16,065,172,718
type: IssuesEvent
created_at: 2021-04-23 17:54:02
repo: gleam-lang/gleam
repo_url: https://api.github.com/repos/gleam-lang/gleam
action: closed
title: Basic JS code generation
labels: area:codegen help wanted
body:
- [x] Expressions - [x] Ints (with bitwise operator hints to make then ints rather than floats) - [x] Floats - [x] Strings - [x] Function calls - [x] Constants - [x] Ints - [x] Floats - [x] Strings - [x] Module level functions
index: 1.0
text_combine:
Basic JS code generation - - [x] Expressions - [x] Ints (with bitwise operator hints to make then ints rather than floats) - [x] Floats - [x] Strings - [x] Function calls - [x] Constants - [x] Ints - [x] Floats - [x] Strings - [x] Module level functions
label: non_process
text:
basic js code generation expressions ints with bitwise operator hints to make then ints rather than floats floats strings function calls constants ints floats strings module level functions
binary_label: 0

Unnamed: 0: 19,784
id: 14,543,744,205
type: IssuesEvent
created_at: 2020-12-15 17:15:12
repo: coreos/zincati
repo_url: https://api.github.com/repos/coreos/zincati
action: closed
title: metrics: values from runtime config could be set earlier
labels: area/observability area/usability jira kind/friction
body:
From a race condition observed in https://github.com/coreos/fedora-coreos-tracker/issues/686. Most metrics are initially set in an ad-hoc fashion as soon as execution reaches the relevant codepath. In particular, this is the case for `zincati_update_agent_updates_enabled` which is first set when the actor [is started](https://github.com/coreos/zincati/blob/v0.0.14/src/update_agent/actor.rs#L35). This effectively means there is a non-zero timespan when the metrics server is already active but the metrics is yet unset (i.e. not present in the response to the clients). This is generally not a problem for Prometheus (which is able to track this "unset" state), but simpler consumers could be tricked by this racing behavior. The general case is not fixable. But specifically for info metrics coming from configuration, this issues can be avoided by eliminating the window of possible races. As those values are known right after parsing the config, they can be directly set into relevant metrics before starting the metrics service.
index: True
text_combine:
metrics: values from runtime config could be set earlier - From a race condition observed in https://github.com/coreos/fedora-coreos-tracker/issues/686. Most metrics are initially set in an ad-hoc fashion as soon as execution reaches the relevant codepath. In particular, this is the case for `zincati_update_agent_updates_enabled` which is first set when the actor [is started](https://github.com/coreos/zincati/blob/v0.0.14/src/update_agent/actor.rs#L35). This effectively means there is a non-zero timespan when the metrics server is already active but the metrics is yet unset (i.e. not present in the response to the clients). This is generally not a problem for Prometheus (which is able to track this "unset" state), but simpler consumers could be tricked by this racing behavior. The general case is not fixable. But specifically for info metrics coming from configuration, this issues can be avoided by eliminating the window of possible races. As those values are known right after parsing the config, they can be directly set into relevant metrics before starting the metrics service.
label: non_process
text:
metrics values from runtime config could be set earlier from a race condition observed in most metrics are initially set in an ad hoc fashion as soon as execution reaches the relevant codepath in particular this is the case for zincati update agent updates enabled which is first set when the actor this effectively means there is a non zero timespan when the metrics server is already active but the metrics is yet unset i e not present in the response to the clients this is generally not a problem for prometheus which is able to track this unset state but simpler consumers could be tricked by this racing behavior the general case is not fixable but specifically for info metrics coming from configuration this issues can be avoided by eliminating the window of possible races as those values are known right after parsing the config they can be directly set into relevant metrics before starting the metrics service
binary_label: 0

Unnamed: 0: 16,440
id: 21,317,069,446
type: IssuesEvent
created_at: 2022-04-16 13:16:19
repo: dita-ot/dita-ot
repo_url: https://api.github.com/repos/dita-ot/dita-ot
action: closed
title: DOMException when using two "ditavalref"'s on first level in map
labels: bug priority/medium preprocess preprocess/branch-filtering stale
body:
Reported here: https://groups.yahoo.com/neo/groups/dita-users/conversations/messages/43072 I'm attaching a sample project: [two-ditavalref-map.zip](https://github.com/dita-ot/dita-ot/files/1566943/two-ditavalref-map.zip) There are two ditaval files referenced on the first DITA Map level like: <map> <ditavalref href="ditavals/AWS.ditaval"/> <ditavalref href="ditavals/LINUX.ditaval"/> I can reproduce problem with both DITA OT 2.x and 3.x: BUILD FAILED C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-3.0.1\build.xml:45: The following error occurred while executing this line: C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-3.0.1\plugins\org.dita.base\build_preprocess2.xml:110: org.w3c.dom.DOMException: HIERARCHY_REQUEST_ERR: An attempt was made to insert a node where it is not permitted. at org.apache.xerces.dom.CoreDocumentImpl.insertBefore(Unknown Source) at org.apache.xerces.dom.NodeImpl.appendChild(Unknown Source) at org.dita.dost.module.filter.MapBranchFilterModule.splitBranches(MapBranchFilterModule.java:326) at org.dita.dost.module.filter.MapBranchFilterModule.processMap(MapBranchFilterModule.java:111) at org.dita.dost.module.filter.MapBranchFilterModule.execute(MapBranchFilterModule.java:78) at org.dita.dost.ant.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:163) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
index: 2.0
text_combine:
DOMException when using two "ditavalref"'s on first level in map - Reported here: https://groups.yahoo.com/neo/groups/dita-users/conversations/messages/43072 I'm attaching a sample project: [two-ditavalref-map.zip](https://github.com/dita-ot/dita-ot/files/1566943/two-ditavalref-map.zip) There are two ditaval files referenced on the first DITA Map level like: <map> <ditavalref href="ditavals/AWS.ditaval"/> <ditavalref href="ditavals/LINUX.ditaval"/> I can reproduce problem with both DITA OT 2.x and 3.x: BUILD FAILED C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-3.0.1\build.xml:45: The following error occurred while executing this line: C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-3.0.1\plugins\org.dita.base\build_preprocess2.xml:110: org.w3c.dom.DOMException: HIERARCHY_REQUEST_ERR: An attempt was made to insert a node where it is not permitted. at org.apache.xerces.dom.CoreDocumentImpl.insertBefore(Unknown Source) at org.apache.xerces.dom.NodeImpl.appendChild(Unknown Source) at org.dita.dost.module.filter.MapBranchFilterModule.splitBranches(MapBranchFilterModule.java:326) at org.dita.dost.module.filter.MapBranchFilterModule.processMap(MapBranchFilterModule.java:111) at org.dita.dost.module.filter.MapBranchFilterModule.execute(MapBranchFilterModule.java:78) at org.dita.dost.ant.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:163) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
label: process
text:
domexception when using two ditavalref s on first level in map reported here i m attaching a sample project there are two ditaval files referenced on the first dita map level like i can reproduce problem with both dita ot x and x build failed c users radu coravu desktop dita ot newest dita ot build xml the following error occurred while executing this line c users radu coravu desktop dita ot newest dita ot plugins org dita base build xml org dom domexception hierarchy request err an attempt was made to insert a node where it is not permitted at org apache xerces dom coredocumentimpl insertbefore unknown source at org apache xerces dom nodeimpl appendchild unknown source at org dita dost module filter mapbranchfiltermodule splitbranches mapbranchfiltermodule java at org dita dost module filter mapbranchfiltermodule processmap mapbranchfiltermodule java at org dita dost module filter mapbranchfiltermodule execute mapbranchfiltermodule java at org dita dost ant extensibleantinvoker execute extensibleantinvoker java at org apache tools ant unknownelement execute unknownelement java at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke unknown source
binary_label: 1

Unnamed: 0: 792,208
id: 27,950,583,601
type: IssuesEvent
created_at: 2023-03-24 08:31:19
repo: ahmedkaludi/accelerated-mobile-pages
repo_url: https://api.github.com/repos/ahmedkaludi/accelerated-mobile-pages
action: closed
title: While using the Yuki Blogger theme, when we edit the AMP customization, it shows a fatal error.
labels: bug [Priority: HIGH] Ready for Review
body:
While using the Yuki Blogger theme, when we edit the AMP customization, it shows a fatal error. Ref video: https://monosnap.com/direct/JisIlAW3qq7OarIcluBmCXv8ONyokO Ref ticket:https://magazine3.in/conversation/110482?folder_id=29
index: 1.0
text_combine:
While using the Yuki Blogger theme, when we edit the AMP customization, it shows a fatal error. - While using the Yuki Blogger theme, when we edit the AMP customization, it shows a fatal error. Ref video: https://monosnap.com/direct/JisIlAW3qq7OarIcluBmCXv8ONyokO Ref ticket:https://magazine3.in/conversation/110482?folder_id=29
label: non_process
text:
while using the yuki blogger theme when we edit the amp customization it shows a fatal error while using the yuki blogger theme when we edit the amp customization it shows a fatal error ref video ref ticket
0
5,942
8,766,817,440
IssuesEvent
2018-12-17 17:52:01
knative/serving
https://api.github.com/repos/knative/serving
closed
Enable object versioning for our release artifact GCR buckets
area/test-and-release kind/cleanup kind/process
<!-- Pro-tip: You can leave this block commented, and it still works! Select the appropriate areas for your issue: /area test-and-release Classify what kind of issue this is: /kind cleanup /kind process /assign @jonjohnsonjr --> ## Expected Behavior We have object versioning for our GCR bucket, to undo accidental bad delete/overwrites. ## Actual Behavior We don't. ## Additional Info
1.0
Enable object versioning for our release artifact GCR buckets - <!-- Pro-tip: You can leave this block commented, and it still works! Select the appropriate areas for your issue: /area test-and-release Classify what kind of issue this is: /kind cleanup /kind process /assign @jonjohnsonjr --> ## Expected Behavior We have object versioning for our GCR bucket, to undo accidental bad delete/overwrites. ## Actual Behavior We don't. ## Additional Info
process
enable object versioning for our release artifact gcr buckets pro tip you can leave this block commented and it still works select the appropriate areas for your issue area test and release classify what kind of issue this is kind cleanup kind process assign jonjohnsonjr expected behavior we have object versioning for our gcr bucket to undo accidental bad delete overwrites actual behavior we don t additional info
1
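The knative record above asks for object versioning on the release-artifact GCS bucket so that accidental deletes/overwrites can be undone. A minimal sketch of that change with `gsutil` follows; the bucket and object names are hypothetical placeholders, not the project's real buckets.

```shell
# Enable object versioning so overwritten or deleted release artifacts
# are retained as noncurrent generations instead of being lost.
gsutil versioning set on gs://my-release-artifacts

# Confirm the setting.
gsutil versioning get gs://my-release-artifacts

# The -a flag lists all generations of an object, including noncurrent ones.
gsutil ls -a gs://my-release-artifacts/some-image-manifest.json
```

With versioning on, a bad overwrite can be reverted by copying the prior generation back over the live object.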
106,039
16,664,152,758
IssuesEvent
2021-06-06 21:39:42
AlexRogalskiy/github-action-open-jscharts
https://api.github.com/repos/AlexRogalskiy/github-action-open-jscharts
opened
CVE-2020-28469 (Medium) detected in glob-parent-5.1.1.tgz
security vulnerability
## CVE-2020-28469 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-5.1.1.tgz</b></p></summary> <p>Extract the non-magic parent path from a glob string.</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p> <p>Path to dependency file: github-action-open-jscharts/package.json</p> <p>Path to vulnerable library: github-action-open-jscharts/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - eslint-7.20.0.tgz (Root Library) - :x: **glob-parent-5.1.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-open-jscharts/commit/b839600e4ce06e285536987cb21516e0b3e830aa">b839600e4ce06e285536987cb21516e0b3e830aa</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator. <p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution: glob-parent - 5.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-28469 (Medium) detected in glob-parent-5.1.1.tgz - ## CVE-2020-28469 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-5.1.1.tgz</b></p></summary> <p>Extract the non-magic parent path from a glob string.</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p> <p>Path to dependency file: github-action-open-jscharts/package.json</p> <p>Path to vulnerable library: github-action-open-jscharts/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - eslint-7.20.0.tgz (Root Library) - :x: **glob-parent-5.1.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-open-jscharts/commit/b839600e4ce06e285536987cb21516e0b3e830aa">b839600e4ce06e285536987cb21516e0b3e830aa</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator. 
<p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution: glob-parent - 5.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in glob parent tgz cve medium severity vulnerability vulnerable library glob parent tgz extract the non magic parent path from a glob string library home page a href path to dependency file github action open jscharts package json path to vulnerable library github action open jscharts node modules glob parent package json dependency hierarchy eslint tgz root library x glob parent tgz vulnerable library found in head commit a href vulnerability details this affects the package glob parent before the enclosure regex used to check for strings ending in enclosure containing path separator publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent step up your open source security game with whitesource
0
287,899
24,872,440,430
IssuesEvent
2022-10-27 16:11:26
datafuselabs/databend
https://api.github.com/repos/datafuselabs/databend
closed
Track: Sql logic test experience improvement
C-testing
### Summary This is a tracking issue for sql logic experience, anything about logic test can put here. ### tasks - [x] https://github.com/datafuselabs/databend/issues/6696 - [x] https://github.com/datafuselabs/databend/issues/6694 - [x] https://github.com/datafuselabs/databend/issues/6697 - [x] https://github.com/datafuselabs/databend/issues/6701 - [x] https://github.com/datafuselabs/databend/issues/6705 - [x] https://github.com/datafuselabs/databend/issues/6709 - [x] https://github.com/datafuselabs/databend/issues/7027
1.0
Track: Sql logic test experience improvement - ### Summary This is a tracking issue for sql logic experience, anything about logic test can put here. ### tasks - [x] https://github.com/datafuselabs/databend/issues/6696 - [x] https://github.com/datafuselabs/databend/issues/6694 - [x] https://github.com/datafuselabs/databend/issues/6697 - [x] https://github.com/datafuselabs/databend/issues/6701 - [x] https://github.com/datafuselabs/databend/issues/6705 - [x] https://github.com/datafuselabs/databend/issues/6709 - [x] https://github.com/datafuselabs/databend/issues/7027
non_process
track sql logic test experience improvement summary this is a tracking issue for sql logic experience anything about logic test can put here tasks
0
387,010
11,454,628,879
IssuesEvent
2020-02-06 17:25:58
Activiti/Activiti
https://api.github.com/repos/Activiti/Activiti
closed
[Core] Versioning Support Release concept by Deploying a set of related artefacts
core priority1 release-notes-required risk
When a new version of the process definition is deployed a new record is put in the database. Process instances for old and new versions can be live in parallel. But we also have json files that define the variables expected for the connectors in the process definition. Could a breaking change be made to one of these json files e.g. old version of connector and process expects a certain variable values and new version expects another. Imagine moving from 'yes/no' format to 'true/false' - if these values were referenced in the json then perhaps this could be a breaking change and if so without versioning the old instances could then get stuck. Could do with a more concrete example on this.
1.0
[Core] Versioning Support Release concept by Deploying a set of related artefacts - When a new version of the process definition is deployed a new record is put in the database. Process instances for old and new versions can be live in parallel. But we also have json files that define the variables expected for the connectors in the process definition. Could a breaking change be made to one of these json files e.g. old version of connector and process expects a certain variable values and new version expects another. Imagine moving from 'yes/no' format to 'true/false' - if these values were referenced in the json then perhaps this could be a breaking change and if so without versioning the old instances could then get stuck. Could do with a more concrete example on this.
non_process
versioning support release concept by deploying a set of related artefacts when a new version of the process definition is deployed a new record is put in the database process instances for old and new versions can be live in parallel but we also have json files that define the variables expected for the connectors in the process definition could a breaking change be made to one of these json files e g old version of connector and process expects a certain variable values and new version expects another imagine moving from yes no format to true false if these values were referenced in the json then perhaps this could be a breaking change and if so without versioning the old instances could then get stuck could do with a more concrete example on this
0
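The Activiti record above closes by asking for "a more concrete example" of the yes/no-to-true/false breaking change. The following is a hypothetical Python sketch of that failure mode — the connector function and variable names are invented for illustration, not Activiti code:

```python
def new_connector(payload: dict) -> str:
    """New connector contract: 'approved' must be a real JSON boolean."""
    approved = payload["approved"]
    if not isinstance(approved, bool):
        raise TypeError(f"expected bool, got {type(approved).__name__}")
    return "shipped" if approved else "rejected"


# Variables written by the old process definition use "yes"/"no" strings;
# the new definition writes booleans.
old_instance_vars = {"approved": "yes"}
new_instance_vars = {"approved": True}

print(new_connector(new_instance_vars))  # shipped

try:
    new_connector(old_instance_vars)     # an old, still-live instance
except TypeError as e:
    print(e)                             # expected bool, got str
```

Without versioning the connector JSON alongside the process definition, the old instance above has no compatible connector to call and gets stuck, which is exactly the risk the record describes.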
406,531
27,570,081,467
IssuesEvent
2023-03-08 08:34:06
GenericMappingTools/pygmt
https://api.github.com/repos/GenericMappingTools/pygmt
closed
Add a gallery example for the Figure.timestamp method
documentation
The `Figure.timestamp()` method was added in https://github.com/GenericMappingTools/pygmt/pull/2208, and a gallery example is needed (xref https://github.com/GenericMappingTools/pygmt/pull/2208#issuecomment-1451522619).
1.0
Add a gallery example for the Figure.timestamp method - The `Figure.timestamp()` method was added in https://github.com/GenericMappingTools/pygmt/pull/2208, and a gallery example is needed (xref https://github.com/GenericMappingTools/pygmt/pull/2208#issuecomment-1451522619).
non_process
add a gallery example for the figure timestamp method the figure timestamp method was added in and a gallery example is needed xref
0
115,721
24,803,497,662
IssuesEvent
2022-10-25 00:55:34
alefragnani/vscode-bookmarks
https://api.github.com/repos/alefragnani/vscode-bookmarks
closed
[BUG] - Repeated gutter icon on line wrap
bug caused by vscode
This doesn't seem to be intended, at least I can't see the point of having that many bookmark icons in the gutter. <!-- Please search existing issues to avoid creating duplicates. --> <!-- Use Help > Report Issue to prefill some of these. --> **Environment/version** - Extension version: v13.3.1 - VSCode version: 1.70.2 - OS version: macOS Monterey **Steps to reproduce** 1. Toggle line wrap on 2. Write a long line that wraps around 3. Bookmark it <img width="825" alt="image" src="https://user-images.githubusercontent.com/2845433/185999846-5f126b89-e143-426c-9cdb-518f5c098f22.png">
1.0
[BUG] - Repeated gutter icon on line wrap - This doesn't seem to be intended, at least I can't see the point of having that many bookmark icons in the gutter. <!-- Please search existing issues to avoid creating duplicates. --> <!-- Use Help > Report Issue to prefill some of these. --> **Environment/version** - Extension version: v13.3.1 - VSCode version: 1.70.2 - OS version: macOS Monterey **Steps to reproduce** 1. Toggle line wrap on 2. Write a long line that wraps around 3. Bookmark it <img width="825" alt="image" src="https://user-images.githubusercontent.com/2845433/185999846-5f126b89-e143-426c-9cdb-518f5c098f22.png">
non_process
repeated gutter icon on line wrap this doesn t seem to be intended at least i can t see the point of having that many bookmark icons in the gutter report issue to prefill some of these environment version extension version vscode version os version macos monterey steps to reproduce toggle line wrap on write a long line that wraps around bookmark it img width alt image src
0
16,845
22,096,552,001
IssuesEvent
2022-06-01 10:35:56
camunda/zeebe
https://api.github.com/repos/camunda/zeebe
closed
IllegalStateException when writing decision evaluation event
kind/bug severity/mid area/reliability team/process-automation
**Describe the bug** <!-- A clear and concise description of what the bug is. --> When trying to write the decision evaluation event an `IllegalArgumentException` is thrown. This is because when searching for decision by decision requirements key multiple results with the same decision id are returned: ```java final var decisionKeysByDecisionId = decisionState .findDecisionsByDecisionRequirementsKey(decision.getDecisionRequirementsKey()) .stream() .collect( Collectors.toMap( persistedDecision -> bufferAsString(persistedDecision.getDecisionId()), DecisionInfo::new)); ``` These duplicate decision id cause the `toMap` function to fail, as no merge function is provided. The found decisions do all have a different version. **To Reproduce** <!-- Steps to reproduce the behavior If possible add a minimal reproducer code sample - when using the Java client: https://github.com/zeebe-io/zeebe-test-template-java --> It was a challenge to reproduce this issue but I found a way to do this. It requires 2 DRD's that both contain a decision with the same id and a process which contains a business rule task referencing this decision id. [Repro files.zip](https://github.com/camunda/zeebe/files/8603769/Repro.files.zip) Next follow these steps: 1. Deploy `translateDay.dmn` 2. Deploy `translateMonth.dmn` 3. Without making any changes redeploy `translateDay.dmn` 4. Deploy `translateProcess.dmn` 5. Start a PI: `zbctl create instance translateProcess --insecure --variables '{"day":"monday","month":"april"}'` At this point an exception should be thrown. **Expected behavior** <!-- A clear and concise description of what you expected to happen. --> No exception should occur. **Log/Stacktrace** <!-- If possible add the full stacktrace or Zeebe log which contains the issue. 
--> <details><summary>Full Stacktrace</summary> <p> ``` java.lang.IllegalStateException: Duplicate key translate (attempted merging values DecisionInfo[key=2251799813685250, version=1] and DecisionInfo[key=2251799813685255, version=3]) at java.util.stream.Collectors.duplicateKeyException(Collectors.java:135) ~[?:?] at java.util.stream.Collectors.lambda$uniqKeysMapAccumulator$1(Collectors.java:182) ~[?:?] at java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169) ~[?:?] at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?] at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?] at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?] at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?] at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?] at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?] at io.camunda.zeebe.engine.processing.bpmn.behavior.BpmnDecisionBehavior.writeDecisionEvaluationEvent(BpmnDecisionBehavior.java:233) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.behavior.BpmnDecisionBehavior.lambda$evaluateDecision$3(BpmnDecisionBehavior.java:114) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.flatMap(Either.java:366) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.behavior.BpmnDecisionBehavior.evaluateDecision(BpmnDecisionBehavior.java:109) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor$CalledDecisionBehavior.lambda$onActivate$0(BusinessRuleTaskProcessor.java:89) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.flatMap(Either.java:366) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor$CalledDecisionBehavior.onActivate(BusinessRuleTaskProcessor.java:89) ~[classes/:?] 
at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor.onActivate(BusinessRuleTaskProcessor.java:40) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor.onActivate(BusinessRuleTaskProcessor.java:21) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.lambda$processEvent$2(BpmnStreamProcessor.java:128) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.ifRightOrLeft(Either.java:381) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.processEvent(BpmnStreamProcessor.java:127) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.lambda$processRecord$0(BpmnStreamProcessor.java:110) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.ifRightOrLeft(Either.java:381) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.processRecord(BpmnStreamProcessor.java:107) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.TypedRecordProcessor.processRecord(TypedRecordProcessor.java:58) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.lambda$processInTransaction$3(ProcessingStateMachine.java:300) ~[classes/:?] at io.camunda.zeebe.db.impl.rocksdb.transaction.ZeebeTransaction.run(ZeebeTransaction.java:84) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.processInTransaction(ProcessingStateMachine.java:290) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.processCommand(ProcessingStateMachine.java:253) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.tryToReadNextRecord(ProcessingStateMachine.java:213) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.readNextRecord(ProcessingStateMachine.java:189) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorJob.invoke(ActorJob.java:79) ~[classes/:?] 
at io.camunda.zeebe.util.sched.ActorJob.execute(ActorJob.java:44) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorTask.execute(ActorTask.java:122) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorThread.executeCurrentTask(ActorThread.java:97) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorThread.doWork(ActorThread.java:80) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorThread.run(ActorThread.java:189) ~[classes/:?] ``` </p> </details> **Environment:** - OS: Cloud - Zeebe Version: Seen on 8.0.0, also happens on latest main
1.0
IllegalStateException when writing decision evaluation event - **Describe the bug** <!-- A clear and concise description of what the bug is. --> When trying to write the decision evaluation event an `IllegalArgumentException` is thrown. This is because when searching for decision by decision requirements key multiple results with the same decision id are returned: ```java final var decisionKeysByDecisionId = decisionState .findDecisionsByDecisionRequirementsKey(decision.getDecisionRequirementsKey()) .stream() .collect( Collectors.toMap( persistedDecision -> bufferAsString(persistedDecision.getDecisionId()), DecisionInfo::new)); ``` These duplicate decision id cause the `toMap` function to fail, as no merge function is provided. The found decisions do all have a different version. **To Reproduce** <!-- Steps to reproduce the behavior If possible add a minimal reproducer code sample - when using the Java client: https://github.com/zeebe-io/zeebe-test-template-java --> It was a challenge to reproduce this issue but I found a way to do this. It requires 2 DRD's that both contain a decision with the same id and a process which contains a business rule task referencing this decision id. [Repro files.zip](https://github.com/camunda/zeebe/files/8603769/Repro.files.zip) Next follow these steps: 1. Deploy `translateDay.dmn` 2. Deploy `translateMonth.dmn` 3. Without making any changes redeploy `translateDay.dmn` 4. Deploy `translateProcess.dmn` 5. Start a PI: `zbctl create instance translateProcess --insecure --variables '{"day":"monday","month":"april"}'` At this point an exception should be thrown. **Expected behavior** <!-- A clear and concise description of what you expected to happen. --> No exception should occur. **Log/Stacktrace** <!-- If possible add the full stacktrace or Zeebe log which contains the issue. 
--> <details><summary>Full Stacktrace</summary> <p> ``` java.lang.IllegalStateException: Duplicate key translate (attempted merging values DecisionInfo[key=2251799813685250, version=1] and DecisionInfo[key=2251799813685255, version=3]) at java.util.stream.Collectors.duplicateKeyException(Collectors.java:135) ~[?:?] at java.util.stream.Collectors.lambda$uniqKeysMapAccumulator$1(Collectors.java:182) ~[?:?] at java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169) ~[?:?] at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?] at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?] at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?] at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?] at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?] at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?] at io.camunda.zeebe.engine.processing.bpmn.behavior.BpmnDecisionBehavior.writeDecisionEvaluationEvent(BpmnDecisionBehavior.java:233) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.behavior.BpmnDecisionBehavior.lambda$evaluateDecision$3(BpmnDecisionBehavior.java:114) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.flatMap(Either.java:366) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.behavior.BpmnDecisionBehavior.evaluateDecision(BpmnDecisionBehavior.java:109) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor$CalledDecisionBehavior.lambda$onActivate$0(BusinessRuleTaskProcessor.java:89) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.flatMap(Either.java:366) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor$CalledDecisionBehavior.onActivate(BusinessRuleTaskProcessor.java:89) ~[classes/:?] 
at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor.onActivate(BusinessRuleTaskProcessor.java:40) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.task.BusinessRuleTaskProcessor.onActivate(BusinessRuleTaskProcessor.java:21) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.lambda$processEvent$2(BpmnStreamProcessor.java:128) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.ifRightOrLeft(Either.java:381) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.processEvent(BpmnStreamProcessor.java:127) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.lambda$processRecord$0(BpmnStreamProcessor.java:110) ~[classes/:?] at io.camunda.zeebe.util.Either$Right.ifRightOrLeft(Either.java:381) ~[classes/:?] at io.camunda.zeebe.engine.processing.bpmn.BpmnStreamProcessor.processRecord(BpmnStreamProcessor.java:107) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.TypedRecordProcessor.processRecord(TypedRecordProcessor.java:58) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.lambda$processInTransaction$3(ProcessingStateMachine.java:300) ~[classes/:?] at io.camunda.zeebe.db.impl.rocksdb.transaction.ZeebeTransaction.run(ZeebeTransaction.java:84) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.processInTransaction(ProcessingStateMachine.java:290) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.processCommand(ProcessingStateMachine.java:253) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.tryToReadNextRecord(ProcessingStateMachine.java:213) ~[classes/:?] at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.readNextRecord(ProcessingStateMachine.java:189) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorJob.invoke(ActorJob.java:79) ~[classes/:?] 
at io.camunda.zeebe.util.sched.ActorJob.execute(ActorJob.java:44) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorTask.execute(ActorTask.java:122) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorThread.executeCurrentTask(ActorThread.java:97) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorThread.doWork(ActorThread.java:80) ~[classes/:?] at io.camunda.zeebe.util.sched.ActorThread.run(ActorThread.java:189) ~[classes/:?] ``` </p> </details> **Environment:** - OS: Cloud - Zeebe Version: Seen on 8.0.0, also happens on latest main
process
illegalstateexception when writing decision evaluation event describe the bug when trying to write the decision evaluation event an illegalargumentexception is thrown this is because when searching for decision by decision requirements key multiple results with the same decision id are returned java final var decisionkeysbydecisionid decisionstate finddecisionsbydecisionrequirementskey decision getdecisionrequirementskey stream collect collectors tomap persisteddecision bufferasstring persisteddecision getdecisionid decisioninfo new these duplicate decision id cause the tomap function to fail as no merge function is provided the found decisions do all have a different version to reproduce steps to reproduce the behavior if possible add a minimal reproducer code sample when using the java client it was a challenge to reproduce this issue but i found a way to do this it requires drd s that both contain a decision with the same id and a process which contains a business rule task referencing this decision id next follow these steps deploy translateday dmn deploy translatemonth dmn without making any changes redeploy translateday dmn deploy translateprocess dmn start a pi zbctl create instance translateprocess insecure variables day monday month april at this point an exception should be thrown expected behavior no exception should occur log stacktrace full stacktrace java lang illegalstateexception duplicate key translate attempted merging values decisioninfo and decisioninfo at java util stream collectors duplicatekeyexception collectors java at java util stream collectors lambda uniqkeysmapaccumulator collectors java at java util stream reduceops accept reduceops java at java util arraylist arraylistspliterator foreachremaining arraylist java at java util stream abstractpipeline copyinto abstractpipeline java at java util stream abstractpipeline wrapandcopyinto abstractpipeline java at java util stream reduceops reduceop evaluatesequential reduceops java at java 
util stream abstractpipeline evaluate abstractpipeline java at java util stream referencepipeline collect referencepipeline java at io camunda zeebe engine processing bpmn behavior bpmndecisionbehavior writedecisionevaluationevent bpmndecisionbehavior java at io camunda zeebe engine processing bpmn behavior bpmndecisionbehavior lambda evaluatedecision bpmndecisionbehavior java at io camunda zeebe util either right flatmap either java at io camunda zeebe engine processing bpmn behavior bpmndecisionbehavior evaluatedecision bpmndecisionbehavior java at io camunda zeebe engine processing bpmn task businessruletaskprocessor calleddecisionbehavior lambda onactivate businessruletaskprocessor java at io camunda zeebe util either right flatmap either java at io camunda zeebe engine processing bpmn task businessruletaskprocessor calleddecisionbehavior onactivate businessruletaskprocessor java at io camunda zeebe engine processing bpmn task businessruletaskprocessor onactivate businessruletaskprocessor java at io camunda zeebe engine processing bpmn task businessruletaskprocessor onactivate businessruletaskprocessor java at io camunda zeebe engine processing bpmn bpmnstreamprocessor lambda processevent bpmnstreamprocessor java at io camunda zeebe util either right ifrightorleft either java at io camunda zeebe engine processing bpmn bpmnstreamprocessor processevent bpmnstreamprocessor java at io camunda zeebe engine processing bpmn bpmnstreamprocessor lambda processrecord bpmnstreamprocessor java at io camunda zeebe util either right ifrightorleft either java at io camunda zeebe engine processing bpmn bpmnstreamprocessor processrecord bpmnstreamprocessor java at io camunda zeebe engine processing streamprocessor typedrecordprocessor processrecord typedrecordprocessor java at io camunda zeebe engine processing streamprocessor processingstatemachine lambda processintransaction processingstatemachine java at io camunda zeebe db impl rocksdb transaction zeebetransaction run 
zeebetransaction java at io camunda zeebe engine processing streamprocessor processingstatemachine processintransaction processingstatemachine java at io camunda zeebe engine processing streamprocessor processingstatemachine processcommand processingstatemachine java at io camunda zeebe engine processing streamprocessor processingstatemachine trytoreadnextrecord processingstatemachine java at io camunda zeebe engine processing streamprocessor processingstatemachine readnextrecord processingstatemachine java at io camunda zeebe util sched actorjob invoke actorjob java at io camunda zeebe util sched actorjob execute actorjob java at io camunda zeebe util sched actortask execute actortask java at io camunda zeebe util sched actorthread executecurrenttask actorthread java at io camunda zeebe util sched actorthread dowork actorthread java at io camunda zeebe util sched actorthread run actorthread java environment os cloud zeebe version seen on also happens on latest main
1
243,628
26,286,969,558
IssuesEvent
2023-01-07 23:46:05
MValle21/riposte
https://api.github.com/repos/MValle21/riposte
opened
CVE-2021-37137 (High) detected in netty-codec-4.1.49.Final.jar
security vulnerability
## CVE-2021-37137 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-4.1.49.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p> <p>Path to dependency file: /riposte-servlet-api-adapter/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec/4.1.49.Final/20218de83c906348283f548c255650fd06030424/netty-codec-4.1.49.Final.jar</p> <p> Dependency Hierarchy: - netty-codec-http-4.1.49.Final.jar (Root Library) - :x: **netty-codec-4.1.49.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/MValle21/riposte/commit/d64100a6a3eec7fcaa704df6c76f50ee0fe55763">d64100a6a3eec7fcaa704df6c76f50ee0fe55763</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The Snappy frame decoder function doesn't restrict the chunk length which may lead to excessive memory usage. Beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well. This vulnerability can be triggered by supplying malicious input that decompresses to a very big size (via a network stream or a file) or by sending a huge skippable chunk. 
<p>Publish Date: 2021-10-19 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37137>CVE-2021-37137</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-9vjp-v76f-g363">https://github.com/advisories/GHSA-9vjp-v76f-g363</a></p> <p>Release Date: 2021-10-19</p> <p>Fix Resolution (io.netty:netty-codec): 4.1.68.Final</p> <p>Direct dependency fix Resolution (io.netty:netty-codec-http): 4.1.68.Final</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue
True
CVE-2021-37137 (High) detected in netty-codec-4.1.49.Final.jar - ## CVE-2021-37137 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-4.1.49.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p> <p>Path to dependency file: /riposte-servlet-api-adapter/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec/4.1.49.Final/20218de83c906348283f548c255650fd06030424/netty-codec-4.1.49.Final.jar</p> <p> Dependency Hierarchy: - netty-codec-http-4.1.49.Final.jar (Root Library) - :x: **netty-codec-4.1.49.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/MValle21/riposte/commit/d64100a6a3eec7fcaa704df6c76f50ee0fe55763">d64100a6a3eec7fcaa704df6c76f50ee0fe55763</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The Snappy frame decoder function doesn't restrict the chunk length which may lead to excessive memory usage. Beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well. This vulnerability can be triggered by supplying malicious input that decompresses to a very big size (via a network stream or a file) or by sending a huge skippable chunk. 
<p>Publish Date: 2021-10-19 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37137>CVE-2021-37137</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-9vjp-v76f-g363">https://github.com/advisories/GHSA-9vjp-v76f-g363</a></p> <p>Release Date: 2021-10-19</p> <p>Fix Resolution (io.netty:netty-codec): 4.1.68.Final</p> <p>Direct dependency fix Resolution (io.netty:netty-codec-http): 4.1.68.Final</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue
non_process
cve high detected in netty codec final jar cve high severity vulnerability vulnerable library netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file riposte servlet api adapter build gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec final netty codec final jar dependency hierarchy netty codec http final jar root library x netty codec final jar vulnerable library found in head commit a href found in base branch master vulnerability details the snappy frame decoder function doesn t restrict the chunk length which may lead to excessive memory usage beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well this vulnerability can be triggered by supplying malicious input that decompresses to a very big size via a network stream or a file or by sending a huge skippable chunk publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec final direct dependency fix resolution io netty netty codec http final rescue worker helmet automatic remediation is available for this issue
0
80,312
3,560,834,504
IssuesEvent
2016-01-23 10:39:10
jadjoubran/laravel5-angular-material-starter
https://api.github.com/repos/jadjoubran/laravel5-angular-material-starter
closed
Update angular folder structure/naming & fix generators accordingly
enhancement Priority: High
- [x] `angular/app` should be renamed to `angular/app/pages/` - [x] `angular/directives` should be renamed to `angular/app/components` - [x] `.directive.js` should become `.component.js` - [x] page_name`.page.html` (for page views) - [x] finalize other breaking changes - [x] update documentation - [x] upgrade guide Most of these tasks help in angular 2 migration
1.0
Update angular folder structure/naming & fix generators accordingly - - [x] `angular/app` should be renamed to `angular/app/pages/` - [x] `angular/directives` should be renamed to `angular/app/components` - [x] `.directive.js` should become `.component.js` - [x] page_name`.page.html` (for page views) - [x] finalize other breaking changes - [x] update documentation - [x] upgrade guide Most of these tasks help in angular 2 migration
non_process
update angular folder structure naming fix generators accordingly angular app should be renamed to angular app pages angular directives should be renamed to angular app components directive js should become component js page name page html for page views finalize other breaking changes update documentation upgrade guide most of these tasks help in angular migration
0
241,929
20,173,542,184
IssuesEvent
2022-02-10 12:36:57
woocommerce/woocommerce-gutenberg-products-block
https://api.github.com/repos/woocommerce/woocommerce-gutenberg-products-block
closed
Critical flows: Merchant → Cart → Can add inner blocks
type: enhancement ◼️ block: cart category: tests
### Note Create e2e tests to verify that a merchant can add inner blocks. ### File `tests/e2e/specs/merchant/cart-inner-blocks.test`
1.0
Critical flows: Merchant → Cart → Can add inner blocks - ### Note Create e2e tests to verify that a merchant can add inner blocks. ### File `tests/e2e/specs/merchant/cart-inner-blocks.test`
non_process
critical flows merchant → cart → can add inner blocks note create tests to verify that a merchant can add inner blocks file tests specs merchant cart inner blocks test
0
103,607
16,602,936,223
IssuesEvent
2021-06-01 22:17:46
gms-ws-sandbox/nibrs
https://api.github.com/repos/gms-ws-sandbox/nibrs
opened
CVE-2018-11040 (Medium) detected in spring-webmvc-4.3.11.RELEASE.jar, spring-web-4.3.11.RELEASE.jar
security vulnerability
## CVE-2018-11040 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>spring-webmvc-4.3.11.RELEASE.jar</b>, <b>spring-web-4.3.11.RELEASE.jar</b></p></summary> <p> <details><summary><b>spring-webmvc-4.3.11.RELEASE.jar</b></p></summary> <p>Spring Web MVC</p> <p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p> <p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p> <p>Path to vulnerable library: nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/spring-webmvc-4.3.11.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-webmvc/4.3.11.RELEASE/spring-webmvc-4.3.11.RELEASE.jar</p> <p> Dependency Hierarchy: - :x: **spring-webmvc-4.3.11.RELEASE.jar** (Vulnerable Library) </details> <details><summary><b>spring-web-4.3.11.RELEASE.jar</b></p></summary> <p>Spring Web</p> <p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p> <p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/4.3.11.RELEASE/spring-web-4.3.11.RELEASE.jar,nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/spring-web-4.3.11.RELEASE.jar</p> <p> Dependency Hierarchy: - :x: **spring-web-4.3.11.RELEASE.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/gms-ws-sandbox/nibrs/commit/9fb1c19bd26c2113d1961640de126a33eacdc946">9fb1c19bd26c2113d1961640de126a33eacdc946</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> 
<p> Spring Framework, versions 5.0.x prior to 5.0.7 and 4.3.x prior to 4.3.18 and older unsupported versions, allows web applications to enable cross-domain requests via JSONP (JSON with Padding) through AbstractJsonpResponseBodyAdvice for REST controllers and MappingJackson2JsonView for browser requests. Both are not enabled by default in Spring Framework nor Spring Boot, however, when MappingJackson2JsonView is configured in an application, JSONP support is automatically ready to use through the "jsonp" and "callback" JSONP parameters, enabling cross-domain requests. <p>Publish Date: 2018-06-25 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11040>CVE-2018-11040</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11040">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11040</a></p> <p>Release Date: 2018-06-25</p> <p>Fix Resolution: org.springframework:spring-web:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-webmvc:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-websocket:5.0.7.RELEASE,4.3.18.RELEASE</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-webmvc","packageVersion":"4.3.11.RELEASE","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.springframework:spring-webmvc:4.3.11.RELEASE","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-web:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-webmvc:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-websocket:5.0.7.RELEASE,4.3.18.RELEASE"},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-web","packageVersion":"4.3.11.RELEASE","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.springframework:spring-web:4.3.11.RELEASE","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-web:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-webmvc:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-websocket:5.0.7.RELEASE,4.3.18.RELEASE"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-11040","vulnerabilityDetails":"Spring Framework, versions 5.0.x prior to 5.0.7 and 4.3.x prior to 4.3.18 and older unsupported versions, allows web applications to enable 
cross-domain requests via JSONP (JSON with Padding) through AbstractJsonpResponseBodyAdvice for REST controllers and MappingJackson2JsonView for browser requests. Both are not enabled by default in Spring Framework nor Spring Boot, however, when MappingJackson2JsonView is configured in an application, JSONP support is automatically ready to use through the \"jsonp\" and \"callback\" JSONP parameters, enabling cross-domain requests.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11040","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2018-11040 (Medium) detected in spring-webmvc-4.3.11.RELEASE.jar, spring-web-4.3.11.RELEASE.jar - ## CVE-2018-11040 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>spring-webmvc-4.3.11.RELEASE.jar</b>, <b>spring-web-4.3.11.RELEASE.jar</b></p></summary> <p> <details><summary><b>spring-webmvc-4.3.11.RELEASE.jar</b></p></summary> <p>Spring Web MVC</p> <p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p> <p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p> <p>Path to vulnerable library: nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/spring-webmvc-4.3.11.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-webmvc/4.3.11.RELEASE/spring-webmvc-4.3.11.RELEASE.jar</p> <p> Dependency Hierarchy: - :x: **spring-webmvc-4.3.11.RELEASE.jar** (Vulnerable Library) </details> <details><summary><b>spring-web-4.3.11.RELEASE.jar</b></p></summary> <p>Spring Web</p> <p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p> <p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/4.3.11.RELEASE/spring-web-4.3.11.RELEASE.jar,nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/spring-web-4.3.11.RELEASE.jar</p> <p> Dependency Hierarchy: - :x: **spring-web-4.3.11.RELEASE.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/gms-ws-sandbox/nibrs/commit/9fb1c19bd26c2113d1961640de126a33eacdc946">9fb1c19bd26c2113d1961640de126a33eacdc946</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img 
src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Spring Framework, versions 5.0.x prior to 5.0.7 and 4.3.x prior to 4.3.18 and older unsupported versions, allows web applications to enable cross-domain requests via JSONP (JSON with Padding) through AbstractJsonpResponseBodyAdvice for REST controllers and MappingJackson2JsonView for browser requests. Both are not enabled by default in Spring Framework nor Spring Boot, however, when MappingJackson2JsonView is configured in an application, JSONP support is automatically ready to use through the "jsonp" and "callback" JSONP parameters, enabling cross-domain requests. <p>Publish Date: 2018-06-25 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11040>CVE-2018-11040</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11040">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11040</a></p> <p>Release Date: 2018-06-25</p> <p>Fix Resolution: org.springframework:spring-web:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-webmvc:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-websocket:5.0.7.RELEASE,4.3.18.RELEASE</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-webmvc","packageVersion":"4.3.11.RELEASE","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.springframework:spring-webmvc:4.3.11.RELEASE","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-web:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-webmvc:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-websocket:5.0.7.RELEASE,4.3.18.RELEASE"},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-web","packageVersion":"4.3.11.RELEASE","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.springframework:spring-web:4.3.11.RELEASE","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-web:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-webmvc:5.0.7.RELEASE,4.3.18.RELEASE,org.springframework:spring-websocket:5.0.7.RELEASE,4.3.18.RELEASE"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-11040","vulnerabilityDetails":"Spring Framework, versions 5.0.x prior to 5.0.7 and 4.3.x prior to 4.3.18 and older unsupported versions, allows web applications to enable 
cross-domain requests via JSONP (JSON with Padding) through AbstractJsonpResponseBodyAdvice for REST controllers and MappingJackson2JsonView for browser requests. Both are not enabled by default in Spring Framework nor Spring Boot, however, when MappingJackson2JsonView is configured in an application, JSONP support is automatically ready to use through the \"jsonp\" and \"callback\" JSONP parameters, enabling cross-domain requests.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11040","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in spring webmvc release jar spring web release jar cve medium severity vulnerability vulnerable libraries spring webmvc release jar spring web release jar spring webmvc release jar spring web mvc library home page a href path to dependency file nibrs tools nibrs fbi service pom xml path to vulnerable library nibrs tools nibrs fbi service target nibrs fbi service web inf lib spring webmvc release jar home wss scanner repository org springframework spring webmvc release spring webmvc release jar dependency hierarchy x spring webmvc release jar vulnerable library spring web release jar spring web library home page a href path to dependency file nibrs tools nibrs fbi service pom xml path to vulnerable library home wss scanner repository org springframework spring web release spring web release jar nibrs tools nibrs fbi service target nibrs fbi service web inf lib spring web release jar dependency hierarchy x spring web release jar vulnerable library found in head commit a href found in base branch master vulnerability details spring framework versions x prior to and x prior to and older unsupported versions allows web applications to enable cross domain requests via jsonp json with padding through abstractjsonpresponsebodyadvice for rest controllers and for browser requests both are not enabled by default in spring framework nor spring boot however when is configured in an application jsonp support is automatically ready to use through the jsonp and callback jsonp parameters enabling cross domain requests publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring web release release 
org springframework spring webmvc release release org springframework spring websocket release release isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree org springframework spring webmvc release isminimumfixversionavailable true minimumfixversion org springframework spring web release release org springframework spring webmvc release release org springframework spring websocket release release packagetype java groupid org springframework packagename spring web packageversion release packagefilepaths istransitivedependency false dependencytree org springframework spring web release isminimumfixversionavailable true minimumfixversion org springframework spring web release release org springframework spring webmvc release release org springframework spring websocket release release basebranches vulnerabilityidentifier cve vulnerabilitydetails spring framework versions x prior to and x prior to and older unsupported versions allows web applications to enable cross domain requests via jsonp json with padding through abstractjsonpresponsebodyadvice for rest controllers and for browser requests both are not enabled by default in spring framework nor spring boot however when is configured in an application jsonp support is automatically ready to use through the jsonp and callback jsonp parameters enabling cross domain requests vulnerabilityurl
0
21,642
30,056,517,079
IssuesEvent
2023-06-28 07:15:40
0xPolygonMiden/miden-vm
https://api.github.com/repos/0xPolygonMiden/miden-vm
closed
Question: Why is Fmp (free memory pointer) required?
processor
The fmp takes up one column in the execution trace and has two associated vm operations. It looks like its used for generating memory addresses for procedure locals? I'm trying to understand why we need the fmp. Couldn't we just delegate calculation of the memory addresses for procedure locals to the compiler? This would allows us to free two instruction slots and a trace column. I'm most likely missing something. Any information / explanation appreciated.
1.0
Question: Why is Fmp (free memory pointer) required? - The fmp takes up one column in the execution trace and has two associated vm operations. It looks like its used for generating memory addresses for procedure locals? I'm trying to understand why we need the fmp. Couldn't we just delegate calculation of the memory addresses for procedure locals to the compiler? This would allows us to free two instruction slots and a trace column. I'm most likely missing something. Any information / explanation appreciated.
process
question why is fmp free memory pointer required the fmp takes up one column in the execution trace and has two associated vm operations it looks like its used for generating memory addresses for procedure locals i m trying to understand why we need the fmp couldn t we just delegate calculation of the memory addresses for procedure locals to the compiler this would allows us to free two instruction slots and a trace column i m most likely missing something any information explanation appreciated
1
36,399
2,799,326,196
IssuesEvent
2015-05-12 23:43:22
wordpress-mobile/WordPress-Android
https://api.github.com/repos/wordpress-mobile/WordPress-Android
closed
Navigation Drawer: Notifications tap highlight shows over border
core notifications priority-low
When you tap on Notifications, the highlight shows over the border at the bottom of the header: ![screen shot 2015-03-16 at 11 42 58 am](https://cloud.githubusercontent.com/assets/789137/6673613/c7628c84-cbd1-11e4-850d-c0cb2cf1e569.png) Sony Z3 compact, Android 4.4.4
1.0
Navigation Drawer: Notifications tap highlight shows over border - When you tap on Notifications, the highlight shows over the border at the bottom of the header: ![screen shot 2015-03-16 at 11 42 58 am](https://cloud.githubusercontent.com/assets/789137/6673613/c7628c84-cbd1-11e4-850d-c0cb2cf1e569.png) Sony Z3 compact, Android 4.4.4
non_process
navigation drawer notifications tap highlight shows over border when you tap on notifications the highlight shows over the border at the bottom of the header sony compact android
0
10,018
13,043,914,557
IssuesEvent
2020-07-29 03:02:15
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `Compress` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `Compress` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `Compress` from TiDB - ## Description Port the scalar function `Compress` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function compress from tidb description port the scalar function compress from tidb to coprocessor score mentor s breeswish recommended skills rust programming learning materials already implemented expressions ported from tidb
1
145,841
11,710,120,744
IssuesEvent
2020-03-08 22:47:37
JuliaDocs/Documenter.jl
https://api.github.com/repos/JuliaDocs/Documenter.jl
closed
PDF/LaTeX test phase failing on tags
Type: Tests
https://travis-ci.org/JuliaDocs/Documenter.jl/builds/644089901 https://travis-ci.org/JuliaDocs/Documenter.jl/builds/638646797 ... https://travis-ci.org/JuliaDocs/Documenter.jl/builds/615449662 It blocks the documentation deployment phase, so that phase has to be restarted manually for tags as long as this is not fixed.
1.0
PDF/LaTeX test phase failing on tags - https://travis-ci.org/JuliaDocs/Documenter.jl/builds/644089901 https://travis-ci.org/JuliaDocs/Documenter.jl/builds/638646797 ... https://travis-ci.org/JuliaDocs/Documenter.jl/builds/615449662 It blocks the documentation deployment phase, so that phase has to be restarted manually for tags as long as this is not fixed.
non_process
pdf latex test phase failing on tags it blocks the documentation deployment phase so that phase has to be restarted manually for tags as long as this is not fixed
0
17,732
23,641,606,097
IssuesEvent
2022-08-25 17:36:56
apache/arrow-rs
https://api.github.com/repos/apache/arrow-rs
closed
Potential OOB Access in Unreleased Comparison Kernels
bug development-process
**Describe the bug** <!-- A clear and concise description of what the bug is. --> https://github.com/apache/arrow-rs/pull/2533 introduced a potential OOB access as described [here](https://github.com/apache/arrow-rs/pull/2533#discussion_r953949061). This should be fixed prior to the next release **To Reproduce** <!-- Steps to reproduce the behavior: --> **Expected behavior** <!-- A clear and concise description of what you expected to happen. --> **Additional context** <!-- Add any other context about the problem here. -->
1.0
Potential OOB Access in Unreleased Comparison Kernels - **Describe the bug** <!-- A clear and concise description of what the bug is. --> https://github.com/apache/arrow-rs/pull/2533 introduced a potential OOB access as described [here](https://github.com/apache/arrow-rs/pull/2533#discussion_r953949061). This should be fixed prior to the next release **To Reproduce** <!-- Steps to reproduce the behavior: --> **Expected behavior** <!-- A clear and concise description of what you expected to happen. --> **Additional context** <!-- Add any other context about the problem here. -->
process
potential oob access in unreleased comparison kernels describe the bug a clear and concise description of what the bug is introduced a potential oob access as described this should be fixed prior to the next release to reproduce steps to reproduce the behavior expected behavior a clear and concise description of what you expected to happen additional context add any other context about the problem here
1
258,450
22,320,328,552
IssuesEvent
2022-06-14 05:28:30
NebulaMC-GG/Support
https://api.github.com/repos/NebulaMC-GG/Support
closed
/evs and /ivs commands not working in spawn
bug needs testing spigot
Self-explanatory in the title. The commands work fine in the wilderness world, they just don't work in spawn.
1.0
/evs and /ivs commands not working in spawn - Self-explanatory in the title. The commands work fine in the wilderness world, they just don't work in spawn.
non_process
evs and ivs commands not working in spawn self explanatory in the title the commands work fine in the wilderness world they just don t work in spawn
0
18,365
24,492,474,476
IssuesEvent
2022-10-10 04:35:43
f-lab-edu/MarketFlea
https://api.github.com/repos/f-lab-edu/MarketFlea
opened
Account deletion
in process
- [ ] Delete the currently logged-in member's account when the entered password matches - [ ] Automatically log out when account deletion succeeds - [ ] Return a notice message when the password does not match
1.0
Account deletion - - [ ] Delete the currently logged-in member's account when the entered password matches - [ ] Automatically log out when account deletion succeeds - [ ] Return a notice message when the password does not match
process
account deletion delete the currently logged in member s account when the entered password matches automatically log out when account deletion succeeds return a notice message when the password does not match
1
127,466
12,323,600,524
IssuesEvent
2020-05-13 12:27:10
gatsbyjs/gatsby
https://api.github.com/repos/gatsbyjs/gatsby
closed
new pages dont update: Tutorial can't be followed
stale? status: needs reproduction type: bug type: documentation
<!-- Please fill out each section below, otherwise, your issue will be closed. This info allows Gatsby maintainers to diagnose (and fix!) your issue as quickly as possible. Useful Links: - Documentation: https://www.gatsbyjs.org/docs/ - How to File an Issue: https://www.gatsbyjs.org/contributing/how-to-file-an-issue/ Before opening a new issue, please search existing issues: https://github.com/gatsbyjs/gatsby/issues --> ## Description I am new to Gatsby (ie. today is my first dive). Following the Gatsby tutorials here https://www.gatsbyjs.org/tutorial/ I find I have to restart Gatsby develop to view new pages. ### Steps to reproduce 1. Follow the Gatsby tutorials on a new machine with a new Gatsby setup 2. Add a new page (ie src/pages/about.js) 3. Check the develop browser window, and go to the new page all new pages use this initial structure: `import React from "react" export default () => (` 4. We see a 404 page with "There's not a page yet at /test/" 5. Stop, then restart the Gatsby develop processs 6. Now it works! (I have installed nvm, and tried toggling from latest node (13.12.0) back to LTS node (12.16.1), no difference) ### Expected result We shouldn't have to restart Gatsby develop according to your tutorial. ### Actual result See above - -pages don't load. ### Environment System: OS: macOS 10.15.4 CPU: (6) x64 Intel(R) Core(TM) i5-9600K CPU @ 3.70GHz Shell: 5.7.1 - /bin/zsh Binaries: Node: 12.16.1 - ~/.nvm/versions/node/v12.16.1/bin/node npm: 6.13.4 - ~/.nvm/versions/node/v12.16.1/bin/npm Languages: Python: 2.7.16 - /usr/bin/python Browsers: Chrome: 80.0.3987.149 Firefox: 73.0.1 Safari: 13.1
1.0
new pages dont update: Tutorial can't be followed - <!-- Please fill out each section below, otherwise, your issue will be closed. This info allows Gatsby maintainers to diagnose (and fix!) your issue as quickly as possible. Useful Links: - Documentation: https://www.gatsbyjs.org/docs/ - How to File an Issue: https://www.gatsbyjs.org/contributing/how-to-file-an-issue/ Before opening a new issue, please search existing issues: https://github.com/gatsbyjs/gatsby/issues --> ## Description I am new to Gatsby (ie. today is my first dive). Following the Gatsby tutorials here https://www.gatsbyjs.org/tutorial/ I find I have to restart Gatsby develop to view new pages. ### Steps to reproduce 1. Follow the Gatsby tutorials on a new machine with a new Gatsby setup 2. Add a new page (ie src/pages/about.js) 3. Check the develop browser window, and go to the new page all new pages use this initial structure: `import React from "react" export default () => (` 4. We see a 404 page with "There's not a page yet at /test/" 5. Stop, then restart the Gatsby develop processs 6. Now it works! (I have installed nvm, and tried toggling from latest node (13.12.0) back to LTS node (12.16.1), no difference) ### Expected result We shouldn't have to restart Gatsby develop according to your tutorial. ### Actual result See above - -pages don't load. ### Environment System: OS: macOS 10.15.4 CPU: (6) x64 Intel(R) Core(TM) i5-9600K CPU @ 3.70GHz Shell: 5.7.1 - /bin/zsh Binaries: Node: 12.16.1 - ~/.nvm/versions/node/v12.16.1/bin/node npm: 6.13.4 - ~/.nvm/versions/node/v12.16.1/bin/npm Languages: Python: 2.7.16 - /usr/bin/python Browsers: Chrome: 80.0.3987.149 Firefox: 73.0.1 Safari: 13.1
non_process
new pages dont update tutorial can t be followed please fill out each section below otherwise your issue will be closed this info allows gatsby maintainers to diagnose and fix your issue as quickly as possible useful links documentation how to file an issue before opening a new issue please search existing issues description i am new to gatsby ie today is my first dive following the gatsby tutorials here i find i have to restart gatsby develop to view new pages steps to reproduce follow the gatsby tutorials on a new machine with a new gatsby setup add a new page ie src pages about js check the develop browser window and go to the new page all new pages use this initial structure import react from react export default we see a page with there s not a page yet at test stop then restart the gatsby develop processs now it works i have installed nvm and tried toggling from latest node back to lts node no difference expected result we shouldn t have to restart gatsby develop according to your tutorial actual result see above pages don t load environment system os macos cpu intel r core tm cpu shell bin zsh binaries node nvm versions node bin node npm nvm versions node bin npm languages python usr bin python browsers chrome firefox safari
0
9,415
12,411,234,670
IssuesEvent
2020-05-22 08:11:00
ESMValGroup/ESMValCore
https://api.github.com/repos/ESMValGroup/ESMValCore
closed
Feature request: Preprocessor to extract amplitude of cycles
enhancement preprocessor
We do not have a preprocessor yet which is able to extract the amplitude of cycles (annual cycle, diurnal cycle, etc.). I will open a PR for that.
1.0
Feature request: Preprocessor to extract amplitude of cycles - We do not have a preprocessor yet which is able to extract the amplitude of cycles (annual cycle, diurnal cycle, etc.). I will open a PR for that.
process
feature request preprocessor to extract amplitude of cycles we do not have a preprocessor yet which is able to extract the amplitude of cycles annual cycle diurnal cycle etc i will open a pr for that
1
8,015
11,205,217,469
IssuesEvent
2020-01-05 12:45:13
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
QGIS 3.10: Broken clustering method selection in GRASS v.cluster processing algorithm?
Bug Feedback Processing
1. QGIS DBSCAN works properly. 2. GRASS v.cluster methods do not. 3. When the user selects the Clustering method it would be helpful if the dialog entry displayed the entry. DBSCAN for example, currently it displays how many options are selected. ( 1 options selected). If the goal is to run all algorithms than perhaps a comma separate list should appear. 4. When I use the same input data and parameters with GRASS v.cluster there is no labeled cluster even though the Log output states there are clusters. The 'cat' column appears in the output with the same value as the 'fid' **_instead of the label value for the cluster_**. DBSCAN Registering primitives... 12 clusters found 6 outliers found When testing Optics and 'density' methods the same result and both state clusters are found. Optics Registering primitives... 3 clusters found 95 outliers found density Registering primitives... 39 clusters found 22 outliers found
1.0
QGIS 3.10: Broken clustering method selection in GRASS v.cluster processing algorithm? - 1. QGIS DBSCAN works properly. 2. GRASS v.cluster methods do not. 3. When the user selects the Clustering method it would be helpful if the dialog entry displayed the entry. DBSCAN for example, currently it displays how many options are selected. ( 1 options selected). If the goal is to run all algorithms than perhaps a comma separate list should appear. 4. When I use the same input data and parameters with GRASS v.cluster there is no labeled cluster even though the Log output states there are clusters. The 'cat' column appears in the output with the same value as the 'fid' **_instead of the label value for the cluster_**. DBSCAN Registering primitives... 12 clusters found 6 outliers found When testing Optics and 'density' methods the same result and both state clusters are found. Optics Registering primitives... 3 clusters found 95 outliers found density Registering primitives... 39 clusters found 22 outliers found
process
qgis broken clustering method selection in grass v cluster processing algorithm qgis dbscan works properly grass v cluster methods do not when the user selects the clustering method it would be helpful if the dialog entry displayed the entry dbscan for example currently it displays how many options are selected options selected if the goal is to run all algorithms than perhaps a comma separate list should appear when i use the same input data and parameters with grass v cluster there is no labeled cluster even though the log output states there are clusters the cat column appears in the output with the same value as the fid instead of the label value for the cluster dbscan registering primitives clusters found outliers found when testing optics and density methods the same result and both state clusters are found optics registering primitives clusters found outliers found density registering primitives clusters found outliers found
1
15,687
19,847,968,519
IssuesEvent
2022-01-21 09:04:29
googleapis/java-eventarc-publishing
https://api.github.com/repos/googleapis/java-eventarc-publishing
opened
Your .repo-metadata.json file has a problem 🤒
type: process repo-metadata: lint
You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'eventarc-publishing' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
1.0
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'eventarc-publishing' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
process
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 api shortname eventarc publishing invalid in repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
1
628,407
19,985,421,202
IssuesEvent
2022-01-30 15:36:04
WasmEdge/WasmEdge
https://api.github.com/repos/WasmEdge/WasmEdge
closed
[Install] Add Linux aarch64 support for extensions
enhancement priority:medium platform:linux
The current installer does not attempt to install TF and image extensions on Linux aarch64 since those extensions were not available on that platform. However, we have recently released those extensions on Linux aarch64. The installer should support that.
1.0
[Install] Add Linux aarch64 support for extensions - The current installer does not attempt to install TF and image extensions on Linux aarch64 since those extensions were not available on that platform. However, we have recently released those extensions on Linux aarch64. The installer should support that.
non_process
add linux support for extensions the current installer does not attempt to install tf and image extensions on linux since those extensions were not available on that platform however we have recently released those extensions on linux the installer should support that
0
92,157
26,597,845,895
IssuesEvent
2023-01-23 13:48:15
appsmithorg/appsmith
https://api.github.com/repos/appsmithorg/appsmith
closed
[Bug]: The fields that cause errors in widgets do not get a red highlight without clicking on them
Bug Property Pane App Viewers Pod Low Production Needs Triaging UI Builders Pod All Widgets
### Is there an existing issue for this? - [X] I have searched the existing issues ### Description We do not get to see the red highlight over the fields in the property pane that cause any errors in any widgets. https://www.loom.com/share/a7e8ba9dd42545f1beecff592cacb8c0 ### Steps To Reproduce 1. Go to any application. 2. Set up binding(s) in any widget in such a way that it throws an error. ### Public Sample App _No response_ ### Issue video log _No response_ ### Version Cloud
1.0
[Bug]: The fields that cause errors in widgets do not get a red highlight without clicking on them - ### Is there an existing issue for this? - [X] I have searched the existing issues ### Description We do not get to see the red highlight over the fields in the property pane that cause any errors in any widgets. https://www.loom.com/share/a7e8ba9dd42545f1beecff592cacb8c0 ### Steps To Reproduce 1. Go to any application. 2. Set up binding(s) in any widget in such a way that it throws an error. ### Public Sample App _No response_ ### Issue video log _No response_ ### Version Cloud
non_process
the fields that cause errors in widgets do not get a red highlight without clicking on them is there an existing issue for this i have searched the existing issues description we do not get to see the red highlight over the fields in the property pane that cause any errors in any widgets steps to reproduce go to any application set up binding s in any widget in such a way that it throws an error public sample app no response issue video log no response version cloud
0
10,734
13,533,723,991
IssuesEvent
2020-09-16 03:46:33
timberio/vector
https://api.github.com/repos/timberio/vector
opened
Process metrics - Remap tags
Epic domain: metrics domain: processing type: feature
Like logs, metrics can be messy and Vector should provide the necessary tools to clean metric data up. Vector currently offers very weak tools for this job (add, rename, and remove tags). This epic is focused on improving this area of Vector so that it at least has parity with other leading metrics collectors (Telegraf). # User Needs As a user, I need... 1. To REMAP tags (add, rename, remove), so that I can more easily transform a metric's tag data. 2. To CONVERT between various metrics types, so that I can correct upstream type errors. <p></p>
1.0
Process metrics - Remap tags - Like logs, metrics can be messy and Vector should provide the necessary tools to clean metric data up. Vector currently offers very weak tools for this job (add, rename, and remove tags). This epic is focused on improving this area of Vector so that it at least has parity with other leading metrics collectors (Telegraf). # User Needs As a user, I need... 1. To REMAP tags (add, rename, remove), so that I can more easily transform a metric's tag data. 2. To CONVERT between various metrics types, so that I can correct upstream type errors. <p></p>
process
process metrics remap tags like logs metrics can be messy and vector should provide the necessary tools to clean metric data up vector currently offers very weak tools for this job add rename and remove tags this epic is focused on improving this area of vector so that it at least has parity with other leading metrics collectors telegraf user needs as a user i need to remap tags add rename remove so that i can more easily transform a metric s tag data to convert between various metrics types so that i can correct upstream type errors
1
18,832
24,736,505,684
IssuesEvent
2022-10-20 22:35:53
varabyte/kotter
https://api.github.com/repos/varabyte/kotter
closed
Get code coverage to 90+% for all non-UI classes
good first issue process
Create a test executor and test terminal so we can verify behaviors in unit tests.
1.0
Get code coverage to 90+% for all non-UI classes - Create a test executor and test terminal so we can verify behaviors in unit tests.
process
get code coverage to for all non ui classes create a test executor and test terminal so we can verify behaviors in unit tests
1
83,486
24,064,063,439
IssuesEvent
2022-09-17 08:03:45
pyodide/pyodide
https://api.github.com/repos/pyodide/pyodide
closed
micropip import error
bug build
After building pyodide, I tried the following code : ``` pyodide.loadPackage("micropip").then(() => { pyodide.runPythonAsync(` import micropip `); }); ``` ``` pyodide.asm.js:14 Uncaught (in promise) PythonError: Traceback (most recent call last): File "/lib/python3.9/asyncio/futures.py", line 201, in result raise self._exception File "/lib/python3.9/asyncio/tasks.py", line 256, in __step result = coro.send(None) File "/lib/python3.9/site-packages/_pyodide/_base.py", line 494, in eval_code_async await CodeRunner( File "/lib/python3.9/site-packages/_pyodide/_base.py", line 345, in run_async coroutine = eval(self.code, globals, locals) File "<exec>", line 2, in <module> ModuleNotFoundError: No module named 'micropip' at new_error (pyodide.asm.js:14) at pyodide.asm.wasm:0x1b9399 at pyodide.asm.wasm:0x1bd7d6 at pyodide.asm.wasm:0x853699 at pyodide.asm.wasm:0x1f73c1 at pyodide.asm.wasm:0x2f2f19 at pyodide.asm.wasm:0x87250d at pyodide.asm.wasm:0x23ed39 at pyodide.asm.wasm:0x8708c8 at pyodide.asm.wasm:0x1f7b76 ``` I built just some of packages, because of some errors. But `./packages/micropip/build/micropip-0.1/micropip` exists.
1.0
micropip import error - After building pyodide, I tried the following code : ``` pyodide.loadPackage("micropip").then(() => { pyodide.runPythonAsync(` import micropip `); }); ``` ``` pyodide.asm.js:14 Uncaught (in promise) PythonError: Traceback (most recent call last): File "/lib/python3.9/asyncio/futures.py", line 201, in result raise self._exception File "/lib/python3.9/asyncio/tasks.py", line 256, in __step result = coro.send(None) File "/lib/python3.9/site-packages/_pyodide/_base.py", line 494, in eval_code_async await CodeRunner( File "/lib/python3.9/site-packages/_pyodide/_base.py", line 345, in run_async coroutine = eval(self.code, globals, locals) File "<exec>", line 2, in <module> ModuleNotFoundError: No module named 'micropip' at new_error (pyodide.asm.js:14) at pyodide.asm.wasm:0x1b9399 at pyodide.asm.wasm:0x1bd7d6 at pyodide.asm.wasm:0x853699 at pyodide.asm.wasm:0x1f73c1 at pyodide.asm.wasm:0x2f2f19 at pyodide.asm.wasm:0x87250d at pyodide.asm.wasm:0x23ed39 at pyodide.asm.wasm:0x8708c8 at pyodide.asm.wasm:0x1f7b76 ``` I built just some of packages, because of some errors. But `./packages/micropip/build/micropip-0.1/micropip` exists.
non_process
micropip import error after building pyodide i tried the following code pyodide loadpackage micropip then pyodide runpythonasync import micropip pyodide asm js uncaught in promise pythonerror traceback most recent call last file lib asyncio futures py line in result raise self exception file lib asyncio tasks py line in step result coro send none file lib site packages pyodide base py line in eval code async await coderunner file lib site packages pyodide base py line in run async coroutine eval self code globals locals file line in modulenotfounderror no module named micropip at new error pyodide asm js at pyodide asm wasm at pyodide asm wasm at pyodide asm wasm at pyodide asm wasm at pyodide asm wasm at pyodide asm wasm at pyodide asm wasm at pyodide asm wasm at pyodide asm wasm i built just some of packages because of some errors but packages micropip build micropip micropip exists
0
255,107
27,484,745,677
IssuesEvent
2023-03-04 01:14:40
panasalap/linux-4.1.15
https://api.github.com/repos/panasalap/linux-4.1.15
opened
CVE-2020-29371 (Low) detected in linux-yocto-4.1v4.1.17
security vulnerability
## CVE-2020-29371 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-4.1v4.1.17</b></p></summary> <p> <p>[no description]</p> <p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-4.1>https://git.yoctoproject.org/git/linux-yocto-4.1</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/romfs/storage.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/romfs/storage.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/romfs/storage.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in romfs_dev_read in fs/romfs/storage.c in the Linux kernel before 5.8.4. Uninitialized memory leaks to userspace, aka CID-bcf85fcedfdd. 
<p>Publish Date: 2020-11-28 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-29371>CVE-2020-29371</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29371">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29371</a></p> <p>Release Date: 2020-11-28</p> <p>Fix Resolution: v5.9-rc2,v5.8.4,v5.7.18,v5.4.61</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-29371 (Low) detected in linux-yocto-4.1v4.1.17 - ## CVE-2020-29371 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-4.1v4.1.17</b></p></summary> <p> <p>[no description]</p> <p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-4.1>https://git.yoctoproject.org/git/linux-yocto-4.1</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/romfs/storage.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/romfs/storage.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/romfs/storage.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in romfs_dev_read in fs/romfs/storage.c in the Linux kernel before 5.8.4. Uninitialized memory leaks to userspace, aka CID-bcf85fcedfdd. 
<p>Publish Date: 2020-11-28 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-29371>CVE-2020-29371</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29371">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29371</a></p> <p>Release Date: 2020-11-28</p> <p>Fix Resolution: v5.9-rc2,v5.8.4,v5.7.18,v5.4.61</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve low detected in linux yocto cve low severity vulnerability vulnerable library linux yocto library home page a href found in base branch master vulnerable source files fs romfs storage c fs romfs storage c fs romfs storage c vulnerability details an issue was discovered in romfs dev read in fs romfs storage c in the linux kernel before uninitialized memory leaks to userspace aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
81
2,532,040,874
IssuesEvent
2015-01-23 13:11:23
schoffelen/MOUS
https://api.github.com/repos/schoffelen/MOUS
reopened
do a new iteration of parcellation of mne-erf results (visual)
data processing meg
As discussed in the 2014 fall workshop, in order to pinpoint more the spatial location of the effects we should re-attempt the statistics based on a parcellation. For consistency, we will use the conte69 subparcellated Broadmann area atlas. To do: -revisit the parcellation step: this is an explicit step that needs to be done prior to doing the statistics, i.e. don't rely on ft_sourcestatistics doing anything meaningful with specifying a cfg.roi/cfg.avgoverroi -check the functionality for doing statistics based on parcellated data, essentially parcellations are represented in FieldTrip as a channel-level type structure, so it requires ft_timelockstatistics, rather than ft_sourcestatistics.
1.0
do a new iteration of parcellation of mne-erf results (visual) - As discussed in the 2014 fall workshop, in order to pinpoint more the spatial location of the effects we should re-attempt the statistics based on a parcellation. For consistency, we will use the conte69 subparcellated Broadmann area atlas. To do: -revisit the parcellation step: this is an explicit step that needs to be done prior to doing the statistics, i.e. don't rely on ft_sourcestatistics doing anything meaningful with specifying a cfg.roi/cfg.avgoverroi -check the functionality for doing statistics based on parcellated data, essentially parcellations are represented in FieldTrip as a channel-level type structure, so it requires ft_timelockstatistics, rather than ft_sourcestatistics.
process
do a new iteration of parcellation of mne erf results visual as discussed in the fall workshop in order to pinpoint more the spatial location of the effects we should re attempt the statistics based on a parcellation for consistency we will use the subparcellated broadmann area atlas to do revisit the parcellation step this is an explicit step that needs to be done prior to doing the statistics i e don t rely on ft sourcestatistics doing anything meaningful with specifying a cfg roi cfg avgoverroi check the functionality for doing statistics based on parcellated data essentially parcellations are represented in fieldtrip as a channel level type structure so it requires ft timelockstatistics rather than ft sourcestatistics
1
1,920
3,427,813,527
IssuesEvent
2015-12-10 04:49:27
ForNeVeR/kaiwa
https://api.github.com/repos/ForNeVeR/kaiwa
closed
Typescript integration
enhancement infrastructure
All the new code should be fully type checked by the TypeScript compiler. There are too many problems due to JavaScript's dynamic nature; it's time to overcome this. We'll need to find a way to integrate TypeScript compiler into Kaiwa build and serving process.
1.0
Typescript integration - All the new code should be fully type checked by the TypeScript compiler. There are too many problems due to JavaScript's dynamic nature; it's time to overcome this. We'll need to find a way to integrate TypeScript compiler into Kaiwa build and serving process.
non_process
typescript integration all the new code should be fully type checked by the typescript compiler there are too many problems due to javascript s dynamic nature it s time to overcome this we ll need to find a way to integrate typescript compiler into kaiwa build and serving process
0
61,697
12,194,867,381
IssuesEvent
2020-04-29 16:26:41
kwk/test-llvm-bz-import-5
https://api.github.com/repos/kwk/test-llvm-bz-import-5
closed
scheduler crash on CodeGen/Generic/add-with-overflow-128.ll when running on x86-32
BZ-BUG-STATUS: RESOLVED BZ-RESOLUTION: FIXED dummy import from bugzilla libraries/Common Code Generator Code
This issue was imported from Bugzilla https://bugs.llvm.org/show_bug.cgi?id=8823.
2.0
scheduler crash on CodeGen/Generic/add-with-overflow-128.ll when running on x86-32 - This issue was imported from Bugzilla https://bugs.llvm.org/show_bug.cgi?id=8823.
non_process
scheduler crash on codegen generic add with overflow ll when running on this issue was imported from bugzilla
0
10,801
13,609,287,696
IssuesEvent
2020-09-23 04:50:07
googleapis/java-datacatalog
https://api.github.com/repos/googleapis/java-datacatalog
closed
Dependency Dashboard
api: datacatalog type: process
This issue contains a list of Renovate updates and their statuses. ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-datacatalog-1.x -->chore(deps): update dependency com.google.cloud:google-cloud-datacatalog to v1.0.1 - [ ] <!-- rebase-branch=renovate/com.google.cloud-libraries-bom-10.x -->chore(deps): update dependency com.google.cloud:libraries-bom to v10 --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
1.0
Dependency Dashboard - This issue contains a list of Renovate updates and their statuses. ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-datacatalog-1.x -->chore(deps): update dependency com.google.cloud:google-cloud-datacatalog to v1.0.1 - [ ] <!-- rebase-branch=renovate/com.google.cloud-libraries-bom-10.x -->chore(deps): update dependency com.google.cloud:libraries-bom to v10 --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
process
dependency dashboard this issue contains a list of renovate updates and their statuses open these updates have all been created already click a checkbox below to force a retry rebase of any chore deps update dependency com google cloud google cloud datacatalog to chore deps update dependency com google cloud libraries bom to check this box to trigger a request for renovate to run again on this repository
1
7,934
11,115,060,504
IssuesEvent
2019-12-18 09:58:42
Open-EO/openeo-api
https://api.github.com/repos/Open-EO/openeo-api
closed
Process graph: Complex validation of array and objects that contain variables and other references
process graphs
This issue is a follow-up on #160 to keep track of issues we are facing with the new process graphs. **1. Validation of array and objects that contain variables and other references** An issue I am currently facing is that I can't easily validate arrays and objects that contain variables, references to results or callback arguments. Currently, I'm trying to implement a JSON Schema based approach for process graph validation where each argument of a process is validated against the JSON Schema that is specified for each parameter in the process specification. In theory, this works very well and in practice it does, too, unless you add variables and other references such as {from_node "abc"} in your process graph. The issue is that JSON schema expects something different and fails. For example, I'd like to load a collection by ID and restrict the temporal extent with a start date, both coming from a variable: ``` { process_id: "load_collection", arguments: { id: {variable_id: "collection", type: "string"}, temporal_extent: [{variable_id: "start_date", type: "string"}], spatial_extent: null } } ``` In theory, this works well and I should be able to easily check whether the specified variable type is compatible with the schema specified from the argument. This indeed works in my implementation for the id by simply checking whether the parameter is a variable or not and then running the JSON Schema validaton or not. However, it doesn't for the temporal_extent as the object is inside an array and therefore the JSON Schema based validation fails. I'd need to make sense of the JSON Schema and interfere with the JSON Schema validator. With most libraries this is not very easy or impossible. Therefore it seems hard to implement in the validation step. Additionally, it is also hard to construct in the Web Editor. It shouldn't be too much of a problem in the other CLI-based libraries though. The problems described here don't only apply for variables, but also for referring to results and callback arguments. We need to think about how we want to handle this. I don't have a good solution yet and proposals are more then welcome. **If you have other issues, please report them, too.**
1.0
Process graph: Complex validation of array and objects that contain variables and other references - This issue is a follow-up on #160 to keep track of issues we are facing with the new process graphs. **1. Validation of array and objects that contain variables and other references** An issue I am currently facing is that I can't easily validate arrays and objects that contain variables, references to results or callback arguments. Currently, I'm trying to implement a JSON Schema based approach for process graph validation where each argument of a process is validated against the JSON Schema that is specified for each parameter in the process specification. In theory, this works very well and in practice it does, too, unless you add variables and other references such as {from_node "abc"} in your process graph. The issue is that JSON schema expects something different and fails. For example, I'd like to load a collection by ID and restrict the temporal extent with a start date, both coming from a variable: ``` { process_id: "load_collection", arguments: { id: {variable_id: "collection", type: "string"}, temporal_extent: [{variable_id: "start_date", type: "string"}], spatial_extent: null } } ``` In theory, this works well and I should be able to easily check whether the specified variable type is compatible with the schema specified from the argument. This indeed works in my implementation for the id by simply checking whether the parameter is a variable or not and then running the JSON Schema validaton or not. However, it doesn't for the temporal_extent as the object is inside an array and therefore the JSON Schema based validation fails. I'd need to make sense of the JSON Schema and interfere with the JSON Schema validator. With most libraries this is not very easy or impossible. Therefore it seems hard to implement in the validation step. Additionally, it is also hard to construct in the Web Editor. It shouldn't be too much of a problem in the other CLI-based libraries though. The problems described here don't only apply for variables, but also for referring to results and callback arguments. We need to think about how we want to handle this. I don't have a good solution yet and proposals are more then welcome. **If you have other issues, please report them, too.**
process
process graph complex validation of array and objects that contain variables and other references this issue is a follow up on to keep track of issues we are facing with the new process graphs validation of array and objects that contain variables and other references an issue i am currently facing is that i can t easily validate arrays and objects that contain variables references to results or callback arguments currently i m trying to implement a json schema based approach for process graph validation where each argument of a process is validated against the json schema that is specified for each parameter in the process specification in theory this works very well and in practice it does too unless you add variables and other references such as from node abc in your process graph the issue is that json schema expects something different and fails for example i d like to load a collection by id and restrict the temporal extent with a start date both coming from a variable process id load collection arguments id variable id collection type string temporal extent spatial extent null in theory this works well and i should be able to easily check whether the specified variable type is compatible with the schema specified from the argument this indeed works in my implementation for the id by simply checking whether the parameter is a variable or not and then running the json schema validaton or not however it doesn t for the temporal extent as the object is inside an array and therefore the json schema based validation fails i d need to make sense of the json schema and interfere with the json schema validator with most libraries this is not very easy or impossible therefore it seems hard to implement in the validation step additionally it is also hard to construct in the web editor it shouldn t be too much of a problem in the other cli based libraries though the problems described here don t only apply for variables but also for referring to results and callback arguments we need to think about how we want to handle this i don t have a good solution yet and proposals are more then welcome if you have other issues please report them too
1
229,807
25,378,458,991
IssuesEvent
2022-11-21 15:44:07
elastic/integrations
https://api.github.com/repos/elastic/integrations
closed
AWS Security Hub #3589: SecurityHub Dashboard Overview and other dashboard improvements
Team:Security-External Integrations Integration:AWS
As we discussed in the demo call please consider adding the following resources to the AWS Security Hub overview dashboard: - Findings by resource type - Findings by Types - Top 10 security hub findings - Top 10 critical findings by severity - Critical findings count comparison in last week - Counts records by Severity - Counts of records by service name
True
AWS Security Hub #3589: SecurityHub Dashboard Overview and other dashboard improvements - As we discussed in the demo call please consider adding the following resources to the AWS Security Hub overview dashboard: - Findings by resource type - Findings by Types - Top 10 security hub findings - Top 10 critical findings by severity - Critical findings count comparison in last week - Counts records by Severity - Counts of records by service name
non_process
aws security hub securityhub dashboard overview and other dashboard improvements as we discussed in the demo call please consider adding the following resources to the aws security hub overview dashboard findings by resource type findings by types top security hub findings top critical findings by severity critical findings count comparison in last week counts records by severity counts of records by service name
0
54,756
6,402,926,497
IssuesEvent
2017-08-06 14:21:15
pixelhumain/co2
https://api.github.com/repos/pixelhumain/co2
closed
Editer profil
to test
Dans réseaux sociaux : la mise à jour du lien vers la page github ne fonctionne pas, ça tourne en boucle ![capture d'écran](https://user-images.githubusercontent.com/17404254/28363280-3c8d4c00-6c80-11e7-8b70-4811b31c468c.png) Dans informations générales, la date de naissance ne fonctionne pas. On peut en entrer une et suavegarder, mais quand on recharge la page elle est différente ![image](https://user-images.githubusercontent.com/17404254/28363389-ad98ba1a-6c80-11e7-9ba6-3a9719af63be.png)
1.0
Editer profil - Dans réseaux sociaux : la mise à jour du lien vers la page github ne fonctionne pas, ça tourne en boucle ![capture d'écran](https://user-images.githubusercontent.com/17404254/28363280-3c8d4c00-6c80-11e7-8b70-4811b31c468c.png) Dans informations générales, la date de naissance ne fonctionne pas. On peut en entrer une et suavegarder, mais quand on recharge la page elle est différente ![image](https://user-images.githubusercontent.com/17404254/28363389-ad98ba1a-6c80-11e7-9ba6-3a9719af63be.png)
non_process
editer profil dans réseaux sociaux la mise à jour du lien vers la page github ne fonctionne pas ça tourne en boucle dans informations générales la date de naissance ne fonctionne pas on peut en entrer une et suavegarder mais quand on recharge la page elle est différente
0
605,139
18,725,420,231
IssuesEvent
2021-11-03 15:49:39
GoogleCloudPlatform/nodejs-getting-started
https://api.github.com/repos/GoogleCloudPlatform/nodejs-getting-started
closed
background: "after all" hook for "should get the correct response" failed
type: bug priority: p1 flakybot: issue flakybot: flaky
Note: #478 was also for this test, but it is locked ---- commit: 5bccb4d294a17e64b3f561644595b688713ac13c buildURL: [Build Status](https://source.cloud.google.com/results/invocations/d3b26d36-dd78-459e-a4e8-5f7e7843ebbe), [Sponge](http://sponge2/d3b26d36-dd78-459e-a4e8-5f7e7843ebbe) status: failed <details><summary>Test output</summary><br><pre>16 UNAUTHENTICATED: Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project. Error: 16 UNAUTHENTICATED: Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project. at Object.callErrorFromStatus (node_modules/@grpc/grpc-js/build/src/call.js:31:26) at Object.onReceiveStatus (node_modules/@grpc/grpc-js/build/src/client.js:331:49) at Object.onReceiveStatus (node_modules/@grpc/grpc-js/build/src/client-interceptors.js:299:181) at process.nextTick (node_modules/@grpc/grpc-js/build/src/call-stream.js:160:78) at process._tickCallback (internal/process/next_tick.js:61:11) Caused by: Error at CollectionReference._get (node_modules/@google-cloud/firestore/build/src/reference.js:1519:23) at CollectionReference.get (node_modules/@google-cloud/firestore/build/src/reference.js:1507:21) at Context.after (test/app.test.js:73:54)</pre></details>
1.0
background: "after all" hook for "should get the correct response" failed - Note: #478 was also for this test, but it is locked ---- commit: 5bccb4d294a17e64b3f561644595b688713ac13c buildURL: [Build Status](https://source.cloud.google.com/results/invocations/d3b26d36-dd78-459e-a4e8-5f7e7843ebbe), [Sponge](http://sponge2/d3b26d36-dd78-459e-a4e8-5f7e7843ebbe) status: failed <details><summary>Test output</summary><br><pre>16 UNAUTHENTICATED: Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project. Error: 16 UNAUTHENTICATED: Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project. at Object.callErrorFromStatus (node_modules/@grpc/grpc-js/build/src/call.js:31:26) at Object.onReceiveStatus (node_modules/@grpc/grpc-js/build/src/client.js:331:49) at Object.onReceiveStatus (node_modules/@grpc/grpc-js/build/src/client-interceptors.js:299:181) at process.nextTick (node_modules/@grpc/grpc-js/build/src/call-stream.js:160:78) at process._tickCallback (internal/process/next_tick.js:61:11) Caused by: Error at CollectionReference._get (node_modules/@google-cloud/firestore/build/src/reference.js:1519:23) at CollectionReference.get (node_modules/@google-cloud/firestore/build/src/reference.js:1507:21) at Context.after (test/app.test.js:73:54)</pre></details>
non_process
background after all hook for should get the correct response failed note was also for this test but it is locked commit buildurl status failed test output unauthenticated request had invalid authentication credentials expected oauth access token login cookie or other valid authentication credential see error unauthenticated request had invalid authentication credentials expected oauth access token login cookie or other valid authentication credential see at object callerrorfromstatus node modules grpc grpc js build src call js at object onreceivestatus node modules grpc grpc js build src client js at object onreceivestatus node modules grpc grpc js build src client interceptors js at process nexttick node modules grpc grpc js build src call stream js at process tickcallback internal process next tick js caused by error at collectionreference get node modules google cloud firestore build src reference js at collectionreference get node modules google cloud firestore build src reference js at context after test app test js
0
16,722
21,885,291,079
IssuesEvent
2022-05-19 17:59:48
metabase/metabase
https://api.github.com/repos/metabase/metabase
opened
[Actions] Disallow executing `is_readwrite` Saved Questions from normal QP API endpoints
Querying/Processor Misc/API .Backend Actions/
Part of #22542 Attempting to execute an `is_readwrite` query thru the usual readonly query endpoints such as `POST /api/card/:id/query` or the DashCard execution endpoint should result in an error. Readwrite queries should only be executable via the Actions pathway for the time being.
1.0
[Actions] Disallow executing `is_readwrite` Saved Questions from normal QP API endpoints - Part of #22542 Attempting to execute an `is_readwrite` query thru the usual readonly query endpoints such as `POST /api/card/:id/query` or the DashCard execution endpoint should result in an error. Readwrite queries should only be executable via the Actions pathway for the time being.
process
disallow executing is readwrite saved questions from normal qp api endpoints part of attempting to execute an is readwrite query thru the usual readonly query endpoints such as post api card id query or the dashcard execution endpoint should result in an error readwrite queries should only be executable via the actions pathway for the time being
1
16,236
20,782,550,754
IssuesEvent
2022-03-16 15:58:42
opensearch-project/data-prepper
https://api.github.com/repos/opensearch-project/data-prepper
closed
String Manipulation Processor
plugin - processor proposal
Data Prepper should have a Processor for string manipulation. Some candidate operations: * `uppercase` - Make a string all uppercase * `lowercase` - Make a string all lowercase * `trim` - Trim whitespace from the start and end of a string * `split` - Split a string on a delimiter into an array of strings * `join` - Create a new string from other fields, separated by a delimiter * `format` - Create a new string from a format string and other values in an event This should be a new Processor. It can also deprecate the existing `string_converter` plugin. Unlike the existing `string_converter` processor, this new processor will rely on updating keys. See https://github.com/opensearch-project/data-prepper/pull/753/files#r770751733 for more information on the difference between the plugins.
1.0
String Manipulation Processor - Data Prepper should have a Processor for string manipulation. Some candidate operations: * `uppercase` - Make a string all uppercase * `lowercase` - Make a string all lowercase * `trim` - Trim whitespace from the start and end of a string * `split` - Split a string on a delimiter into an array of strings * `join` - Create a new string from other fields, separated by a delimiter * `format` - Create a new string from a format string and other values in an event This should be a new Processor. It can also deprecate the existing `string_converter` plugin. Unlike the existing `string_converter` processor, this new processor will rely on updating keys. See https://github.com/opensearch-project/data-prepper/pull/753/files#r770751733 for more information on the difference between the plugins.
process
string manipulation processor data prepper should have a processor for string manipulation some candidate operations uppercase make a string all uppercase lowercase make a string all lowercase trim trim whitespace from the start and end of a string split split a string on a delimiter into an array of strings join create a new string from other fields separated by a delimiter format create a new string from a format string and other values in an event this should be a new processor it can also deprecate the existing string converter plugin unlike the existing string converter processor this new processor will rely on updating keys see for more information on the difference between the plugins
1
19,755
2,622,166,781
IssuesEvent
2015-03-04 00:12:43
byzhang/rapidjson
https://api.github.com/repos/byzhang/rapidjson
closed
demo request
auto-migrated Priority-Medium Type-Task
``` Hi folks: I am trying to use rapidjson to get two value pairs unknown from a server. I mean sort like { "x":"y" } both x and y unknown (several of this pairs) and I couldn't find a demo to do that in rapidjson. I need to get/print both values. I think it would be great to have one around in the examples. All I could see are, getting predetermined values and some sort of iterators that are not easy to get. I mean to understand. ``` Original issue reported on code.google.com by `waldoalv...@gmail.com` on 13 Jan 2012 at 4:07
1.0
demo request - ``` Hi folks: I am trying to use rapidjson to get two value pairs unknown from a server. I mean sort like { "x":"y" } both x and y unknown (several of this pairs) and I couldn't find a demo to do that in rapidjson. I need to get/print both values. I think it would be great to have one around in the examples. All I could see are, getting predetermined values and some sort of iterators that are not easy to get. I mean to understand. ``` Original issue reported on code.google.com by `waldoalv...@gmail.com` on 13 Jan 2012 at 4:07
non_process
demo request hi folks i am trying to use rapidjson to get two value pairs unknown from a server i mean sort like x y both x and y unknown several of this pairs and i couldn t find a demo to do that in rapidjson i need to get print both values i think it would be great to have one around in the examples all i could see are getting predetermined values and some sort of iterators that are not easy to get i mean to understand original issue reported on code google com by waldoalv gmail com on jan at
0
193,469
6,885,179,046
IssuesEvent
2017-11-21 15:22:59
tsgrp/OpenAnnotate
https://api.github.com/repos/tsgrp/OpenAnnotate
closed
Duplicate Replace-Text Lines Created on Document Load
High Priority Issue
When loading a document with replace-text annotations, duplicate svg elements are created on the document. So when you attempt to delete the annotation, another copy of the lines remains. This only occurs with text selects, and is most likely occurring due to the "groupedReply" relationship with the Caret. ![text_select_annotation](https://user-images.githubusercontent.com/22036857/32462551-7109fe8a-c2ff-11e7-8340-530fa1249384.gif)
1.0
Duplicate Replace-Text Lines Created on Document Load - When loading a document with replace-text annotations, duplicate svg elements are created on the document. So when you attempt to delete the annotation, another copy of the lines remains. This only occurs with text selects, and is most likely occurring due to the "groupedReply" relationship with the Caret. ![text_select_annotation](https://user-images.githubusercontent.com/22036857/32462551-7109fe8a-c2ff-11e7-8340-530fa1249384.gif)
non_process
duplicate replace text lines created on document load when loading a document with replace text annotations duplicate svg elements are created on the document so when you attempt to delete the annotation another copy of the lines remains this only occurs with text selects and is most likely occurring due to the groupedreply relationship with the caret
0
6,666
9,782,499,378
IssuesEvent
2019-06-08 00:04:53
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
MP: fix in pathogen host branch missing parent
PomBase missing parentage multi-species process
![tmep](https://user-images.githubusercontent.com/7359272/58831118-de2e0700-8643-11e9-86b9-7def8548b92b.jpg) GO:0044055 modulation by symbiont of host system process has these descendants GO:0044061    modulation by symbiont of host excretion | is_a -- | -- GO:0044064    modulation by symbiont of host respiratory system process | is_a GO:0044063    modulation by symbiont of host neurological system process | is_a GO:0044056    modulation by symbiont of host digestive system process | is_a GO:0044059    modulation by symbiont of host endocrine process | is_a but other 'system processes' (immune system for example) are in a different branch (highlighted this above). Do we need to distinghuish different level of process like this ? (makes terms difficult to find), or could they all be under host process
1.0
MP: fix in pathogen host branch missing parent - ![tmep](https://user-images.githubusercontent.com/7359272/58831118-de2e0700-8643-11e9-86b9-7def8548b92b.jpg) GO:0044055 modulation by symbiont of host system process has these descendants GO:0044061    modulation by symbiont of host excretion | is_a -- | -- GO:0044064    modulation by symbiont of host respiratory system process | is_a GO:0044063    modulation by symbiont of host neurological system process | is_a GO:0044056    modulation by symbiont of host digestive system process | is_a GO:0044059    modulation by symbiont of host endocrine process | is_a but other 'system processes' (immune system for example) are in a different branch (highlighted this above). Do we need to distinghuish different level of process like this ? (makes terms difficult to find), or could they all be under host process
process
mp fix in pathogen host branch missing parent go modulation by symbiont of host system process has these descendants go     modulation by symbiont of host excretion is a go     modulation by symbiont of host respiratory system process is a go     modulation by symbiont of host neurological system process is a go     modulation by symbiont of host digestive system process is a go     modulation by symbiont of host endocrine process is a but other system processes immune system for example are in a different branch highlighted this above do we need to distinghuish different level of process like this makes terms difficult to find or could they all be under host process
1
9,629
8,057,429,625
IssuesEvent
2018-08-02 15:22:14
privacyidea/privacyidea
https://api.github.com/repos/privacyidea/privacyidea
closed
Provide python-croniter in Ubuntu Repo
infrastructure and config
We need to provide python-croniter >= 0.3.8 at least in our ubuntu 14.04 repos.
1.0
Provide python-croniter in Ubuntu Repo - We need to provide python-croniter >= 0.3.8 at least in our ubuntu 14.04 repos.
non_process
provide python croniter in ubuntu repo we need to provide python croniter at least in our ubuntu repos
0
16,581
21,625,440,220
IssuesEvent
2022-05-05 01:02:17
googleapis/repo-automation-bots
https://api.github.com/repos/googleapis/repo-automation-bots
closed
owlbot: make OwlBot a required check for Python repositories
type: process bot: owl-bot
If OwlBot doesn't run on a pull request, it can result in a bad merge to the main branch, e.g., merging the staging directory. OwlBot should be made a required check for languages that are using OwlBot. If OwlBot ever fails to run: * we would be grateful if you'd notify the GitHub automation chat immediately _(if we can catch OwlBot in the act of not running, we might be able to pin down an edge case that you're bumping into)_. * after we've made an effort to diagnose the issue, add the `owlbot:run` label to your PR to manually run OwlBot. Refs: #1714
1.0
owlbot: make OwlBot a required check for Python repositories - If OwlBot doesn't run on a pull request, it can result in a bad merge to the main branch, e.g., merging the staging directory. OwlBot should be made a required check for languages that are using OwlBot. If OwlBot ever fails to run: * we would be grateful if you'd notify the GitHub automation chat immediately _(if we can catch OwlBot in the act of not running, we might be able to pin down an edge case that you're bumping into)_. * after we've made an effort to diagnose the issue, add the `owlbot:run` label to your PR to manually run OwlBot. Refs: #1714
process
owlbot make owlbot a required check for python repositories if owlbot doesn t run on a pull request it can result in a bad merge to the main branch e g merging the staging directory owlbot should be made a required check for languages that are using owlbot if owlbot ever fails to run we would be grateful if you d notify the github automation chat immediately if we can catch owlbot in the act of not running we might be able to pin down an edge case that you re bumping into after we ve made an effort to diagnose the issue add the owlbot run label to your pr to manually run owlbot refs
1
436,696
12,551,719,992
IssuesEvent
2020-06-06 15:42:37
googleapis/nodejs-web-risk
https://api.github.com/repos/googleapis/nodejs-web-risk
closed
Synthesis failed for nodejs-web-risk
api: webrisk autosynth failure priority: p1 type: bug
Hello! Autosynth couldn't regenerate nodejs-web-risk. :broken_heart: Here's the output from running `synth.py`: ``` b'gleapis-16\nnothing to commit, working tree clean\n2020-06-05 04:43:33,489 synthtool [DEBUG] > Ensuring dependencies.\nDEBUG:synthtool:Ensuring dependencies.\n2020-06-05 04:43:33,494 synthtool [DEBUG] > Cloning googleapis.\nDEBUG:synthtool:Cloning googleapis.\n2020-06-05 04:43:33,495 synthtool [DEBUG] > Using precloned repo /home/kbuilder/.cache/synthtool/googleapis\nDEBUG:synthtool:Using precloned repo /home/kbuilder/.cache/synthtool/googleapis\n2020-06-05 04:43:33,498 synthtool [DEBUG] > Pulling Docker image: gapic-generator-typescript:latest\nDEBUG:synthtool:Pulling Docker image: gapic-generator-typescript:latest\nlatest: Pulling from gapic-images/gapic-generator-typescript\nDigest: sha256:c9bc12024eddcfb94501627ff5b3ea302370995e9a2c9cde6b3317375d7e7b66\nStatus: Image is up to date for gcr.io/gapic-images/gapic-generator-typescript:latest\n2020-06-05 04:43:34,366 synthtool [DEBUG] > Generating code for: google/cloud/webrisk/v1beta1.\nDEBUG:synthtool:Generating code for: google/cloud/webrisk/v1beta1.\n2020-06-05 04:43:35,196 synthtool [DEBUG] > Wrote metadata to synth.metadata.\nDEBUG:synthtool:Wrote metadata to synth.metadata.\nTraceback (most recent call last):\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main\n "__main__", mod_spec)\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>\n main()\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main\n rv = self.invoke(ctx)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 
1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main\n spec.loader.exec_module(synth_module) # type: ignore\n File "<frozen importlib._bootstrap_external>", line 678, in exec_module\n File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed\n File "/home/kbuilder/.cache/synthtool/nodejs-web-risk/synth.py", line 43, in <module>\n version=version)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 66, in typescript_library\n return self._generate_code(service, version, "typescript", **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 195, in _generate_code\n f"Code generation seemed to succeed, but {output_dir} is empty."\nRuntimeError: Code generation seemed to succeed, but /tmpfs/tmp/tmpszy89504 is empty.\n2020-06-05 04:43:35,236 autosynth [ERROR] > Synthesis failed\n2020-06-05 04:43:35,237 autosynth [DEBUG] > Running: git reset --hard HEAD\nHEAD is now at 934b63a fix: version rearrange (#163)\n2020-06-05 04:43:35,242 autosynth [DEBUG] > Running: git checkout autosynth-googleapis\nSwitched to branch \'autosynth-googleapis\'\n2020-06-05 04:43:35,246 autosynth [ERROR] > Command \'[\'/tmpfs/src/github/synthtool/env/bin/python3\', \'-m\', \'synthtool\', \'--metadata\', \'synth.metadata\', \'synth.py\', \'--\']\' returned non-zero exit status 1.\n2020-06-05 04:43:35,382 autosynth [DEBUG] > Running: git checkout 934b63aa0163c84aceb27d306e1f656bcb2d8afd\nNote: checking out \'934b63aa0163c84aceb27d306e1f656bcb2d8afd\'.\n\nYou are in \'detached HEAD\' state. 
You can look around, make experimental\nchanges and commit them, and you can discard any commits you make in this\nstate without impacting any branches by performing another checkout.\n\nIf you want to create a new branch to retain commits you create, you may\ndo so (now or later) by using -b with the checkout command again. Example:\n\n git checkout -b <new-branch-name>\n\nHEAD is now at 934b63a fix: version rearrange (#163)\n2020-06-05 04:43:35,387 autosynth [DEBUG] > Running: git checkout d53a5b45c46920932dbe7d0a95e10d8b58933dae\nPrevious HEAD position was f06a850 fix: strip PR numbers from commit (#597)\nHEAD is now at d53a5b4 docs: improve README (#600)\n2020-06-05 04:43:35,394 autosynth [DEBUG] > Running: git checkout cd804bab06e46dd1a4f16c32155fd3cddb931b52\nHEAD is now at cd804bab docs: cleaned docs for the Agents service and resource.\n2020-06-05 04:43:35,406 autosynth [DEBUG] > Running: git branch -f autosynth-22\n2020-06-05 04:43:35,410 autosynth [DEBUG] > Running: git checkout autosynth-22\nSwitched to branch \'autosynth-22\'\n2020-06-05 04:43:35,414 autosynth [INFO] > Running synthtool\n2020-06-05 04:43:35,414 autosynth [INFO] > [\'/tmpfs/src/github/synthtool/env/bin/python3\', \'-m\', \'synthtool\', \'--metadata\', \'synth.metadata\', \'synth.py\', \'--\']\n2020-06-05 04:43:35,416 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata synth.metadata synth.py --\n2020-06-05 04:43:35,632 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/nodejs-web-risk/synth.py.\nOn branch autosynth-22\nnothing to commit, working tree clean\n2020-06-05 04:43:35,762 synthtool [DEBUG] > Ensuring dependencies.\nDEBUG:synthtool:Ensuring dependencies.\n2020-06-05 04:43:35,767 synthtool [DEBUG] > Cloning googleapis.\nDEBUG:synthtool:Cloning googleapis.\n2020-06-05 04:43:35,768 synthtool [DEBUG] > Using precloned repo /home/kbuilder/.cache/synthtool/googleapis\nDEBUG:synthtool:Using precloned repo 
/home/kbuilder/.cache/synthtool/googleapis\n2020-06-05 04:43:35,772 synthtool [DEBUG] > Pulling Docker image: gapic-generator-typescript:latest\nDEBUG:synthtool:Pulling Docker image: gapic-generator-typescript:latest\nlatest: Pulling from gapic-images/gapic-generator-typescript\nDigest: sha256:c9bc12024eddcfb94501627ff5b3ea302370995e9a2c9cde6b3317375d7e7b66\nStatus: Image is up to date for gcr.io/gapic-images/gapic-generator-typescript:latest\n2020-06-05 04:43:36,631 synthtool [DEBUG] > Generating code for: google/cloud/webrisk/v1beta1.\nDEBUG:synthtool:Generating code for: google/cloud/webrisk/v1beta1.\n2020-06-05 04:43:37,458 synthtool [DEBUG] > Wrote metadata to synth.metadata.\nDEBUG:synthtool:Wrote metadata to synth.metadata.\nTraceback (most recent call last):\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main\n "__main__", mod_spec)\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>\n main()\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main\n rv = self.invoke(ctx)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main\n spec.loader.exec_module(synth_module) # type: ignore\n File "<frozen importlib._bootstrap_external>", line 678, in exec_module\n File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed\n File 
"/home/kbuilder/.cache/synthtool/nodejs-web-risk/synth.py", line 43, in <module>\n version=version)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 66, in typescript_library\n return self._generate_code(service, version, "typescript", **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 195, in _generate_code\n f"Code generation seemed to succeed, but {output_dir} is empty."\nRuntimeError: Code generation seemed to succeed, but /tmpfs/tmp/tmp4g6edl6q is empty.\n2020-06-05 04:43:37,498 autosynth [ERROR] > Synthesis failed\n2020-06-05 04:43:37,498 autosynth [DEBUG] > Running: git reset --hard HEAD\nHEAD is now at 934b63a fix: version rearrange (#163)\n2020-06-05 04:43:37,505 autosynth [DEBUG] > Running: git checkout autosynth\nSwitched to branch \'autosynth\'\n2020-06-05 04:43:37,510 autosynth [DEBUG] > Running: git clean -fdx\nRemoving __pycache__/\nTraceback (most recent call last):\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main\n "__main__", mod_spec)\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 615, in <module>\n main()\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 476, in main\n return _inner_main(temp_dir)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 595, in _inner_main\n commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 371, in synthesize_loop\n synthesize_inner_loop(toolbox, synthesizer)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 381, in synthesize_inner_loop\n synthesizer, len(toolbox.versions) - 1\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch\n synthesizer.synthesize(synth_log_path, self.environ)\n File 
"/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 119, in synthesize\n synth_proc.check_returncode() # Raise an exception.\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode\n self.stderr)\nsubprocess.CalledProcessError: Command \'[\'/tmpfs/src/github/synthtool/env/bin/python3\', \'-m\', \'synthtool\', \'--metadata\', \'synth.metadata\', \'synth.py\', \'--\']\' returned non-zero exit status 1.\n' ``` Google internal developers can see the full log [here](http://sponge/a84fe804-6b84-41a5-b1f2-e6d681e30fa2).
1.0
Synthesis failed for nodejs-web-risk - Hello! Autosynth couldn't regenerate nodejs-web-risk. :broken_heart: Here's the output from running `synth.py`: ``` b'gleapis-16\nnothing to commit, working tree clean\n2020-06-05 04:43:33,489 synthtool [DEBUG] > Ensuring dependencies.\nDEBUG:synthtool:Ensuring dependencies.\n2020-06-05 04:43:33,494 synthtool [DEBUG] > Cloning googleapis.\nDEBUG:synthtool:Cloning googleapis.\n2020-06-05 04:43:33,495 synthtool [DEBUG] > Using precloned repo /home/kbuilder/.cache/synthtool/googleapis\nDEBUG:synthtool:Using precloned repo /home/kbuilder/.cache/synthtool/googleapis\n2020-06-05 04:43:33,498 synthtool [DEBUG] > Pulling Docker image: gapic-generator-typescript:latest\nDEBUG:synthtool:Pulling Docker image: gapic-generator-typescript:latest\nlatest: Pulling from gapic-images/gapic-generator-typescript\nDigest: sha256:c9bc12024eddcfb94501627ff5b3ea302370995e9a2c9cde6b3317375d7e7b66\nStatus: Image is up to date for gcr.io/gapic-images/gapic-generator-typescript:latest\n2020-06-05 04:43:34,366 synthtool [DEBUG] > Generating code for: google/cloud/webrisk/v1beta1.\nDEBUG:synthtool:Generating code for: google/cloud/webrisk/v1beta1.\n2020-06-05 04:43:35,196 synthtool [DEBUG] > Wrote metadata to synth.metadata.\nDEBUG:synthtool:Wrote metadata to synth.metadata.\nTraceback (most recent call last):\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main\n "__main__", mod_spec)\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>\n main()\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main\n rv = self.invoke(ctx)\n File 
"/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main\n spec.loader.exec_module(synth_module) # type: ignore\n File "<frozen importlib._bootstrap_external>", line 678, in exec_module\n File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed\n File "/home/kbuilder/.cache/synthtool/nodejs-web-risk/synth.py", line 43, in <module>\n version=version)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 66, in typescript_library\n return self._generate_code(service, version, "typescript", **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 195, in _generate_code\n f"Code generation seemed to succeed, but {output_dir} is empty."\nRuntimeError: Code generation seemed to succeed, but /tmpfs/tmp/tmpszy89504 is empty.\n2020-06-05 04:43:35,236 autosynth [ERROR] > Synthesis failed\n2020-06-05 04:43:35,237 autosynth [DEBUG] > Running: git reset --hard HEAD\nHEAD is now at 934b63a fix: version rearrange (#163)\n2020-06-05 04:43:35,242 autosynth [DEBUG] > Running: git checkout autosynth-googleapis\nSwitched to branch \'autosynth-googleapis\'\n2020-06-05 04:43:35,246 autosynth [ERROR] > Command \'[\'/tmpfs/src/github/synthtool/env/bin/python3\', \'-m\', \'synthtool\', \'--metadata\', \'synth.metadata\', \'synth.py\', \'--\']\' returned non-zero exit status 1.\n2020-06-05 04:43:35,382 autosynth [DEBUG] > Running: git checkout 934b63aa0163c84aceb27d306e1f656bcb2d8afd\nNote: checking out \'934b63aa0163c84aceb27d306e1f656bcb2d8afd\'.\n\nYou are in \'detached HEAD\' state. 
You can look around, make experimental\nchanges and commit them, and you can discard any commits you make in this\nstate without impacting any branches by performing another checkout.\n\nIf you want to create a new branch to retain commits you create, you may\ndo so (now or later) by using -b with the checkout command again. Example:\n\n git checkout -b <new-branch-name>\n\nHEAD is now at 934b63a fix: version rearrange (#163)\n2020-06-05 04:43:35,387 autosynth [DEBUG] > Running: git checkout d53a5b45c46920932dbe7d0a95e10d8b58933dae\nPrevious HEAD position was f06a850 fix: strip PR numbers from commit (#597)\nHEAD is now at d53a5b4 docs: improve README (#600)\n2020-06-05 04:43:35,394 autosynth [DEBUG] > Running: git checkout cd804bab06e46dd1a4f16c32155fd3cddb931b52\nHEAD is now at cd804bab docs: cleaned docs for the Agents service and resource.\n2020-06-05 04:43:35,406 autosynth [DEBUG] > Running: git branch -f autosynth-22\n2020-06-05 04:43:35,410 autosynth [DEBUG] > Running: git checkout autosynth-22\nSwitched to branch \'autosynth-22\'\n2020-06-05 04:43:35,414 autosynth [INFO] > Running synthtool\n2020-06-05 04:43:35,414 autosynth [INFO] > [\'/tmpfs/src/github/synthtool/env/bin/python3\', \'-m\', \'synthtool\', \'--metadata\', \'synth.metadata\', \'synth.py\', \'--\']\n2020-06-05 04:43:35,416 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata synth.metadata synth.py --\n2020-06-05 04:43:35,632 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/nodejs-web-risk/synth.py.\nOn branch autosynth-22\nnothing to commit, working tree clean\n2020-06-05 04:43:35,762 synthtool [DEBUG] > Ensuring dependencies.\nDEBUG:synthtool:Ensuring dependencies.\n2020-06-05 04:43:35,767 synthtool [DEBUG] > Cloning googleapis.\nDEBUG:synthtool:Cloning googleapis.\n2020-06-05 04:43:35,768 synthtool [DEBUG] > Using precloned repo /home/kbuilder/.cache/synthtool/googleapis\nDEBUG:synthtool:Using precloned repo 
/home/kbuilder/.cache/synthtool/googleapis\n2020-06-05 04:43:35,772 synthtool [DEBUG] > Pulling Docker image: gapic-generator-typescript:latest\nDEBUG:synthtool:Pulling Docker image: gapic-generator-typescript:latest\nlatest: Pulling from gapic-images/gapic-generator-typescript\nDigest: sha256:c9bc12024eddcfb94501627ff5b3ea302370995e9a2c9cde6b3317375d7e7b66\nStatus: Image is up to date for gcr.io/gapic-images/gapic-generator-typescript:latest\n2020-06-05 04:43:36,631 synthtool [DEBUG] > Generating code for: google/cloud/webrisk/v1beta1.\nDEBUG:synthtool:Generating code for: google/cloud/webrisk/v1beta1.\n2020-06-05 04:43:37,458 synthtool [DEBUG] > Wrote metadata to synth.metadata.\nDEBUG:synthtool:Wrote metadata to synth.metadata.\nTraceback (most recent call last):\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main\n "__main__", mod_spec)\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>\n main()\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main\n rv = self.invoke(ctx)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main\n spec.loader.exec_module(synth_module) # type: ignore\n File "<frozen importlib._bootstrap_external>", line 678, in exec_module\n File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed\n File 
"/home/kbuilder/.cache/synthtool/nodejs-web-risk/synth.py", line 43, in <module>\n version=version)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 66, in typescript_library\n return self._generate_code(service, version, "typescript", **kwargs)\n File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 195, in _generate_code\n f"Code generation seemed to succeed, but {output_dir} is empty."\nRuntimeError: Code generation seemed to succeed, but /tmpfs/tmp/tmp4g6edl6q is empty.\n2020-06-05 04:43:37,498 autosynth [ERROR] > Synthesis failed\n2020-06-05 04:43:37,498 autosynth [DEBUG] > Running: git reset --hard HEAD\nHEAD is now at 934b63a fix: version rearrange (#163)\n2020-06-05 04:43:37,505 autosynth [DEBUG] > Running: git checkout autosynth\nSwitched to branch \'autosynth\'\n2020-06-05 04:43:37,510 autosynth [DEBUG] > Running: git clean -fdx\nRemoving __pycache__/\nTraceback (most recent call last):\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main\n "__main__", mod_spec)\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 615, in <module>\n main()\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 476, in main\n return _inner_main(temp_dir)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 595, in _inner_main\n commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 371, in synthesize_loop\n synthesize_inner_loop(toolbox, synthesizer)\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 381, in synthesize_inner_loop\n synthesizer, len(toolbox.versions) - 1\n File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch\n synthesizer.synthesize(synth_log_path, self.environ)\n File 
"/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 119, in synthesize\n synth_proc.check_returncode() # Raise an exception.\n File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode\n self.stderr)\nsubprocess.CalledProcessError: Command \'[\'/tmpfs/src/github/synthtool/env/bin/python3\', \'-m\', \'synthtool\', \'--metadata\', \'synth.metadata\', \'synth.py\', \'--\']\' returned non-zero exit status 1.\n' ``` Google internal developers can see the full log [here](http://sponge/a84fe804-6b84-41a5-b1f2-e6d681e30fa2).
non_process
synthesis failed for nodejs web risk hello autosynth couldn t regenerate nodejs web risk broken heart here s the output from running synth py b gleapis nnothing to commit working tree clean synthtool ensuring dependencies ndebug synthtool ensuring dependencies synthtool cloning googleapis ndebug synthtool cloning googleapis synthtool using precloned repo home kbuilder cache synthtool googleapis ndebug synthtool using precloned repo home kbuilder cache synthtool googleapis synthtool pulling docker image gapic generator typescript latest ndebug synthtool pulling docker image gapic generator typescript latest nlatest pulling from gapic images gapic generator typescript ndigest nstatus image is up to date for gcr io gapic images gapic generator typescript latest synthtool generating code for google cloud webrisk ndebug synthtool generating code for google cloud webrisk synthtool wrote metadata to synth metadata ndebug synthtool wrote metadata to synth metadata ntraceback most recent call last n file home kbuilder pyenv versions lib runpy py line in run module as main n main mod spec n file home kbuilder pyenv versions lib runpy py line in run code n exec code run globals n file tmpfs src github synthtool synthtool main py line in n main n file tmpfs src github synthtool env lib site packages click core py line in call n return self main args kwargs n file tmpfs src github synthtool env lib site packages click core py line in main n rv self invoke ctx n file tmpfs src github synthtool env lib site packages click core py line in invoke n return ctx invoke self callback ctx params n file tmpfs src github synthtool env lib site packages click core py line in invoke n return callback args kwargs n file tmpfs src github synthtool synthtool main py line in main n spec loader exec module synth module type ignore n file line in exec module n file line in call with frames removed n file home kbuilder cache synthtool nodejs web risk synth py line in n version version n file tmpfs 
src github synthtool synthtool gcp gapic microgenerator py line in typescript library n return self generate code service version typescript kwargs n file tmpfs src github synthtool synthtool gcp gapic microgenerator py line in generate code n f code generation seemed to succeed but output dir is empty nruntimeerror code generation seemed to succeed but tmpfs tmp is empty autosynth synthesis failed autosynth running git reset hard head nhead is now at fix version rearrange autosynth running git checkout autosynth googleapis nswitched to branch autosynth googleapis autosynth command returned non zero exit status autosynth running git checkout nnote checking out n nyou are in detached head state you can look around make experimental nchanges and commit them and you can discard any commits you make in this nstate without impacting any branches by performing another checkout n nif you want to create a new branch to retain commits you create you may ndo so now or later by using b with the checkout command again example n n git checkout b n nhead is now at fix version rearrange autosynth running git checkout nprevious head position was fix strip pr numbers from commit nhead is now at docs improve readme autosynth running git checkout nhead is now at docs cleaned docs for the agents service and resource autosynth running git branch f autosynth autosynth running git checkout autosynth nswitched to branch autosynth autosynth running synthtool autosynth autosynth running tmpfs src github synthtool env bin m synthtool metadata synth metadata synth py synthtool executing home kbuilder cache synthtool nodejs web risk synth py non branch autosynth nnothing to commit working tree clean synthtool ensuring dependencies ndebug synthtool ensuring dependencies synthtool cloning googleapis ndebug synthtool cloning googleapis synthtool using precloned repo home kbuilder cache synthtool googleapis ndebug synthtool using precloned repo home kbuilder cache synthtool googleapis synthtool 
pulling docker image gapic generator typescript latest ndebug synthtool pulling docker image gapic generator typescript latest nlatest pulling from gapic images gapic generator typescript ndigest nstatus image is up to date for gcr io gapic images gapic generator typescript latest synthtool generating code for google cloud webrisk ndebug synthtool generating code for google cloud webrisk synthtool wrote metadata to synth metadata ndebug synthtool wrote metadata to synth metadata ntraceback most recent call last n file home kbuilder pyenv versions lib runpy py line in run module as main n main mod spec n file home kbuilder pyenv versions lib runpy py line in run code n exec code run globals n file tmpfs src github synthtool synthtool main py line in n main n file tmpfs src github synthtool env lib site packages click core py line in call n return self main args kwargs n file tmpfs src github synthtool env lib site packages click core py line in main n rv self invoke ctx n file tmpfs src github synthtool env lib site packages click core py line in invoke n return ctx invoke self callback ctx params n file tmpfs src github synthtool env lib site packages click core py line in invoke n return callback args kwargs n file tmpfs src github synthtool synthtool main py line in main n spec loader exec module synth module type ignore n file line in exec module n file line in call with frames removed n file home kbuilder cache synthtool nodejs web risk synth py line in n version version n file tmpfs src github synthtool synthtool gcp gapic microgenerator py line in typescript library n return self generate code service version typescript kwargs n file tmpfs src github synthtool synthtool gcp gapic microgenerator py line in generate code n f code generation seemed to succeed but output dir is empty nruntimeerror code generation seemed to succeed but tmpfs tmp is empty autosynth synthesis failed autosynth running git reset hard head nhead is now at fix version rearrange 
autosynth running git checkout autosynth nswitched to branch autosynth autosynth running git clean fdx nremoving pycache ntraceback most recent call last n file home kbuilder pyenv versions lib runpy py line in run module as main n main mod spec n file home kbuilder pyenv versions lib runpy py line in run code n exec code run globals n file tmpfs src github synthtool autosynth synth py line in n main n file tmpfs src github synthtool autosynth synth py line in main n return inner main temp dir n file tmpfs src github synthtool autosynth synth py line in inner main n commit count synthesize loop x multiple prs change pusher synthesizer n file tmpfs src github synthtool autosynth synth py line in synthesize loop n synthesize inner loop toolbox synthesizer n file tmpfs src github synthtool autosynth synth py line in synthesize inner loop n synthesizer len toolbox versions n file tmpfs src github synthtool autosynth synth py line in synthesize version in new branch n synthesizer synthesize synth log path self environ n file tmpfs src github synthtool autosynth synthesizer py line in synthesize n synth proc check returncode raise an exception n file home kbuilder pyenv versions lib subprocess py line in check returncode n self stderr nsubprocess calledprocesserror command returned non zero exit status n google internal developers can see the full log
0
597,663
18,168,412,987
IssuesEvent
2021-09-27 17:00:16
googleapis/google-cloud-go
https://api.github.com/repos/googleapis/google-cloud-go
closed
bigquery: TestIntegration_RemoveTimePartitioning failed
type: bug api: bigquery priority: p1 flakybot: issue
This test failed! To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot). If I'm commenting on this issue too often, add the `flakybot: quiet` label and I will stop commenting. --- commit: 05fe61c5aa4860bdebbbe3e91a9afaba16aa6184 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/0a012b04-a3d5-4041-8b71-725d6359a89a), [Sponge](http://sponge2/0a012b04-a3d5-4041-8b71-725d6359a89a) status: failed <details><summary>Test output</summary><br><pre> integration_test.go:663: googleapi: Error 503: Error encountered during execution. Retrying may solve the problem., backendError</pre></details>
1.0
bigquery: TestIntegration_RemoveTimePartitioning failed - This test failed! To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot). If I'm commenting on this issue too often, add the `flakybot: quiet` label and I will stop commenting. --- commit: 05fe61c5aa4860bdebbbe3e91a9afaba16aa6184 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/0a012b04-a3d5-4041-8b71-725d6359a89a), [Sponge](http://sponge2/0a012b04-a3d5-4041-8b71-725d6359a89a) status: failed <details><summary>Test output</summary><br><pre> integration_test.go:663: googleapi: Error 503: Error encountered during execution. Retrying may solve the problem., backendError</pre></details>
non_process
bigquery testintegration removetimepartitioning failed this test failed to configure my behavior see if i m commenting on this issue too often add the flakybot quiet label and i will stop commenting commit buildurl status failed test output integration test go googleapi error error encountered during execution retrying may solve the problem backenderror
0
357,248
25,176,352,130
IssuesEvent
2022-11-11 09:36:21
arnav-ag/pe
https://api.github.com/repos/arnav-ag/pe
opened
[DG] NFR Scope
severity.Low type.DocumentationBug
For point 4: ![image.png](https://raw.githubusercontent.com/arnav-ag/pe/main/files/2493f7a4-f643-4975-96fd-819881cc9ae0.png) It is hard to know when it is achieved. How do we know if the workload indicators are easily distinguishable? On the first run of using the application, I did not notice the indicators, so do I consider that being indistinguishable? <!--session: 1668153189126-cbf942b0-4f05-4a2e-b24a-142a2981a703--> <!--Version: Web v3.4.4-->
1.0
[DG] NFR Scope - For point 4: ![image.png](https://raw.githubusercontent.com/arnav-ag/pe/main/files/2493f7a4-f643-4975-96fd-819881cc9ae0.png) It is hard to know when it is achieved. How do we know if the workload indicators are easily distinguishable? On the first run of using the application, I did not notice the indicators, so do I consider that being indistinguishable? <!--session: 1668153189126-cbf942b0-4f05-4a2e-b24a-142a2981a703--> <!--Version: Web v3.4.4-->
non_process
nfr scope for point it is hard to know when it is achieved how do we know if the workload indicators are easily distinguishable on the first run of using the application i did not notice the indicators so do i consider that being indistinguishable
0
75,829
26,082,594,755
IssuesEvent
2022-12-25 15:57:53
vector-im/element-ios
https://api.github.com/repos/vector-im/element-ios
opened
Loudspeaker not automatically activated when giving a video call - Loudspeaker's icon not explicit
T-Defect
### Steps to reproduce Where are you starting? What can you see? One of my relative uses an iphone 8 with Element 1.9.14 Problem 1. When she calls me by video, the loudspeaker is not automatically activated. Also the loudspeaker's icon is greyed (see screenshot), so the loudspeaker's status is not very clear as it could be, for example the icon could be striked through when deactivated. Problem 2. When she taps on the loudspeaker's icon, the sound comes from the loudspeaker as attended but the icon don't change (it stays greyed). ### Outcome #### What did you expect? When the iphone of my relative calls me by video, the loudspeaker should be activated on her iphone. Also the loudspeaker's icon should: - change when loudspeaker is activated/deactivated - be more explicit (striked through = deactivated, not striked through = activated). #### What happened instead? The loudspeaker is not activated when my relative calls me by video. The meaning of the loudspeaker's icon is not explicit enough. The icon does not change when toggled. ### Your phone model iphone 8 ### Operating system version _No response_ ### Application version Element 1.9.14 ### Homeserver Matrix.org ### Will you send logs? No
1.0
Loudspeaker not automatically activated when giving a video call - Loudspeaker's icon not explicit - ### Steps to reproduce Where are you starting? What can you see? One of my relative uses an iphone 8 with Element 1.9.14 Problem 1. When she calls me by video, the loudspeaker is not automatically activated. Also the loudspeaker's icon is greyed (see screenshot), so the loudspeaker's status is not very clear as it could be, for example the icon could be striked through when deactivated. Problem 2. When she taps on the loudspeaker's icon, the sound comes from the loudspeaker as attended but the icon don't change (it stays greyed). ### Outcome #### What did you expect? When the iphone of my relative calls me by video, the loudspeaker should be activated on her iphone. Also the loudspeaker's icon should: - change when loudspeaker is activated/deactivated - be more explicit (striked through = deactivated, not striked through = activated). #### What happened instead? The loudspeaker is not activated when my relative calls me by video. The meaning of the loudspeaker's icon is not explicit enough. The icon does not change when toggled. ### Your phone model iphone 8 ### Operating system version _No response_ ### Application version Element 1.9.14 ### Homeserver Matrix.org ### Will you send logs? No
non_process
loudspeaker not automatically activated when giving a video call loudspeaker s icon not explicit steps to reproduce where are you starting what can you see one of my relative uses an iphone with element problem when she calls me by video the loudspeaker is not automatically activated also the loudspeaker s icon is greyed see screenshot so the loudspeaker s status is not very clear as it could be for example the icon could be striked through when deactivated problem when she taps on the loudspeaker s icon the sound comes from the loudspeaker as attended but the icon don t change it stays greyed outcome what did you expect when the iphone of my relative calls me by video the loudspeaker should be activated on her iphone also the loudspeaker s icon should change when loudspeaker is activated deactivated be more explicit striked through deactivated not striked through activated what happened instead the loudspeaker is not activated when my relative calls me by video the meaning of the loudspeaker s icon is not explicit enough the icon does not change when toggled your phone model iphone operating system version no response application version element homeserver matrix org will you send logs no
0
9,491
12,484,206,101
IssuesEvent
2020-05-30 13:36:35
underdocs/underdocs
https://api.github.com/repos/underdocs/underdocs
opened
Preprocessor Conditional as Entity
area: parser area: renderer epic: preprocessor type: epic :100:
## Brief Promote preprocessor conditionals to separate, documentable entities. ## Background There are multiple reasons why this change would be beneficial: * complex conditionals deserve their own documentation, so that clients would know how different values of macros would affect compilation, * we would be able to show for each declaration which conditionals affect them, * optionally, we can even evaluate conditionals on the client side.
1.0
Preprocessor Conditional as Entity - ## Brief Promote preprocessor conditionals to separate, documentable entities. ## Background There are multiple reasons why this change would be beneficial: * complex conditionals deserve their own documentation, so that clients would know how different values of macros would affect compilation, * we would be able to show for each declaration which conditionals affect them, * optionally, we can even evaluate conditionals on the client side.
process
preprocessor conditional as entity brief promote preprocessor conditionals to separate documentable entities background there are multiple reasons why this change would be beneficial complex conditionals deserve their own documentation so that clients would know how different values of macros would affect compilation we would be able to show for each declaration which conditionals affect them optionally we can even evaluate conditionals on the client side
1
21,625
30,024,821,553
IssuesEvent
2023-06-27 04:48:33
h4sh5/pypi-auto-scanner
https://api.github.com/repos/h4sh5/pypi-auto-scanner
opened
pih 1.45 has 2 GuardDog issues
guarddog typosquatting silent-process-execution
https://pypi.org/project/pih https://inspector.pypi.io/project/pih ```{ "dependency": "pih", "version": "1.45", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pip, pid", "silent-process-execution": [ { "location": "pih-1.45/pih/tools.py:719", "code": " result = subprocess.run(command, stdin=subprocess.DEVNULL, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmpxbynfoez/pih" } }```
1.0
pih 1.45 has 2 GuardDog issues - https://pypi.org/project/pih https://inspector.pypi.io/project/pih ```{ "dependency": "pih", "version": "1.45", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pip, pid", "silent-process-execution": [ { "location": "pih-1.45/pih/tools.py:719", "code": " result = subprocess.run(command, stdin=subprocess.DEVNULL, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmpxbynfoez/pih" } }```
process
pih has guarddog issues dependency pih version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt pip pid silent process execution location pih pih tools py code result subprocess run command stdin subprocess devnull stdout subprocess devnull stderr subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmpxbynfoez pih
1
139,953
20,986,804,777
IssuesEvent
2022-03-29 04:41:40
batch-dart/batch.dart
https://api.github.com/repos/batch-dart/batch.dart
opened
Support logging features in parallel processing
Feedback: improvement Type: design
<!-- When reporting improvement, please read this complete template and fill all the questions in order to get a better response --> # 1. What could be improved <!-- What part of the code/functionality could be improved? --> Support logging features in parallel processing. # 2. Why should this be improved <!-- Why is this necessary to be improved? --> Design a process that enables logging during parallel processing, since logger instances created by the main thread cannot be shared during parallel processing. # 3. Any risks? <!-- Are there any risks in improving this? Will the API change? Will other functionality change? --> No. # 4. More information <!-- Do you have any other useful information about this improvement report? Please write it down here --> <!-- Possible helpful information: references to other sites/repositories --> <!-- Are you interested in working on a PR for this? -->
1.0
Support logging features in parallel processing - <!-- When reporting improvement, please read this complete template and fill all the questions in order to get a better response --> # 1. What could be improved <!-- What part of the code/functionality could be improved? --> Support logging features in parallel processing. # 2. Why should this be improved <!-- Why is this necessary to be improved? --> Design a process that enables logging during parallel processing, since logger instances created by the main thread cannot be shared during parallel processing. # 3. Any risks? <!-- Are there any risks in improving this? Will the API change? Will other functionality change? --> No. # 4. More information <!-- Do you have any other useful information about this improvement report? Please write it down here --> <!-- Possible helpful information: references to other sites/repositories --> <!-- Are you interested in working on a PR for this? -->
non_process
support logging features in parallel processing what could be improved support logging features in parallel processing why should this be improved design a process that enables logging during parallel processing since logger instances created by the main thread cannot be shared during parallel processing any risks no more information
0
612,090
18,990,616,869
IssuesEvent
2021-11-22 06:39:35
rafinkanisa/ngm-reportDesk
https://api.github.com/repos/rafinkanisa/ngm-reportDesk
closed
RH AF Health Updates Activity Name
priority
As requested by the Health cluster Kindly update the following activity into a new name Risk Community and Community Engagement to : Risk Communication and Community Engagement Required todo : Update the database
1.0
RH AF Health Updates Activity Name - As requested by the Health cluster Kindly update the following activity into a new name Risk Community and Community Engagement to : Risk Communication and Community Engagement Required todo : Update the database
non_process
rh af health updates activity name as requested by the health cluster kindly update the following activity into a new name risk community and community engagement to risk communication and community engagement required todo update the database
0
5,378
5,672,304,296
IssuesEvent
2017-04-12 00:50:00
LiskHQ/lisk
https://api.github.com/repos/LiskHQ/lisk
closed
SQL performance & flexibility
enhancement performance security
- [x] SQL performance & flexibility - [x] Review queries and test their performance, find slow queries/views - [x] Improve performance for existing queries/view (if needed) - [x] Review data types performance & consistency From: #449
True
SQL performance & flexibility - - [x] SQL performance & flexibility - [x] Review queries and test their performance, find slow queries/views - [x] Improve performance for existing queries/view (if needed) - [x] Review data types performance & consistency From: #449
non_process
sql performance flexibility sql performance flexibility review queries and test their performance find slow queries views improve performance for existing queries view if needed review data types performance consistency from
0
345,491
24,862,202,710
IssuesEvent
2022-10-27 09:08:23
woocommerce/woocommerce-blocks
https://api.github.com/repos/woocommerce/woocommerce-blocks
opened
Document how to use @woocommerce/shared-hocs
type: documentation type: cooldown
In #7454, @Tropicalista asked how to use `@woocommerce/shared-hocs` as this was not clear in our docs. This issue aims to improve our docs by explaining how to use `@woocommerce/shared-hocs` and showing an example, e.g. https://github.com/woocommerce/woocommerce-blocks/issues/7454#issuecomment-1289519939.
1.0
Document how to use @woocommerce/shared-hocs - In #7454, @Tropicalista asked how to use `@woocommerce/shared-hocs` as this was not clear in our docs. This issue aims to improve our docs by explaining how to use `@woocommerce/shared-hocs` and showing an example, e.g. https://github.com/woocommerce/woocommerce-blocks/issues/7454#issuecomment-1289519939.
non_process
document how to use woocommerce shared hocs in tropicalista asked how to use woocommerce shared hocs as this was not clear in our docs this issue aims to improve our docs by explaining how to use woocommerce shared hocs and showing an example e g
0
661,707
22,066,191,533
IssuesEvent
2022-05-31 03:41:01
status-im/status-desktop
https://api.github.com/repos/status-im/status-desktop
opened
The date is shown when trying to copy Profile URL on the Profile
bug priority 3: low Settings
# Bug Report ## Description <!-- Provide a short description describing the problem you are experiencing. --> ## Steps to reproduce 1. Open a public chat and click on any name of the members 2. Click 'View Profile' - click on the icon to copy the Profile URL #### Expected behavior The date is not shown while using on the Profile page #### Actual behavior The date appears under the tip 'Copy to clipboard'. It looks like the Chat is active under the Profile page ### Additional Information <img width="1116" alt="image" src="https://user-images.githubusercontent.com/14942081/171087700-536791ec-f7f8-4b68-896c-478c5678b37b.png"> Status desktop version: [Build #2137 (May 30, 2022, 9:02:25 AM)](https://ci.status.im/job/status-desktop/job/branches/job/macos/job/master/2137/) Operating System: macOS Monterey 12.3 Beta (21E5227a) M1
1.0
The date is shown when trying to copy Profile URL on the Profile - # Bug Report ## Description <!-- Provide a short description describing the problem you are experiencing. --> ## Steps to reproduce 1. Open a public chat and click on any name of the members 2. Click 'View Profile' - click on the icon to copy the Profile URL #### Expected behavior The date is not shown while using on the Profile page #### Actual behavior The date appears under the tip 'Copy to clipboard'. It looks like the Chat is active under the Profile page ### Additional Information <img width="1116" alt="image" src="https://user-images.githubusercontent.com/14942081/171087700-536791ec-f7f8-4b68-896c-478c5678b37b.png"> Status desktop version: [Build #2137 (May 30, 2022, 9:02:25 AM)](https://ci.status.im/job/status-desktop/job/branches/job/macos/job/master/2137/) Operating System: macOS Monterey 12.3 Beta (21E5227a) M1
non_process
the date is shown when trying to copy profile url on the profile bug report description steps to reproduce open a public chat and click on any name of the members click view profile click on the icon to copy the profile url expected behavior the date is not shown while using on the profile page actual behavior the date appears under the tip copy to clipboard it looks like the chat is active under the profile page additional information img width alt image src status desktop version operating system macos monterey beta
0
10,058
13,044,161,774
IssuesEvent
2020-07-29 03:47:26
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `AddDateDatetimeString` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `AddDateDatetimeString` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `AddDateDatetimeString` from TiDB - ## Description Port the scalar function `AddDateDatetimeString` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function adddatedatetimestring from tidb description port the scalar function adddatedatetimestring from tidb to coprocessor score mentor s breeswish recommended skills rust programming learning materials already implemented expressions ported from tidb
1
206,312
23,374,924,098
IssuesEvent
2022-08-11 01:08:14
Galaxy-Software-Service/WebGoat
https://api.github.com/repos/Galaxy-Software-Service/WebGoat
opened
CVE-2022-31160 (Medium) detected in jquery-ui-1.12.1.min.js
security vulnerability
## CVE-2022-31160 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-ui-1.12.1.min.js</b></p></summary> <p>A curated set of user interface interactions, effects, widgets, and themes built on top of the jQuery JavaScript Library.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js">https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js</a></p> <p>Path to vulnerable library: /webgoat-container/src/main/resources/static/js/libs/jquery-ui.min.js,/webgoat-container/target/classes/static/js/libs/jquery-ui.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-ui-1.12.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Galaxy-Software-Service/WebGoat/commit/12040d57fd51ffa0b6407f8b4e9cc04e47656d2d">12040d57fd51ffa0b6407f8b4e9cc04e47656d2d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery UI is a curated set of user interface interactions, effects, widgets, and themes built on top of jQuery. Versions prior to 1.13.2 are potentially vulnerable to cross-site scripting. Initializing a checkboxradio widget on an input enclosed within a label makes that parent label contents considered as the input label. Calling `.checkboxradio( "refresh" )` on such a widget and the initial HTML contained encoded HTML entities will make them erroneously get decoded. This can lead to potentially executing JavaScript code. The bug has been patched in jQuery UI 1.13.2. To remediate the issue, someone who can change the initial HTML can wrap all the non-input contents of the `label` in a `span`. <p>Publish Date: 2022-07-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31160>CVE-2022-31160</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160</a></p> <p>Release Date: 2022-07-20</p> <p>Fix Resolution: jquery-ui - 1.13.2</p> </p> </details> <p></p>
True
CVE-2022-31160 (Medium) detected in jquery-ui-1.12.1.min.js - ## CVE-2022-31160 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-ui-1.12.1.min.js</b></p></summary> <p>A curated set of user interface interactions, effects, widgets, and themes built on top of the jQuery JavaScript Library.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js">https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js</a></p> <p>Path to vulnerable library: /webgoat-container/src/main/resources/static/js/libs/jquery-ui.min.js,/webgoat-container/target/classes/static/js/libs/jquery-ui.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-ui-1.12.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Galaxy-Software-Service/WebGoat/commit/12040d57fd51ffa0b6407f8b4e9cc04e47656d2d">12040d57fd51ffa0b6407f8b4e9cc04e47656d2d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery UI is a curated set of user interface interactions, effects, widgets, and themes built on top of jQuery. Versions prior to 1.13.2 are potentially vulnerable to cross-site scripting. Initializing a checkboxradio widget on an input enclosed within a label makes that parent label contents considered as the input label. Calling `.checkboxradio( "refresh" )` on such a widget and the initial HTML contained encoded HTML entities will make them erroneously get decoded. This can lead to potentially executing JavaScript code. The bug has been patched in jQuery UI 1.13.2. To remediate the issue, someone who can change the initial HTML can wrap all the non-input contents of the `label` in a `span`. <p>Publish Date: 2022-07-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31160>CVE-2022-31160</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160</a></p> <p>Release Date: 2022-07-20</p> <p>Fix Resolution: jquery-ui - 1.13.2</p> </p> </details> <p></p>
non_process
cve medium detected in jquery ui min js cve medium severity vulnerability vulnerable library jquery ui min js a curated set of user interface interactions effects widgets and themes built on top of the jquery javascript library library home page a href path to vulnerable library webgoat container src main resources static js libs jquery ui min js webgoat container target classes static js libs jquery ui min js dependency hierarchy x jquery ui min js vulnerable library found in head commit a href found in base branch master vulnerability details jquery ui is a curated set of user interface interactions effects widgets and themes built on top of jquery versions prior to are potentially vulnerable to cross site scripting initializing a checkboxradio widget on an input enclosed within a label makes that parent label contents considered as the input label calling checkboxradio refresh on such a widget and the initial html contained encoded html entities will make them erroneously get decoded this can lead to potentially executing javascript code the bug has been patched in jquery ui to remediate the issue someone who can change the initial html can wrap all the non input contents of the label in a span publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery ui
0
218,298
16,758,603,678
IssuesEvent
2021-06-13 10:12:54
bounswe/2021SpringGroup9
https://api.github.com/repos/bounswe/2021SpringGroup9
opened
Write Evaluation of Tools and Processes for Milestone Report-2
difficulty: medium documentation practice-app priority: critical
I will write the evaluation for some tools and processes. @mecolak will also work on evaluations for other tools: #189 My tools are: - [ ] AWS - [ ] Docker - [ ] Git - [ ] GitHub - [ ] VSCode
1.0
Write Evaluation of Tools and Processes for Milestone Report-2 - I will write the evaluation for some tools and processes. @mecolak will also work on evaluations for other tools: #189 My tools are: - [ ] AWS - [ ] Docker - [ ] Git - [ ] GitHub - [ ] VSCode
non_process
write evaluation of tools and processes for milestone report i will write the evaluation for some tools and processes mecolak will also work on evaluations for other tools my tools are aws docker git github vscode
0
19,668
26,027,875,141
IssuesEvent
2022-12-21 17:59:59
hashgraph/hedera-mirror-node
https://api.github.com/repos/hashgraph/hedera-mirror-node
closed
Automated release workflow fails at step Install JDK
bug regression process
### Description The automated release which works with the deprecated maven build system fails at step `Install SDK`: ``` Run actions/setup-java@v3 Installed distributions Creating settings.xml with server-id: github Writing to /home/runner/.m2/settings.xml Error: No file in /home/runner/work/hedera-mirror-node/hedera-mirror-node matched to [**/pom.xml], make sure you have checked out the target repository ``` ### Steps to reproduce https://github.com/hashgraph/hedera-mirror-node/actions/runs/3751326043/jobs/6372181877 ### Additional context _No response_ ### Hedera network other ### Version v0.72.0-SNAPSHOT ### Operating system None
1.0
Automated release workflow fails at step Install JDK - ### Description The automated release which works with the deprecated maven build system fails at step `Install SDK`: ``` Run actions/setup-java@v3 Installed distributions Creating settings.xml with server-id: github Writing to /home/runner/.m2/settings.xml Error: No file in /home/runner/work/hedera-mirror-node/hedera-mirror-node matched to [**/pom.xml], make sure you have checked out the target repository ``` ### Steps to reproduce https://github.com/hashgraph/hedera-mirror-node/actions/runs/3751326043/jobs/6372181877 ### Additional context _No response_ ### Hedera network other ### Version v0.72.0-SNAPSHOT ### Operating system None
process
automated release workflow fails at step install jdk description the automated release which works with the deprecated maven build system fails at step install sdk run actions setup java installed distributions creating settings xml with server id github writing to home runner settings xml error no file in home runner work hedera mirror node hedera mirror node matched to make sure you have checked out the target repository steps to reproduce additional context no response hedera network other version snapshot operating system none
1
21,929
30,446,559,208
IssuesEvent
2023-07-15 18:48:30
h4sh5/pypi-auto-scanner
https://api.github.com/repos/h4sh5/pypi-auto-scanner
opened
pyutils 0.0.1b10 has 2 GuardDog issues
guarddog typosquatting silent-process-execution
https://pypi.org/project/pyutils https://inspector.pypi.io/project/pyutils ```{ "dependency": "pyutils", "version": "0.0.1b10", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils", "silent-process-execution": [ { "location": "pyutils/exec_utils.py/pyutils/exec_utils.py:205", "code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmph_c__4ti/pyutils" } }```
1.0
pyutils 0.0.1b10 has 2 GuardDog issues - https://pypi.org/project/pyutils https://inspector.pypi.io/project/pyutils ```{ "dependency": "pyutils", "version": "0.0.1b10", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils", "silent-process-execution": [ { "location": "pyutils/exec_utils.py/pyutils/exec_utils.py:205", "code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmph_c__4ti/pyutils" } }```
process
pyutils has guarddog issues dependency pyutils version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt pytils python utils silent process execution location pyutils exec utils py pyutils exec utils py code subproc subprocess popen n args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmph c pyutils
1
89
2,534,632,073
IssuesEvent
2015-01-25 05:23:23
rg3/youtube-dl
https://api.github.com/repos/rg3/youtube-dl
closed
Write a wiki page on how to use axel
postprocessors request
Or another external downloader. We should even implement bindings for some common external downloaders... Like "have axel or curl or wget in your path and use this option and we will call 'em"
1.0
Write a wiki page on how to use axel - Or another external downloader. We should even implement bindings for some common external downloaders... Like "have axel or curl or wget in your path and use this option and we will call 'em"
process
write a wiki page on how to use axel or another external downloader we should even implement bindings for some common external downloaders like have axel or curl or wget in your path and use this option and we will call em
1
2,405
5,193,154,559
IssuesEvent
2017-01-22 16:34:08
raphym/Simulation_of_message_routing_by_intelligent_agents
https://api.github.com/repos/raphym/Simulation_of_message_routing_by_intelligent_agents
opened
backbone discover
being processed
Add to the function runBFS() the reciprocal. if x is in the quorum A so also A is in the quorum B
1.0
backbone discover - Add to the function runBFS() the reciprocal. if x is in the quorum A so also A is in the quorum B
process
backbone discover add to the function runbfs the reciprocal if x is in the quorum a so also a is in the quorum b
1
979
3,438,008,537
IssuesEvent
2015-12-13 17:36:37
pwittchen/ReactiveSensors
https://api.github.com/repos/pwittchen/ReactiveSensors
closed
Release 0.1.1
release process
**Initial release notes**: - bumped RxJava dependency to v. 1.1.0 - bumped RxAndroid dependency to v. 1.1.0 - bumped Google Truth test dependency to v. 0.27 - bumped Gradle Build Tools to v. 1.3.1 **Things to do**: - [x] bump library version - [x] upload archives to Maven Central - [x] close and release artifact on Maven Central - [x] update `CHANGELOG.md` after Maven Sync - [x] bump library version in `README.md` - [x] create new GitHub release
1.0
Release 0.1.1 - **Initial release notes**: - bumped RxJava dependency to v. 1.1.0 - bumped RxAndroid dependency to v. 1.1.0 - bumped Google Truth test dependency to v. 0.27 - bumped Gradle Build Tools to v. 1.3.1 **Things to do**: - [x] bump library version - [x] upload archives to Maven Central - [x] close and release artifact on Maven Central - [x] update `CHANGELOG.md` after Maven Sync - [x] bump library version in `README.md` - [x] create new GitHub release
process
release initial release notes bumped rxjava dependency to v bumped rxandroid dependency to v bumped google truth test dependency to v bumped gradle build tools to v things to do bump library version upload archives to maven central close and release artifact on maven central update changelog md after maven sync bump library version in readme md create new github release
1
13,934
16,701,484,739
IssuesEvent
2021-06-09 03:30:25
googleapis/python-api-core
https://api.github.com/repos/googleapis/python-api-core
closed
pytype is failing for standard `pkgutil.extend_path` usage
priority: p1 type: process
From [this Kokoro failure](https://source.cloud.google.com/results/invocations/c887f16f-87c2-4dcc-acb3-a067d0c4685c/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-api-core%2Fpresubmit%2Fpresubmit/log) (there are lots): ```\ nox > Running session pytype nox > Creating virtual environment (virtualenv) using python3.6 in .nox/pytype nox > python -m pip install . grpcio >= 1.8.2 grpcio-gcp >= 0.2.2 pytype >= 2019.3.21 nox > pytype Computing dependencies Analyzing 37 sources with 0 local dependencies ninja: Entering directory `.pytype' [1/37] check google.api_core.gapic_v1.routing_header [2/37] check google.api_core.datetime_helpers [3/37] check google.api_core.exceptions [4/37] check google.api_core.general_helpers [5/37] check google.api_core.timeout [6/37] check google.api_core.retry [7/37] check google.api_core.retry_async [8/37] check google.api_core.version [9/37] check google.api_core.gapic_v1.config [10/37] check google.api_core.gapic_v1.config_async [11/37] check google.api_core.grpc_helpers [12/37] check google.api_core.grpc_helpers_async [13/37] check google.api_core.client_info [14/37] check google.api_core.gapic_v1.client_info [15/37] check google.api_core.gapic_v1.method [16/37] check google.api_core.gapic_v1.method_async [17/37] check google.api_core.gapic_v1.__init__ [18/37] check google.api_core.protobuf_helpers [19/37] check google.api_core.operations_v1.operations_client_config [20/37] check google.api_core.page_iterator [21/37] check google.api_core.operations_v1.operations_client [22/37] check google.api_core.page_iterator_async [23/37] check google.api_core.operations_v1.operations_async_client [24/37] check google.api_core.operations_v1.__init__ [25/37] check google.api_core.future.base [26/37] check google.api_core.future.async_future [27/37] check google.api_core.operation_async [28/37] check google.api_core.future._helpers [29/37] check google.api_core.iam [30/37] check google.api_core.future.polling [31/37] check google.api_core.operation [32/37] check google.__init__ FAILED: /tmpfs/src/github/python-api-core/.pytype/pyi/google/__init__.pyi /tmpfs/src/github/python-api-core/.nox/pytype/bin/python -m pytype.single --disable pyi-error --imports_info /tmpfs/src/github/python-api-core/.pytype/imports/google.__init__.imports --module-name google.__init__ -V 3.6 -o /tmpfs/src/github/python-api-core/.pytype/pyi/google/__init__.pyi --analyze-annotated --nofail --quick /tmpfs/src/github/python-api-core/google/__init__.py File "/tmpfs/src/github/python-api-core/google/__init__.py", line 24, in <module>: Function pkgutil.extend_path was called with the wrong arguments [wrong-arg-types] Expected: (path: List[str], ...) Actually passed: (path: Iterable[str], ...) For more details, see https://google.github.io/pytype/errors.html#wrong-arg-types ninja: build stopped: subcommand failed. Leaving directory '.pytype' nox > Command pytype failed with exit code 1 nox > Session pytype failed. ``` I suspect we should upgrade to a newer version of `pytype` -- the [current release](https://github.com/google/pytype/releases/tag/2021.05.11) is more than two years newer than the one we are using, and e.g. supports Python 3.9.
1.0
pytype is failing for standard `pkgutil.extend_path` usage - From [this Kokoro failure](https://source.cloud.google.com/results/invocations/c887f16f-87c2-4dcc-acb3-a067d0c4685c/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-api-core%2Fpresubmit%2Fpresubmit/log) (there are lots): ```\ nox > Running session pytype nox > Creating virtual environment (virtualenv) using python3.6 in .nox/pytype nox > python -m pip install . grpcio >= 1.8.2 grpcio-gcp >= 0.2.2 pytype >= 2019.3.21 nox > pytype Computing dependencies Analyzing 37 sources with 0 local dependencies ninja: Entering directory `.pytype' [1/37] check google.api_core.gapic_v1.routing_header [2/37] check google.api_core.datetime_helpers [3/37] check google.api_core.exceptions [4/37] check google.api_core.general_helpers [5/37] check google.api_core.timeout [6/37] check google.api_core.retry [7/37] check google.api_core.retry_async [8/37] check google.api_core.version [9/37] check google.api_core.gapic_v1.config [10/37] check google.api_core.gapic_v1.config_async [11/37] check google.api_core.grpc_helpers [12/37] check google.api_core.grpc_helpers_async [13/37] check google.api_core.client_info [14/37] check google.api_core.gapic_v1.client_info [15/37] check google.api_core.gapic_v1.method [16/37] check google.api_core.gapic_v1.method_async [17/37] check google.api_core.gapic_v1.__init__ [18/37] check google.api_core.protobuf_helpers [19/37] check google.api_core.operations_v1.operations_client_config [20/37] check google.api_core.page_iterator [21/37] check google.api_core.operations_v1.operations_client [22/37] check google.api_core.page_iterator_async [23/37] check google.api_core.operations_v1.operations_async_client [24/37] check google.api_core.operations_v1.__init__ [25/37] check google.api_core.future.base [26/37] check google.api_core.future.async_future [27/37] check google.api_core.operation_async [28/37] check google.api_core.future._helpers [29/37] check google.api_core.iam [30/37] check google.api_core.future.polling [31/37] check google.api_core.operation [32/37] check google.__init__ FAILED: /tmpfs/src/github/python-api-core/.pytype/pyi/google/__init__.pyi /tmpfs/src/github/python-api-core/.nox/pytype/bin/python -m pytype.single --disable pyi-error --imports_info /tmpfs/src/github/python-api-core/.pytype/imports/google.__init__.imports --module-name google.__init__ -V 3.6 -o /tmpfs/src/github/python-api-core/.pytype/pyi/google/__init__.pyi --analyze-annotated --nofail --quick /tmpfs/src/github/python-api-core/google/__init__.py File "/tmpfs/src/github/python-api-core/google/__init__.py", line 24, in <module>: Function pkgutil.extend_path was called with the wrong arguments [wrong-arg-types] Expected: (path: List[str], ...) Actually passed: (path: Iterable[str], ...) For more details, see https://google.github.io/pytype/errors.html#wrong-arg-types ninja: build stopped: subcommand failed. Leaving directory '.pytype' nox > Command pytype failed with exit code 1 nox > Session pytype failed. ``` I suspect we should upgrade to a newer version of `pytype` -- the [current release](https://github.com/google/pytype/releases/tag/2021.05.11) is more than two years newer than the one we are using, and e.g. supports Python 3.9.
process
pytype is failing for standard pkgutil extend path usage from there are lots nox running session pytype nox creating virtual environment virtualenv using in nox pytype nox python m pip install grpcio grpcio gcp pytype nox pytype computing dependencies analyzing sources with local dependencies ninja entering directory pytype check google api core gapic routing header check google api core datetime helpers check google api core exceptions check google api core general helpers check google api core timeout check google api core retry check google api core retry async check google api core version check google api core gapic config check google api core gapic config async check google api core grpc helpers check google api core grpc helpers async check google api core client info check google api core gapic client info check google api core gapic method check google api core gapic method async check google api core gapic init check google api core protobuf helpers check google api core operations operations client config check google api core page iterator check google api core operations operations client check google api core page iterator async check google api core operations operations async client check google api core operations init check google api core future base check google api core future async future check google api core operation async check google api core future helpers check google api core iam check google api core future polling check google api core operation check google init failed tmpfs src github python api core pytype pyi google init pyi tmpfs src github python api core nox pytype bin python m pytype single disable pyi error imports info tmpfs src github python api core pytype imports google init imports module name google init v o tmpfs src github python api core pytype pyi google init pyi analyze annotated nofail quick tmpfs src github python api core google init py file tmpfs src github python api core google init py line in function 
pkgutil extend path was called with the wrong arguments expected path list actually passed path iterable for more details see ninja build stopped subcommand failed leaving directory pytype nox command pytype failed with exit code nox session pytype failed i suspect we should upgrade to a newer version of pytype the is more than two years newer than the one we are using and e g supports python
1
18,012
24,031,467,298
IssuesEvent
2022-09-15 15:20:15
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
Merge envenomation/ cytolysis
multi-species process
After discussion with @FJungo - these terms will be merged: - [ ] GO:0044649 envenomation resulting in cytolysis in another organism GO:0051715 cytolysis in another organism - [ ] GO:1990142 envenomation resulting in hemolysis in another organism into GO:0044179 hemolysis in another organism Thanks, Pascale
1.0
Merge envenomation/ cytolysis - After discussion with @FJungo - these terms will be merged: - [ ] GO:0044649 envenomation resulting in cytolysis in another organism GO:0051715 cytolysis in another organism - [ ] GO:1990142 envenomation resulting in hemolysis in another organism into GO:0044179 hemolysis in another organism Thanks, Pascale
process
merge envenomation cytolysis after discussion with fjungo these terms will be merged go envenomation resulting in cytolysis in another organism go cytolysis in another organism go envenomation resulting in hemolysis in another organism into go hemolysis in another organism thanks pascale
1
203,520
15,373,125,780
IssuesEvent
2021-03-02 12:11:06
Altinn/altinn-studio
https://api.github.com/repos/Altinn/altinn-studio
closed
Establish automated tests for app frontend using cypress
area/test kind/user-story
## Description Establish automated tests for app frontend using [cypress ](https://www.cypress.io/) . For the first round, make the tests covering different areas of app frontend in localtest (refer the regression scenarios documented in altinn pedia). This is being considered to extensively test app frontend in local test before merging the changes to master which maskes it available for the app owners. ## Development tasks - [x] Write tests for all regression cases of app frontend - refer altinnpedia - [x] Set up scripts that start localtest, frontend and app with one command - [x] Set up an app that covers the different functionalities - [x] Run the tests on a locally running app frontend at altinn3local.no ## Definition of done - [x] Documentation is updated (if relevant) - [x] Technical documentation (Altinnpedia) - [x] Read me file in github - [x] QA - [x] All tasks in this userstory are closed (i.e. remaining tasks are moved to other user stories or marked obsolete)
1.0
Establish automated tests for app frontend using cypress - ## Description Establish automated tests for app frontend using [cypress ](https://www.cypress.io/) . For the first round, make the tests covering different areas of app frontend in localtest (refer the regression scenarios documented in altinn pedia). This is being considered to extensively test app frontend in local test before merging the changes to master which maskes it available for the app owners. ## Development tasks - [x] Write tests for all regression cases of app frontend - refer altinnpedia - [x] Set up scripts that start localtest, frontend and app with one command - [x] Set up an app that covers the different functionalities - [x] Run the tests on a locally running app frontend at altinn3local.no ## Definition of done - [x] Documentation is updated (if relevant) - [x] Technical documentation (Altinnpedia) - [x] Read me file in github - [x] QA - [x] All tasks in this userstory are closed (i.e. remaining tasks are moved to other user stories or marked obsolete)
non_process
establish automated tests for app frontend using cypress description establish automated tests for app frontend using for the first round make the tests covering different areas of app frontend in localtest refer the regression scenarios documented in altinn pedia this is being considered to extensively test app frontend in local test before merging the changes to master which maskes it available for the app owners development tasks write tests for all regression cases of app frontend refer altinnpedia set up scripts that start localtest frontend and app with one command set up an app that covers the different functionalities run the tests on a locally running app frontend at no definition of done documentation is updated if relevant technical documentation altinnpedia read me file in github qa all tasks in this userstory are closed i e remaining tasks are moved to other user stories or marked obsolete
0
21,476
29,511,370,798
IssuesEvent
2023-06-04 00:44:02
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
[Remoto] Tech Lead (.NET/React) na Coodesh
SALVADOR PJ JAVASCRIPT FULL-STACK MVC SQL ENTITY FRAMEWORK JSON REACT REQUISITOS REMOTO ASP.NET PROCESSOS INOVAÇÃO GITHUB AZURE EXCEL UMA LIDERANÇA METODOLOGIAS ÁGEIS TECH LEAD Stale
## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/tech-lead-fullstack-194200289?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A Supply4Med está em busca de Tech Lead para compor seu time!</p> <p>Somos uma empresa de soluções para a cadeia de suprimentos da saúde agregando inovação para alcançar excelência e otimização dos processos de compra e venda, logística, finanças, compliance e inteligência de dados, entre outros. Buscamos otimizar a cadeia de suprimentos do setor da saúde para aumentar a eficiência de todo o processo e contribuir para a melhoria da prestação de serviços médico-hospitalares e para os resultados financeiros de nossos clientes.</p> <p>Responsabilidades:</p> <ul> <li>Você será responsável por liderar tecnicamente nossa equipe de desenvolvedores e participar do desenvolvimento de novos projetos e melhorar/qualificar os produtos existentes;</li> <li>Ter conhecimento na parte de metodologias ágeis.</li> </ul> ## Supply4Med: <p>Somos uma empresa de soluções para a cadeia de suprimentos da saúde agregando inovação para alcançar excelência e otimização dos processos de compra e venda, logística, finanças, compliance e inteligência de dados, entre outros. 
Buscamos otimizar a cadeia de suprimentos do setor da saúde para aumentar a eficiência de todo o processo e contribuir para a melhoria da prestação de serviços médico-hospitalares e para os resultados financeiros de nossos clientes.</p></p> ## Habilidades: - .NET - Javascript - React.js - JSON - Microsoft SQL Server - Entity Framework - Asp.Net - MVC ## Local: 100% Remoto ## Requisitos: - Experiência Sólida em .NET; - Experiência Sólida em React; - Perfil de Liderança; - Ter conhecimento em metodologias ágeis; - Experiência em MVC, SQL Server e Json. ## Diferenciais: - Experiência com Azure. ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Tech Lead (.NET/React) na Supply4Med](https://coodesh.com/jobs/tech-lead-fullstack-194200289?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime PJ #### Categoria Full-Stack
1.0
[Remoto] Tech Lead (.NET/React) na Coodesh - ## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/tech-lead-fullstack-194200289?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A Supply4Med está em busca de Tech Lead para compor seu time!</p> <p>Somos uma empresa de soluções para a cadeia de suprimentos da saúde agregando inovação para alcançar excelência e otimização dos processos de compra e venda, logística, finanças, compliance e inteligência de dados, entre outros. Buscamos otimizar a cadeia de suprimentos do setor da saúde para aumentar a eficiência de todo o processo e contribuir para a melhoria da prestação de serviços médico-hospitalares e para os resultados financeiros de nossos clientes.</p> <p>Responsabilidades:</p> <ul> <li>Você será responsável por liderar tecnicamente nossa equipe de desenvolvedores e participar do desenvolvimento de novos projetos e melhorar/qualificar os produtos existentes;</li> <li>Ter conhecimento na parte de metodologias ágeis.</li> </ul> ## Supply4Med: <p>Somos uma empresa de soluções para a cadeia de suprimentos da saúde agregando inovação para alcançar excelência e otimização dos processos de compra e venda, logística, finanças, compliance e inteligência de dados, entre outros. 
Buscamos otimizar a cadeia de suprimentos do setor da saúde para aumentar a eficiência de todo o processo e contribuir para a melhoria da prestação de serviços médico-hospitalares e para os resultados financeiros de nossos clientes.</p></p> ## Habilidades: - .NET - Javascript - React.js - JSON - Microsoft SQL Server - Entity Framework - Asp.Net - MVC ## Local: 100% Remoto ## Requisitos: - Experiência Sólida em .NET; - Experiência Sólida em React; - Perfil de Liderança; - Ter conhecimento em metodologias ágeis; - Experiência em MVC, SQL Server e Json. ## Diferenciais: - Experiência com Azure. ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Tech Lead (.NET/React) na Supply4Med](https://coodesh.com/jobs/tech-lead-fullstack-194200289?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime PJ #### Categoria Full-Stack
process
tech lead net react na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a está em busca de tech lead para compor seu time somos uma empresa de soluções para a cadeia de suprimentos da saúde agregando inovação para alcançar excelência e otimização dos processos de compra e venda logística finanças compliance e inteligência de dados entre outros buscamos otimizar a cadeia de suprimentos do setor da saúde para aumentar a eficiência de todo o processo e contribuir para a melhoria da prestação de serviços médico hospitalares e para os resultados financeiros de nossos clientes responsabilidades você será responsável por liderar tecnicamente nossa equipe de desenvolvedores e participar do desenvolvimento de novos projetos e melhorar qualificar os produtos existentes ter conhecimento na parte de metodologias ágeis somos uma empresa de soluções para a cadeia de suprimentos da saúde agregando inovação para alcançar excelência e otimização dos processos de compra e venda logística finanças compliance e inteligência de dados entre outros buscamos otimizar a cadeia de suprimentos do setor da saúde para aumentar a eficiência de todo o processo e contribuir para a melhoria da prestação de serviços médico hospitalares e para os resultados financeiros de nossos clientes habilidades net javascript react js json microsoft sql server entity framework asp net mvc local remoto requisitos experiência sólida em net experiência sólida em react perfil de liderança ter conhecimento em metodologias ágeis experiência em mvc sql server e json diferenciais experiência com azure como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as 
interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação remoto regime pj categoria full stack
1
11,699
14,544,883,824
IssuesEvent
2020-12-15 18:49:44
pacificclimate/quail
https://api.github.com/repos/pacificclimate/quail
closed
climdex data pre-processing climdexInput.csv
process
## Description This process takes `csv` files to generate `climdexInput` object ## Function to wrap [climdexInput.csv](https://github.com/pacificclimate/climdex.pcic/blob/master/R/climdex.r#L675)
1.0
climdex data pre-processing climdexInput.csv - ## Description This process takes `csv` files to generate `climdexInput` object ## Function to wrap [climdexInput.csv](https://github.com/pacificclimate/climdex.pcic/blob/master/R/climdex.r#L675)
process
climdex data pre processing climdexinput csv description this process takes csv files to generate climdexinput object function to wrap
1
7,083
2,597,126,640
IssuesEvent
2015-02-21 03:25:27
chessmasterhong/WaterEmblem
https://api.github.com/repos/chessmasterhong/WaterEmblem
closed
Stop Spamming My Github Notifications
bug high priority wontfix
You guys are working too hard. Every time I look at github I have notifications. Please take a break. Thanks, Doc
1.0
Stop Spamming My Github Notifications - You guys are working too hard. Every time I look at github I have notifications. Please take a break. Thanks, Doc
non_process
stop spamming my github notifications you guys are working too hard every time i look at github i have notifications please take a break thanks doc
0
19,875
26,289,154,126
IssuesEvent
2023-01-08 07:07:38
varabyte/kobweb
https://api.github.com/repos/varabyte/kobweb
opened
Review Gradle tasks and see if they can be improved by opting into caching mechanisms
enhancement good first issue process
See also: https://docs.gradle.org/current/userguide/build_cache.html
1.0
Review Gradle tasks and see if they can be improved by opting into caching mechanisms - See also: https://docs.gradle.org/current/userguide/build_cache.html
process
review gradle tasks and see if they can be improved by opting into caching mechanisms see also
1
368,412
10,878,383,661
IssuesEvent
2019-11-16 17:16:25
HabitRPG/habitica
https://api.github.com/repos/HabitRPG/habitica
closed
on-hover state of tiered display names in group chats should not include a colour change
priority: medium section: Guilds section: Tavern Chat status: issue: in progress type: medium level coding
As reported by @citrusella (2d6ef231-50b4-4a22-90e7-45eb97147a2c): "Mousing/hovering over the display name of anyone with a tier turns their name dark blue (like, darker blue than even the mod color)." When you hover over the name of a player who doesn't have a tier, the colour does not change. That should be the correct behaviour for tiered players too <strike>but I'll leave this issue as suggestion-discussion for a few days then if there's no objections, I'll mark it as help wanted. FYI @Tressley</strike> By the way, when you hover over the name of any player, tiered or not, their name becomes underlined. I believe this is the correct behaviour for all players. I.e., the only on-hover effect should be an underline, not a colour change. EDIT: See the comment below from Tressley for confirmation of the desired behaviour. In addition to changing the on-hover behaviour, the :focus and :active states should be changed too if needed, as he says.
1.0
on-hover state of tiered display names in group chats should not include a colour change - As reported by @citrusella (2d6ef231-50b4-4a22-90e7-45eb97147a2c): "Mousing/hovering over the display name of anyone with a tier turns their name dark blue (like, darker blue than even the mod color)." When you hover over the name of a player who doesn't have a tier, the colour does not change. That should be the correct behaviour for tiered players too <strike>but I'll leave this issue as suggestion-discussion for a few days then if there's no objections, I'll mark it as help wanted. FYI @Tressley</strike> By the way, when you hover over the name of any player, tiered or not, their name becomes underlined. I believe this is the correct behaviour for all players. I.e., the only on-hover effect should be an underline, not a colour change. EDIT: See the comment below from Tressley for confirmation of the desired behaviour. In addition to changing the on-hover behaviour, the :focus and :active states should be changed too if needed, as he says.
non_process
on hover state of tiered display names in group chats should not include a colour change as reported by citrusella mousing hovering over the display name of anyone with a tier turns their name dark blue like darker blue than even the mod color when you hover over the name of a player who doesn t have a tier the colour does not change that should be the correct behaviour for tiered players too but i ll leave this issue as suggestion discussion for a few days then if there s no objections i ll mark it as help wanted fyi tressley by the way when you hover over the name of any player tiered or not their name becomes underlined i believe this is the correct behaviour for all players i e the only on hover effect should be an underline not a colour change edit see the comment below from tressley for confirmation of the desired behaviour in addition to changing the on hover behaviour the focus and active states should be changed too if needed as he says
0
22,240
30,792,042,975
IssuesEvent
2023-07-31 16:53:28
hashgraph/hedera-mirror-node
https://api.github.com/repos/hashgraph/hedera-mirror-node
closed
Release Checklist 0.84
enhancement process
### Problem We need a checklist to verify the release is rolled out successfully. ### Solution ## Preparation - [x] Milestone field populated on relevant [issues](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aclosed+no%3Amilestone+sort%3Aupdated-desc) - [x] Nothing open for [milestone](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aopen+sort%3Aupdated-desc+milestone%3A0.84.0) - [x] GitHub checks for branch are passing - [x] No pre-release or snapshot dependencies present in build files - [x] Automated Kubernetes deployment successful - [x] Tag release - [x] Upload release artifacts - [x] Manual Submission for GCP Marketplace verification by google - [x] Publish marketplace release - [x] Publish release ## Performance - [x] Deployed - [x] gRPC API performance tests - [x] Importer performance tests - [x] REST API performance tests ## Previewnet - [x] Deployed ## Staging - [x] Deployed ## Testnet - [x] Deployed ## Mainnet - [x] Deployed to public - [x] Deployed to private ### Alternatives _No response_
1.0
Release Checklist 0.84 - ### Problem We need a checklist to verify the release is rolled out successfully. ### Solution ## Preparation - [x] Milestone field populated on relevant [issues](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aclosed+no%3Amilestone+sort%3Aupdated-desc) - [x] Nothing open for [milestone](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aopen+sort%3Aupdated-desc+milestone%3A0.84.0) - [x] GitHub checks for branch are passing - [x] No pre-release or snapshot dependencies present in build files - [x] Automated Kubernetes deployment successful - [x] Tag release - [x] Upload release artifacts - [x] Manual Submission for GCP Marketplace verification by google - [x] Publish marketplace release - [x] Publish release ## Performance - [x] Deployed - [x] gRPC API performance tests - [x] Importer performance tests - [x] REST API performance tests ## Previewnet - [x] Deployed ## Staging - [x] Deployed ## Testnet - [x] Deployed ## Mainnet - [x] Deployed to public - [x] Deployed to private ### Alternatives _No response_
process
release checklist problem we need a checklist to verify the release is rolled out successfully solution preparation milestone field populated on relevant nothing open for github checks for branch are passing no pre release or snapshot dependencies present in build files automated kubernetes deployment successful tag release upload release artifacts manual submission for gcp marketplace verification by google publish marketplace release publish release performance deployed grpc api performance tests importer performance tests rest api performance tests previewnet deployed staging deployed testnet deployed mainnet deployed to public deployed to private alternatives no response
1
518,132
15,024,492,419
IssuesEvent
2021-02-01 19:42:52
Psychoanalytic-Electronic-Publishing/PEP-Web-User-Interface
https://api.github.com/repos/Psychoanalytic-Electronic-Publishing/PEP-Web-User-Interface
closed
Browse mode selection indicator
enhancement low priority
In Browse mode, when I select an item in the left browse list, it doesn't highlight (and stay highlighted)..which would have the benefit of identifying what I picked to see in the middle pane. It could also technically serve as a cursor position so that up and down arrow would move me to the previous and next item in the list (if that's the last pane I clicked). Sort of like the Explorer folder interface. ![image](https://user-images.githubusercontent.com/4837554/101079019-eaff2400-3574-11eb-992c-e9a451a00348.png) For that matter, that should be true in every selection list leading to another pane change.
1.0
Browse mode selection indicator - In Browse mode, when I select an item in the left browse list, it doesn't highlight (and stay highlighted)..which would have the benefit of identifying what I picked to see in the middle pane. It could also technically serve as a cursor position so that up and down arrow would move me to the previous and next item in the list (if that's the last pane I clicked). Sort of like the Explorer folder interface. ![image](https://user-images.githubusercontent.com/4837554/101079019-eaff2400-3574-11eb-992c-e9a451a00348.png) For that matter, that should be true in every selection list leading to another pane change.
non_process
browse mode selection indicator in browse mode when i select an item in the left browse list it doesn t highlight and stay highlighted which would have the benefit of identifying what i picked to see in the middle pane it could also technically serve as a cursor position so that up and down arrow would move me to the previous and next item in the list if that s the last pane i clicked sort of like the explorer folder interface for that matter that should be true in every selection list leading to another pane change
0
61,915
12,196,685,266
IssuesEvent
2020-04-29 19:25:57
eclipse/codewind
https://api.github.com/repos/eclipse/codewind
closed
Un-closable logs left over after connection is taken down
area/vscode-ide kind/bug
it's possible to have codewind project logs in the output view that can't be closed. 1. show all logs for a project 2. deactivate / remove the remote connection, or stop the local connection they're supposed to be disposed with the project, but they are not.
1.0
Un-closable logs left over after connection is taken down - it's possible to have codewind project logs in the output view that can't be closed. 1. show all logs for a project 2. deactivate / remove the remote connection, or stop the local connection they're supposed to be disposed with the project, but they are not.
non_process
un closable logs left over after connection is taken down it s possible to have codewind project logs in the output view that can t be closed show all logs for a project deactivate remove the remote connection or stop the local connection they re supposed to be disposed with the project but they are not
0
21,556
29,872,995,953
IssuesEvent
2023-06-20 09:41:39
openfoodfacts/openfoodfacts-server
https://api.github.com/repos/openfoodfacts/openfoodfacts-server
opened
fruits-vegetables-nuts-estimate-from-ingredients_100g is not reset when ingredients are deleted
bug ingredients task data quality ingredients processing
When deleting all the ingredients of a product, the `fruits-vegetables-nuts-estimate-from-ingredients_100g` and `fruits-vegetables-nuts-estimate-from-ingredients_serving` are not reset. As a consequence, it seems **it's not possible to fix `Nutrition value over 105 - Fruits vegetables nuts estimate from ingredients` data quality error**. Example: * https://world.openfoodfacts.org/product/07562174/salade-3-fromages-u (see "Contribution" panel and changes history panel) * JSON version: https://world.openfoodfacts.org/api/v0/product/07562174.json Another example on openfoodfacts.net, for tests purpose: https://world.openfoodfacts.net/cgi/product.pl?type=edit&code=3021761207849
1.0
fruits-vegetables-nuts-estimate-from-ingredients_100g is not reset when ingredients are deleted - When deleting all the ingredients of a product, the `fruits-vegetables-nuts-estimate-from-ingredients_100g` and `fruits-vegetables-nuts-estimate-from-ingredients_serving` are not reset. As a consequence, it seems **it's not possible to fix `Nutrition value over 105 - Fruits vegetables nuts estimate from ingredients` data quality error**. Example: * https://world.openfoodfacts.org/product/07562174/salade-3-fromages-u (see "Contribution" panel and changes history panel) * JSON version: https://world.openfoodfacts.org/api/v0/product/07562174.json Another example on openfoodfacts.net, for tests purpose: https://world.openfoodfacts.net/cgi/product.pl?type=edit&code=3021761207849
process
fruits vegetables nuts estimate from ingredients is not reset when ingredients are deleted when deleting all the ingredients of a product the fruits vegetables nuts estimate from ingredients and fruits vegetables nuts estimate from ingredients serving are not reset as a consequence it seems it s not possible to fix nutrition value over fruits vegetables nuts estimate from ingredients data quality error example see contribution panel and changes history panel json version another example on openfoodfacts net for tests purpose
1
111,637
17,030,586,693
IssuesEvent
2021-07-04 13:32:10
GusSand/Anubis
https://api.github.com/repos/GusSand/Anubis
opened
ADD selinux profile or security context to things that exec student code
backend housekeeping security
[k8s security context](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/) I think there were some changes to the way security contexts work in the latest Kubernetes release (1.21). Anubis runs on 1.20, but we should consider if something wont be able to be upgraded.
True
ADD selinux profile or security context to things that exec student code - [k8s security context](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/) I think there were some changes to the way security contexts work in the latest Kubernetes release (1.21). Anubis runs on 1.20, but we should consider if something wont be able to be upgraded.
non_process
add selinux profile or security context to things that exec student code i think there were some changes to the way security contexts work in the latest kubernetes release anubis runs on but we should consider if something wont be able to be upgraded
0
237,787
7,764,216,661
IssuesEvent
2018-06-01 19:26:43
vmware/vic
https://api.github.com/repos/vmware/vic
reopened
Concurrent container auto remove orphans containers on vSphere
area/docker component/vmomi-gateway kind/bug priority/p1 team/container team/foundation
I created 16 containers concurrently with --rm specified and named 2 - 17 via: `docker run -d --rm --name {#} ubuntu /bin/sh -c 'a=0; while [ $a -lt 75 ]; do echo "line $a"; a=`expr $a + 1`; sleep 2; done;'` This resulted in 16 containers: ``` docker ps CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES b07f1ab1d37a ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 5 bb7c09c14676 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 4 111fb3a28833 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 12 2c733c90a322 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 13 5062fadd9df3 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 15 15ba4f59d9ba ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 11 fee7c5d822b8 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 2 c03a83fb6a85 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 7 f81b635bceca ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 3 4c672443ec4d ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 6 6a1cb2bb5636 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 16 b0e0de57d268 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 10 6d929e8800ef ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 17 bf9edaf8f983 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 14 29ff1eb418ea ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 9 ebca47a44ccf ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 8 ``` The command provided to the container will run for roughly 2.5 minutes, so the containers will all exit at the same time. All but 2 containers were removed (as detailed in #6341): ``` docker ps -a CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES b07f1ab1d37a ubuntu "/bin/sh -c a=0; w..." 20 minutes ago Stopped 5 2c733c90a322 ubuntu "/bin/sh -c a=0; w..." 
20 minutes ago Exited (0) 16 minutes ago 13 ``` Additionally, one container was orphaned on vSphere. Meaning it was removed from the docker persona, but the containerVM wasn't removed: ``` govc ls vm /vcqaDC/vm/Discovered virtual machine /vcqaDC/vm/9-29ff1eb418ea /vcqaDC/vm/victor ``` Environment details: vSphere 6.0 u3 (nimbus) ESXi 6.0 u3 Storage: nfs vic 1.2.1 deployed to resource pool via `--use-rp` Reproducibility: easy [logs](https://github.com/vmware/vic/files/1304701/orphaned_network.zip)
1.0
Concurrent container auto remove orphans containers on vSphere - I created 16 containers concurrently with --rm specified and named 2 - 17 via: `docker run -d --rm --name {#} ubuntu /bin/sh -c 'a=0; while [ $a -lt 75 ]; do echo "line $a"; a=`expr $a + 1`; sleep 2; done;'` This resulted in 16 containers: ``` docker ps CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES b07f1ab1d37a ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 5 bb7c09c14676 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 4 111fb3a28833 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 12 2c733c90a322 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 13 5062fadd9df3 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 15 15ba4f59d9ba ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 11 fee7c5d822b8 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 2 c03a83fb6a85 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 7 f81b635bceca ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 3 4c672443ec4d ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 6 6a1cb2bb5636 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 16 b0e0de57d268 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 10 6d929e8800ef ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 17 bf9edaf8f983 ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 14 29ff1eb418ea ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 9 ebca47a44ccf ubuntu "/bin/sh -c a=0; w..." About a minute ago Up About a minute 8 ``` The command provided to the container will run for roughly 2.5 minutes, so the containers will all exit at the same time. All but 2 containers were removed (as detailed in #6341): ``` docker ps -a CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES b07f1ab1d37a ubuntu "/bin/sh -c a=0; w..." 
20 minutes ago Stopped 5 2c733c90a322 ubuntu "/bin/sh -c a=0; w..." 20 minutes ago Exited (0) 16 minutes ago 13 ``` Additionally, one container was orphaned on vSphere. Meaning it was removed from the docker persona, but the containerVM wasn't removed: ``` govc ls vm /vcqaDC/vm/Discovered virtual machine /vcqaDC/vm/9-29ff1eb418ea /vcqaDC/vm/victor ``` Environment details: vSphere 6.0 u3 (nimbus) ESXi 6.0 u3 Storage: nfs vic 1.2.1 deployed to resource pool via `--use-rp` Reproducibility: easy [logs](https://github.com/vmware/vic/files/1304701/orphaned_network.zip)
non_process
concurrent container auto remove orphans containers on vsphere i created containers concurrently with rm specified and named via docker run d rm name ubuntu bin sh c a while do echo line a a expr a sleep done this resulted in containers docker ps container id image command created status ports names ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute ubuntu bin sh c a w about a minute ago up about a minute the command provided to the container will run for roughly minutes so the containers will all exit at the same time all but containers were removed as detailed in docker ps a container id image command created status ports names ubuntu bin sh c a w minutes ago stopped ubuntu bin sh c a w minutes ago exited minutes ago additionally one container was orphaned on vsphere meaning it was removed from the docker persona but the containervm wasn t removed govc ls vm vcqadc vm discovered virtual machine vcqadc vm vcqadc vm victor environment details vsphere nimbus esxi storage nfs vic deployed to resource pool via use rp reproducibility easy
0
16,556
21,569,027,404
IssuesEvent
2022-05-02 05:12:19
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
reopened
Problems with interpreting invalid link keyrefs in XHTML and PDF outputs
bug priority/medium preprocess/keyref stale
[EXM-41117.zip](https://github.com/dita-ot/dita-ot/files/1770597/EXM-41117.zip) I attached a sample small project. We have a glossgroup which looks like this: <glossgroup id="glossgroup"> <title>GG</title> <glossentry id="ddl"> <glossterm id="gt1">GT1</glossterm> <glossdef></glossdef> </glossentry> </glossgroup> there is a key defined on it in the DITA Map: <topicref href="glossGroup.dita" keys="ggroup"/> and we have two xref keyrefs which use the key: <xref keyref="ggroup/ddl"/> <xref keyref="ggroup/gt1">DEF</xref> Both these links are invalid links, the first one references a topic ID inside the glossgroup topic, and the second one references an element ID located inside an inner topic. I publish to PDF using DITA OT 3.0.2, both links appear broken in the PDF output but there are no warnings reported by the publishing engine. I publish to XHTML using DITA OT 3.0.2, both links work, they can be clicked and the target is opened, although they should not work. For the first link the title cannot be determined which is OK. Again no warnings are reported by the publishing engine.
1.0
Problems with interpreting invalid link keyrefs in XHTML and PDF outputs - [EXM-41117.zip](https://github.com/dita-ot/dita-ot/files/1770597/EXM-41117.zip) I attached a sample small project. We have a glossgroup which looks like this: <glossgroup id="glossgroup"> <title>GG</title> <glossentry id="ddl"> <glossterm id="gt1">GT1</glossterm> <glossdef></glossdef> </glossentry> </glossgroup> there is a key defined on it in the DITA Map: <topicref href="glossGroup.dita" keys="ggroup"/> and we have two xref keyrefs which use the key: <xref keyref="ggroup/ddl"/> <xref keyref="ggroup/gt1">DEF</xref> Both these links are invalid links, the first one references a topic ID inside the glossgroup topic, and the second one references an element ID located inside an inner topic. I publish to PDF using DITA OT 3.0.2, both links appear broken in the PDF output but there are no warnings reported by the publishing engine. I publish to XHTML using DITA OT 3.0.2, both links work, they can be clicked and the target is opened, although they should not work. For the first link the title cannot be determined which is OK. Again no warnings are reported by the publishing engine.
process
problems with interpreting invalid link keyrefs in xhtml and pdf outputs i attached a sample small project we have a glossgroup which looks like this gg there is a key defined on it in the dita map and we have two xref keyrefs which use the key def both these links are invalid links the first one references a topic id inside the glossgroup topic and the second one references an element id located inside an inner topic i publish to pdf using dita ot both links appear broken in the pdf output but there are no warnings reported by the publishing engine i publish to xhtml using dita ot both links work they can be clicked and the target is opened although they should not work for the first link the title cannot be determined which is ok again no warnings are reported by the publishing engine
1
587,142
17,605,522,992
IssuesEvent
2021-08-17 16:33:27
eventespresso/barista
https://api.github.com/repos/eventespresso/barista
closed
Ticket capabilities not saving within EDTR
T: bug 🐞 C: data systems 🗑 P2: HIGH priority 😮 S:1 new 👶🏻 D: WP User Add-on 👤
Using dev branch of both EE4 and WP Users Try to set a capability on a ticket within EDTR, it looks like it saves but when you open up the ticket options again nothing saved. Refreshing the event editor and checking again shows no values for ticket cap. This happens when using either a 'standard' option or setting a custom capability. Setting the capability: https://monosnap.com/file/5Bou2rpCFmhSJT0CekwR1bFBf9N9eX ![edit ticket caps 1](https://user-images.githubusercontent.com/1751030/128749751-d4501427-8d86-463f-bba0-1d08a4a40b17.png) The ticket shows as updated: https://monosnap.com/file/yzgwCLQjLCGHIGtibD49k9fZ5EEUBP ![edit ticket caps 2](https://user-images.githubusercontent.com/1751030/128749763-11b8d8d2-64af-4a5c-92e6-8ba506049b48.png) Reopen the ticket options: https://monosnap.com/file/8CcDtqgkb1nt8V76D7FWuaWBQrTJRG ![edit ticket caps 3](https://user-images.githubusercontent.com/1751030/128749796-47bf33de-a4f9-404b-970e-6c913cbf215e.png) (Same after refreshing the event editor)
1.0
Ticket capabilities not saving within EDTR - Using dev branch of both EE4 and WP Users Try to set a capability on a ticket within EDTR, it looks like it saves but when you open up the ticket options again nothing saved. Refreshing the event editor and checking again shows no values for ticket cap. This happens when using either a 'standard' option or setting a custom capability. Setting the capability: https://monosnap.com/file/5Bou2rpCFmhSJT0CekwR1bFBf9N9eX ![edit ticket caps 1](https://user-images.githubusercontent.com/1751030/128749751-d4501427-8d86-463f-bba0-1d08a4a40b17.png) The ticket shows as updated: https://monosnap.com/file/yzgwCLQjLCGHIGtibD49k9fZ5EEUBP ![edit ticket caps 2](https://user-images.githubusercontent.com/1751030/128749763-11b8d8d2-64af-4a5c-92e6-8ba506049b48.png) Reopen the ticket options: https://monosnap.com/file/8CcDtqgkb1nt8V76D7FWuaWBQrTJRG ![edit ticket caps 3](https://user-images.githubusercontent.com/1751030/128749796-47bf33de-a4f9-404b-970e-6c913cbf215e.png) (Same after refreshing the event editor)
non_process
ticket capabilities not saving within edtr using dev branch of both and wp users try to set a capability on a ticket within edtr it looks like it saves but when you open up the ticket options again nothing saved refreshing the event editor and checking again shows no values for ticket cap this happens when using either a standard option or setting a custom capability setting the capability the ticket shows as updated reopen the ticket options same after refreshing the event editor
0
7,648
10,738,685,784
IssuesEvent
2019-10-29 15:10:07
openopps/openopps-platform
https://api.github.com/repos/openopps/openopps-platform
opened
Add "We're pulling data" modal after Next Steps
Apply Process State Dept.
Who: Applicants What: Provide modal to let them know we're pulling over USAJOBS data Why: in order to let them know there may be a time lag and also make it clear we're pulling over the data Acceptance: When a user clicks "Continue" on next steps, display a modal that lets them know we are pulling data over from USAJOBS. Confirming content with Debbie We could do a modal Title: "Importing data from USAJOBS" Message: "Please wait while we import your education, work experience, references, and languages from USAJOBS." No buttons and now "X" since you won't be able to stop this process
1.0
Add "We're pulling data" modal after Next Steps - Who: Applicants What: Provide modal to let them know we're pulling over USAJOBS data Why: in order to let them know there may be a time lag and also make it clear we're pulling over the data Acceptance: When a user clicks "Continue" on next steps, display a modal that lets them know we are pulling data over from USAJOBS. Confirming content with Debbie We could do a modal Title: "Importing data from USAJOBS" Message: "Please wait while we import your education, work experience, references, and languages from USAJOBS." No buttons and now "X" since you won't be able to stop this process
process
add we re pulling data modal after next steps who applicants what provide modal to let them know we re pulling over usajobs data why in order to let them know there may be a time lag and also make it clear we re pulling over the data acceptance when a user clicks continue on next steps display a modal that lets them know we are pulling data over from usajobs confirming content with debbie we could do a modal title importing data from usajobs message please wait while we import your education work experience references and languages from usajobs no buttons and now x since you won t be able to stop this process
1
11,996
14,737,260,615
IssuesEvent
2021-01-07 01:19:56
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Speed-E'z Exchange & Courier Service
anc-external anc-ops anc-process anc-ui anp-1 ant-bug ant-support
In GitLab by @kdjstudios on May 1, 2018, 10:46 **Submitted by:** "Joann Browne" <joann@speedez.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-01-27898/conversation **Server:** External **Client/Site:** Speed Ez **Account:** NA **Issue:** I tried to change the Usage Period and it did NOT work. My usage period is always starting the first day of the month to the last day of the month. Printing of the statements are messed up. The net terms are in the usage period place and the usage period is above it in a blank spot.
1.0
Speed-E'z Exchange & Courier Service - In GitLab by @kdjstudios on May 1, 2018, 10:46 **Submitted by:** "Joann Browne" <joann@speedez.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-01-27898/conversation **Server:** External **Client/Site:** Speed Ez **Account:** NA **Issue:** I tried to change the Usage Period and it did NOT work. My usage period is always starting the first day of the month to the last day of the month. Printing of the statements are messed up. The net terms are in the usage period place and the usage period is above it in a blank spot.
process
speed e z exchange courier service in gitlab by kdjstudios on may submitted by joann browne helpdesk server external client site speed ez account na issue i tried to change the usage period and it did not work my usage period is always starting the first day of the month to the last day of the month printing of the statements are messed up the net terms are in the usage period place and the usage period is above it in a blank spot
1
232,777
25,706,299,083
IssuesEvent
2022-12-07 01:02:55
dmyers87/surge
https://api.github.com/repos/dmyers87/surge
closed
CVE-2022-1941 (High) detected in google.protobuf.3.18.1.nupkg - autoclosed
security vulnerability
## CVE-2022-1941 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>google.protobuf.3.18.1.nupkg</b></p></summary> <p>C# runtime library for Protocol Buffers - Google's data interchange format.</p> <p>Library home page: <a href="https://api.nuget.org/packages/google.protobuf.3.18.1.nupkg">https://api.nuget.org/packages/google.protobuf.3.18.1.nupkg</a></p> <p>Path to dependency file: /modules/multilanguage-csharp-sdk/Surge.csproj</p> <p>Path to vulnerable library: /.nuget/packages/google.protobuf/3.18.1/google.protobuf.3.18.1.nupkg</p> <p> Dependency Hierarchy: - :x: **google.protobuf.3.18.1.nupkg** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/dmyers87/surge/commit/534418963c4dd1f3fb26a8f8c57115b1e35004c4">534418963c4dd1f3fb26a8f8c57115b1e35004c4</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing vulnerability for the MessageSet type in the ProtocolBuffers versions prior to and including 3.16.1, 3.17.3, 3.18.2, 3.19.4, 3.20.1 and 3.21.5 for protobuf-cpp, and versions prior to and including 3.16.1, 3.17.3, 3.18.2, 3.19.4, 3.20.1 and 4.21.5 for protobuf-python can lead to out of memory failures. A specially crafted message with multiple key-value per elements creates parsing issues, and can lead to a Denial of Service against services receiving unsanitized input. We recommend upgrading to versions 3.18.3, 3.19.5, 3.20.2, 3.21.6 for protobuf-cpp and 3.18.3, 3.19.5, 3.20.2, 4.21.6 for protobuf-python. Versions for 3.16 and 3.17 are no longer updated. 
<p>Publish Date: 2022-09-22 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-1941>CVE-2022-1941</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cloud.google.com/support/bulletins#GCP-2022-019">https://cloud.google.com/support/bulletins#GCP-2022-019</a></p> <p>Release Date: 2022-09-22</p> <p>Fix Resolution: Google.Protobuf - 3.18.3,3.19.5,3.20.2,3.21.6;protobuf-python - 3.18.3,3.19.5,3.20.2,4.21.6</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
True
CVE-2022-1941 (High) detected in google.protobuf.3.18.1.nupkg - autoclosed - ## CVE-2022-1941 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>google.protobuf.3.18.1.nupkg</b></p></summary> <p>C# runtime library for Protocol Buffers - Google's data interchange format.</p> <p>Library home page: <a href="https://api.nuget.org/packages/google.protobuf.3.18.1.nupkg">https://api.nuget.org/packages/google.protobuf.3.18.1.nupkg</a></p> <p>Path to dependency file: /modules/multilanguage-csharp-sdk/Surge.csproj</p> <p>Path to vulnerable library: /.nuget/packages/google.protobuf/3.18.1/google.protobuf.3.18.1.nupkg</p> <p> Dependency Hierarchy: - :x: **google.protobuf.3.18.1.nupkg** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/dmyers87/surge/commit/534418963c4dd1f3fb26a8f8c57115b1e35004c4">534418963c4dd1f3fb26a8f8c57115b1e35004c4</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing vulnerability for the MessageSet type in the ProtocolBuffers versions prior to and including 3.16.1, 3.17.3, 3.18.2, 3.19.4, 3.20.1 and 3.21.5 for protobuf-cpp, and versions prior to and including 3.16.1, 3.17.3, 3.18.2, 3.19.4, 3.20.1 and 4.21.5 for protobuf-python can lead to out of memory failures. A specially crafted message with multiple key-value per elements creates parsing issues, and can lead to a Denial of Service against services receiving unsanitized input. We recommend upgrading to versions 3.18.3, 3.19.5, 3.20.2, 3.21.6 for protobuf-cpp and 3.18.3, 3.19.5, 3.20.2, 4.21.6 for protobuf-python. Versions for 3.16 and 3.17 are no longer updated. 
<p>Publish Date: 2022-09-22 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-1941>CVE-2022-1941</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cloud.google.com/support/bulletins#GCP-2022-019">https://cloud.google.com/support/bulletins#GCP-2022-019</a></p> <p>Release Date: 2022-09-22</p> <p>Fix Resolution: Google.Protobuf - 3.18.3,3.19.5,3.20.2,3.21.6;protobuf-python - 3.18.3,3.19.5,3.20.2,4.21.6</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
non_process
cve high detected in google protobuf nupkg autoclosed cve high severity vulnerability vulnerable library google protobuf nupkg c runtime library for protocol buffers google s data interchange format library home page a href path to dependency file modules multilanguage csharp sdk surge csproj path to vulnerable library nuget packages google protobuf google protobuf nupkg dependency hierarchy x google protobuf nupkg vulnerable library found in head commit a href found in base branch main vulnerability details a parsing vulnerability for the messageset type in the protocolbuffers versions prior to and including and for protobuf cpp and versions prior to and including and for protobuf python can lead to out of memory failures a specially crafted message with multiple key value per elements creates parsing issues and can lead to a denial of service against services receiving unsanitized input we recommend upgrading to versions for protobuf cpp and for protobuf python versions for and are no longer updated publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution google protobuf protobuf python check this box to open an automated fix pr
0
7,421
10,542,795,089
IssuesEvent
2019-10-02 13:51:47
fablabbcn/fablabs.io
https://api.github.com/repos/fablabbcn/fablabs.io
opened
Search under backstage not working
Approval Process bug
**Describe the bug** Normal search doesn't work **To Reproduce** The search engine in the backstage is not working, to search a lab you need to: 1. Add Name of the lab 2. Filter This system will tell you that there are no labs under this name, so the right way is to: 1. Add Name of the lab 2. Select an option between (Is referee, with discourse ID, without discourse ID, with discourse sync errors) normally the one that works is with discourse ID 3. Search This will give you a list of all the labs with this name Steps to reproduce the behavior: 1. Go to 'Name' 2. Click on 'Filter' 3. Scroll down to 'see all labs under that name' 4. See error **Expected behavior** All labs under this name should appear, the problem with this is that they are not appearing on the backstage but are on the frontend **Screenshots** normal search on backstage: <img width="880" alt="Screen Shot 2019-10-02 at 8 45 40 AM" src="https://user-images.githubusercontent.com/24419466/66049526-2a555f00-e4f1-11e9-9fc1-92bcc8a4c171.png"> on fronted: <img width="760" alt="Screen Shot 2019-10-02 at 8 46 01 AM" src="https://user-images.githubusercontent.com/24419466/66049535-2de8e600-e4f1-11e9-8dfe-27cb9dfce391.png"> with discourse ID added: <img width="872" alt="Screen Shot 2019-10-02 at 8 49 44 AM" src="https://user-images.githubusercontent.com/24419466/66049770-9fc12f80-e4f1-11e9-9b3d-ceddddd57960.png"> **Desktop (please complete the following information):** - OS: macOS 10.14.6 - Browser: Chrome - Version: Version 77.0.3865.90 (Official Build) (64-bit) **Smartphone (please complete the following information):** Not tested - Device: [e.g. iPhone6] - OS: [e.g. iOS8.1] - Browser [e.g. stock browser, safari] - Version [e.g. 22] **Additional context** This is also affecting the number of labs that we can see on the backstage which differs on the number of labs that the public can see.
1.0
Search under backstage not working - **Describe the bug** Normal search doesn't work **To Reproduce** The search engine in the backstage is not working, to search a lab you need to: 1. Add Name of the lab 2. Filter This system will tell you that there are no labs under this name, so the right way is to: 1. Add Name of the lab 2. Select an option between (Is referee, with discourse ID, without discourse ID, with discourse sync errors) normally the one that works is with discourse ID 3. Search This will give you a list of all the labs with this name Steps to reproduce the behavior: 1. Go to 'Name' 2. Click on 'Filter' 3. Scroll down to 'see all labs under that name' 4. See error **Expected behavior** All labs under this name should appear, the problem with this is that they are not appearing on the backstage but are on the frontend **Screenshots** normal search on backstage: <img width="880" alt="Screen Shot 2019-10-02 at 8 45 40 AM" src="https://user-images.githubusercontent.com/24419466/66049526-2a555f00-e4f1-11e9-9fc1-92bcc8a4c171.png"> on fronted: <img width="760" alt="Screen Shot 2019-10-02 at 8 46 01 AM" src="https://user-images.githubusercontent.com/24419466/66049535-2de8e600-e4f1-11e9-8dfe-27cb9dfce391.png"> with discourse ID added: <img width="872" alt="Screen Shot 2019-10-02 at 8 49 44 AM" src="https://user-images.githubusercontent.com/24419466/66049770-9fc12f80-e4f1-11e9-9b3d-ceddddd57960.png"> **Desktop (please complete the following information):** - OS: macOS 10.14.6 - Browser: Chrome - Version: Version 77.0.3865.90 (Official Build) (64-bit) **Smartphone (please complete the following information):** Not tested - Device: [e.g. iPhone6] - OS: [e.g. iOS8.1] - Browser [e.g. stock browser, safari] - Version [e.g. 22] **Additional context** This is also affecting the number of labs that we can see on the backstage which differs on the number of labs that the public can see.
process
search under backstage not working describe the bug normal search doesn t work to reproduce the search engine in the backstage is not working to search a lab you need to add name of the lab filter this system will tell you that there are no labs under this name so the right way is to add name of the lab select an option between is referee with discourse id without discourse id with discourse sync errors normally the one that works is with discourse id search this will give you a list of all the labs with this name steps to reproduce the behavior go to name click on filter scroll down to see all labs under that name see error expected behavior all labs under this name should appear the problem with this is that they are not appearing on the backstage but are on the frontend screenshots normal search on backstage img width alt screen shot at am src on fronted img width alt screen shot at am src with discourse id added img width alt screen shot at am src desktop please complete the following information os macos browser chrome version version official build bit smartphone please complete the following information not tested device os browser version additional context this is also affecting the number of labs that we can see on the backstage which differs on the number of labs that the public can see
1
587,849
17,632,942,168
IssuesEvent
2021-08-19 10:14:53
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
applications.labor.ny.gov - see bug description
browser-firefox priority-normal os-mac engine-gecko
<!-- @browser: Firefox 91.0 --> <!-- @ua_header: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:91.0) Gecko/20100101 Firefox/91.0 --> <!-- @reported_with: unknown --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/83539 --> **URL**: https://applications.labor.ny.gov/IndividualReg/ **Browser / Version**: Firefox 91.0 **Operating System**: Mac OS X 10.15 **Tested Another Browser**: Yes Other **Problem type**: Something else **Description**: Login Error **Steps to Reproduce**: Most often, but not always, the site will was the site is busy and that I won't be able to login now, but I will immediately move over to Brave browser and get into the site with no problem - always works. I don't think the site suddenly became "unbusy" in the 10 seconds it takes to change to Brave and login with that Browser especially since it works every time this happens. Something is off with the way Firefox communicates with the site for login. Since it's The New York State Dept of Labor site it's pretty important that login should work. <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
applications.labor.ny.gov - see bug description - <!-- @browser: Firefox 91.0 --> <!-- @ua_header: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:91.0) Gecko/20100101 Firefox/91.0 --> <!-- @reported_with: unknown --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/83539 --> **URL**: https://applications.labor.ny.gov/IndividualReg/ **Browser / Version**: Firefox 91.0 **Operating System**: Mac OS X 10.15 **Tested Another Browser**: Yes Other **Problem type**: Something else **Description**: Login Error **Steps to Reproduce**: Most often, but not always, the site will was the site is busy and that I won't be able to login now, but I will immediately move over to Brave browser and get into the site with no problem - always works. I don't think the site suddenly became "unbusy" in the 10 seconds it takes to change to Brave and login with that Browser especially since it works every time this happens. Something is off with the way Firefox communicates with the site for login. Since it's The New York State Dept of Labor site it's pretty important that login should work. <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
applications labor ny gov see bug description url browser version firefox operating system mac os x tested another browser yes other problem type something else description login error steps to reproduce most often but not always the site will was the site is busy and that i won t be able to login now but i will immediately move over to brave browser and get into the site with no problem always works i don t think the site suddenly became unbusy in the seconds it takes to change to brave and login with that browser especially since it works every time this happens something is off with the way firefox communicates with the site for login since it s the new york state dept of labor site it s pretty important that login should work browser configuration none from with ❤️
0
5,284
8,071,814,327
IssuesEvent
2018-08-06 14:16:17
annip95/Gdg2_D3
https://api.github.com/repos/annip95/Gdg2_D3
opened
[P03] Bänder nach dem Break
processing
Für den letzten Teil der Animation sollen Bänder verschiedene Bewegungen machen. Hier nach Beispielen online suchen und diese umsetzen.
1.0
[P03] Bänder nach dem Break - Für den letzten Teil der Animation sollen Bänder verschiedene Bewegungen machen. Hier nach Beispielen online suchen und diese umsetzen.
process
bänder nach dem break für den letzten teil der animation sollen bänder verschiedene bewegungen machen hier nach beispielen online suchen und diese umsetzen
1
69,075
17,567,030,456
IssuesEvent
2021-08-14 00:08:12
tensorflow/tensorflow
https://api.github.com/repos/tensorflow/tensorflow
closed
Will TF2.0 build with Python3 ONLY, without Python2?
stat:awaiting tensorflower type:build/install TF 2.0 subtype: ubuntu/linux
**System information** - OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 19.10 - TensorFlow installed from (source or binary): source - TensorFlow version: 2.0 - Python version: 3.7.5 - Installed using virtualenv? pip? conda?: pip - Bazel version (if compiling from source):1.1.0 - GCC/Compiler version (if compiling from source): 9.2.1 - CUDA/cuDNN version:10.2/7.6.5 - GPU model and memory: GeForce GTX 980M/4035MiB ```console ERROR: ~/Downloads/....../tensorflow/python/keras/api/BUILD:129:1: Executing genrule //tensorflow/python/keras/api:keras_python_api_gen_compat_v2 failed (Exit 1) Traceback (most recent call last): File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 771, in <module> main() File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 767, in main lazy_loading, args.use_relative_imports) File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 625, in create_api_files api_version, compat_api_versions, lazy_loading, use_relative_imports) File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", 
line 502, in get_api_init_text _, attr = tf_decorator.unwrap(attr) File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/tf_decorator.py", line 219, in unwrap elif _has_tf_decorator_attr(cur): File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/tf_decorator.py", line 124, in _has_tf_decorator_attr hasattr(obj, '_tf_decorator') and File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/lazy_loader.py", line 62, in __getattr__ module = self._load() File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/lazy_loader.py", line 45, in _load module = importlib.import_module(self.__name__) File "/usr/lib/python3.7/importlib/__init__.py", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File "<frozen importlib._bootstrap>", line 1006, in _gcd_import File "<frozen importlib._bootstrap>", line 983, in _find_and_load File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked File "<frozen importlib._bootstrap>", line 677, in _load_unlocked File "<frozen importlib._bootstrap_external>", line 728, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File 
"~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/compiler/tf2tensorrt/wrap_py_utils.py", line 28, in <module> _wrap_py_utils = swig_import_helper() File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/compiler/tf2tensorrt/wrap_py_utils.py", line 24, in swig_import_helper _mod = imp.load_module('_wrap_py_utils', fp, pathname, description) File "/usr/lib/python3.7/imp.py", line 242, in load_module return load_dynamic(name, filename, file) File "/usr/lib/python3.7/imp.py", line 342, in load_dynamic return _load(spec) File "<frozen importlib._bootstrap>", line 696, in _load File "<frozen importlib._bootstrap>", line 670, in _load_unlocked File "<frozen importlib._bootstrap>", line 583, in module_from_spec File "<frozen importlib._bootstrap_external>", line 1043, in create_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed ImportError: ~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/compiler/tf2tensorrt/_wrap_py_utils.so: undefined symbol: _ZN15stream_executor14StreamExecutor18EnablePeerAccessToEPS0_ ---------------- Note: The failure of target //tensorflow/python/keras/api:create_tensorflow.python_api_2_keras_python_api_gen_compat_v2 (with exit code 1) may have been caused by the fact that it is a Python 2 program that was built in the host configuration, which uses Python 3. 
You can change the host configuration (for the entire build) to instead use Python 2 by setting --host_force_python=PY2. If this error started occurring in Bazel 0.27 and later, it may be because the Python toolchain now enforces that targets analyzed as PY2 and PY3 run under a Python 2 and Python 3 interpreter, respectively. See https://github.com/bazelbuild/bazel/issues/7899 for more information. ---------------- Target //tensorflow/tools/pip_package:build_pip_package failed to build Use --verbose_failures to see the command lines of failed build steps. ERROR: ~/....../tensorflow/tools/pip_package/BUILD:40:1 Executing genrule //tensorflow/python/keras/api:keras_python_api_gen_compat_v2 failed (Exit 1) INFO: Elapsed time: 12747.266s, Critical Path: 300.17s INFO: 16213 processes: 16213 local. FAILED: Build did NOT complete successfully ➜ tensorflow git:(master) ✗ ```
1.0
Will TF2.0 build with Python3 ONLY, without Python2? - **System information** - OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 19.10 - TensorFlow installed from (source or binary): source - TensorFlow version: 2.0 - Python version: 3.7.5 - Installed using virtualenv? pip? conda?: pip - Bazel version (if compiling from source):1.1.0 - GCC/Compiler version (if compiling from source): 9.2.1 - CUDA/cuDNN version:10.2/7.6.5 - GPU model and memory: GeForce GTX 980M/4035MiB ```console ERROR: ~/Downloads/....../tensorflow/python/keras/api/BUILD:129:1: Executing genrule //tensorflow/python/keras/api:keras_python_api_gen_compat_v2 failed (Exit 1) Traceback (most recent call last): File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 771, in <module> main() File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 767, in main lazy_loading, args.use_relative_imports) File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 625, in create_api_files api_version, compat_api_versions, lazy_loading, use_relative_imports) File 
"~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/tools/api/generator/create_python_api.py", line 502, in get_api_init_text _, attr = tf_decorator.unwrap(attr) File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/tf_decorator.py", line 219, in unwrap elif _has_tf_decorator_attr(cur): File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/tf_decorator.py", line 124, in _has_tf_decorator_attr hasattr(obj, '_tf_decorator') and File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/lazy_loader.py", line 62, in __getattr__ module = self._load() File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/python/util/lazy_loader.py", line 45, in _load module = importlib.import_module(self.__name__) File "/usr/lib/python3.7/importlib/__init__.py", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File "<frozen importlib._bootstrap>", line 1006, in _gcd_import File "<frozen importlib._bootstrap>", line 983, in _find_and_load File "<frozen importlib._bootstrap>", line 967, in 
_find_and_load_unlocked File "<frozen importlib._bootstrap>", line 677, in _load_unlocked File "<frozen importlib._bootstrap_external>", line 728, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/compiler/tf2tensorrt/wrap_py_utils.py", line 28, in <module> _wrap_py_utils = swig_import_helper() File "~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/compiler/tf2tensorrt/wrap_py_utils.py", line 24, in swig_import_helper _mod = imp.load_module('_wrap_py_utils', fp, pathname, description) File "/usr/lib/python3.7/imp.py", line 242, in load_module return load_dynamic(name, filename, file) File "/usr/lib/python3.7/imp.py", line 342, in load_dynamic return _load(spec) File "<frozen importlib._bootstrap>", line 696, in _load File "<frozen importlib._bootstrap>", line 670, in _load_unlocked File "<frozen importlib._bootstrap>", line 583, in module_from_spec File "<frozen importlib._bootstrap_external>", line 1043, in create_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed ImportError: ~/.cache/bazel/_bazel_longervision/90066176d51f3058b5ce7c4e1b3a40d7/execroot/org_tensorflow/bazel-out/host/bin/tensorflow/python/keras/api/create_tensorflow.python_api_2_keras_python_api_gen_compat_v2.runfiles/org_tensorflow/tensorflow/compiler/tf2tensorrt/_wrap_py_utils.so: undefined symbol: _ZN15stream_executor14StreamExecutor18EnablePeerAccessToEPS0_ ---------------- Note: The failure of target //tensorflow/python/keras/api:create_tensorflow.python_api_2_keras_python_api_gen_compat_v2 (with 
exit code 1) may have been caused by the fact that it is a Python 2 program that was built in the host configuration, which uses Python 3. You can change the host configuration (for the entire build) to instead use Python 2 by setting --host_force_python=PY2. If this error started occurring in Bazel 0.27 and later, it may be because the Python toolchain now enforces that targets analyzed as PY2 and PY3 run under a Python 2 and Python 3 interpreter, respectively. See https://github.com/bazelbuild/bazel/issues/7899 for more information. ---------------- Target //tensorflow/tools/pip_package:build_pip_package failed to build Use --verbose_failures to see the command lines of failed build steps. ERROR: ~/....../tensorflow/tools/pip_package/BUILD:40:1 Executing genrule //tensorflow/python/keras/api:keras_python_api_gen_compat_v2 failed (Exit 1) INFO: Elapsed time: 12747.266s, Critical Path: 300.17s INFO: 16213 processes: 16213 local. FAILED: Build did NOT complete successfully ➜ tensorflow git:(master) ✗ ```
non_process
will build with only without system information os platform and distribution e g linux ubuntu ubuntu tensorflow installed from source or binary source tensorflow version python version installed using virtualenv pip conda pip bazel version if compiling from source gcc compiler version if compiling from source cuda cudnn version gpu model and memory geforce gtx console error downloads tensorflow python keras api build executing genrule tensorflow python keras api keras python api gen compat failed exit traceback most recent call last file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python tools api generator create python api py line in main file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python tools api generator create python api py line in main lazy loading args use relative imports file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python tools api generator create python api py line in create api files api version compat api versions lazy loading use relative imports file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python tools api generator create python api py line in get api init text attr tf decorator unwrap attr file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python util tf decorator py line in unwrap elif has tf 
decorator attr cur file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python util tf decorator py line in has tf decorator attr hasattr obj tf decorator and file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python util lazy loader py line in getattr module self load file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow python util lazy loader py line in load module importlib import module self name file usr lib importlib init py line in import module return bootstrap gcd import name package level file line in gcd import file line in find and load file line in find and load unlocked file line in load unlocked file line in exec module file line in call with frames removed file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow compiler wrap py utils py line in wrap py utils swig import helper file cache bazel bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow compiler wrap py utils py line in swig import helper mod imp load module wrap py utils fp pathname description file usr lib imp py line in load module return load dynamic name filename file file usr lib imp py line in load dynamic return load spec file line in load file line in load unlocked file line in module from spec file line in create module file line in call with frames removed importerror cache bazel 
bazel longervision execroot org tensorflow bazel out host bin tensorflow python keras api create tensorflow python api keras python api gen compat runfiles org tensorflow tensorflow compiler wrap py utils so undefined symbol note the failure of target tensorflow python keras api create tensorflow python api keras python api gen compat with exit code may have been caused by the fact that it is a python program that was built in the host configuration which uses python you can change the host configuration for the entire build to instead use python by setting host force python if this error started occurring in bazel and later it may be because the python toolchain now enforces that targets analyzed as and run under a python and python interpreter respectively see for more information target tensorflow tools pip package build pip package failed to build use verbose failures to see the command lines of failed build steps error tensorflow tools pip package build executing genrule tensorflow python keras api keras python api gen compat failed exit info elapsed time critical path info processes local failed build did not complete successfully ➜ tensorflow git master ✗
0
19,664
26,026,810,403
IssuesEvent
2022-12-21 17:02:43
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
set variables
doc-enhancement devops/prod Pri1 devops-cicd-process/tech
Hi I struggled with this documentation because i always tried to set the variable and read the variable in the one task! I was testing... It wasn't clear to me that you set the variable in one task and read it in another task, maybe its clear to other but it wasn't to me :-) Hope this helps --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 92ad0d9b-2e51-de2e-a529-6cbe55692023 * Version Independent ID: 609b6196-cc6b-677a-c76f-f82bb7cce10a * Content: [Set variables in scripts - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/set-variables-scripts?view=azure-devops&tabs=bash) * Content Source: [docs/pipelines/process/set-variables-scripts.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/set-variables-scripts.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
set variables - Hi I struggled with this documentation because i always tried to set the variable and read the variable in the one task! I was testing... It wasn't clear to me that you set the variable in one task and read it in another task, maybe its clear to other but it wasn't to me :-) Hope this helps --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 92ad0d9b-2e51-de2e-a529-6cbe55692023 * Version Independent ID: 609b6196-cc6b-677a-c76f-f82bb7cce10a * Content: [Set variables in scripts - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/set-variables-scripts?view=azure-devops&tabs=bash) * Content Source: [docs/pipelines/process/set-variables-scripts.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/set-variables-scripts.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
set variables hi i struggled with this documentation because i always tried to set the variable and read the variable in the one task i was testing it wasn t clear to me that you set the variable in one task and read it in another task maybe its clear to other but it wasn t to me hope this helps document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
21,285
28,479,590,308
IssuesEvent
2023-04-18 00:42:59
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
[MLv2] We need to validate that references you're adding to a query are actually valid for that stage
.Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
See also: #30083 Right now pMBQL is pretty forgiving and lets you add almost any sort of reference to a query as long as the type information is correct; You can do something like ```clj {:lib/type :mbql/query :stages [{:source-table 1, :filter [:= {...} [:field {..., :join-alias "X"} 2] 3]}]} ``` even tho there is no join `X`, or use a `[:field {...} <string-name>]` reference in the first stage of a query, or use `[:aggregation {...} 0]` in a query with no aggregations, etc. We should disallow as much of these things as we can in the stage schemas themselves. Some more advanced validation will require a metadata provider, but I suppose we should be able to do those checks against the query as a whole if it has `:lib/metadata`, right? If it's not feasible to do these checks in the actual schemas then maybe we can do them in some other way. Maybe some sort of general `(valid-reference-for-stage? query stage-number reference)` function would make sense for implementing this. TBD
1.0
[MLv2] We need to validate that references you're adding to a query are actually valid for that stage - See also: #30083 Right now pMBQL is pretty forgiving and lets you add almost any sort of reference to a query as long as the type information is correct; You can do something like ```clj {:lib/type :mbql/query :stages [{:source-table 1, :filter [:= {...} [:field {..., :join-alias "X"} 2] 3]}]} ``` even tho there is no join `X`, or use a `[:field {...} <string-name>]` reference in the first stage of a query, or use `[:aggregation {...} 0]` in a query with no aggregations, etc. We should disallow as much of these things as we can in the stage schemas themselves. Some more advanced validation will require a metadata provider, but I suppose we should be able to do those checks against the query as a whole if it has `:lib/metadata`, right? If it's not feasible to do these checks in the actual schemas then maybe we can do them in some other way. Maybe some sort of general `(valid-reference-for-stage? query stage-number reference)` function would make sense for implementing this. TBD
process
we need to validate that references you re adding to a query are actually valid for that stage see also right now pmbql is pretty forgiving and lets you add almost any sort of reference to a query as long as the type information is correct you can do something like clj lib type mbql query stages even tho there is no join x or use a reference in the first stage of a query or use in a query with no aggregations etc we should disallow as much of these things as we can in the stage schemas themselves some more advanced validation will require a metadata provider but i suppose we should be able to do those checks against the query as a whole if it has lib metadata right if it s not feasible to do these checks in the actual schemas then maybe we can do them in some other way maybe some sort of general valid reference for stage query stage number reference function would make sense for implementing this tbd
1
17,664
23,487,194,466
IssuesEvent
2022-08-17 15:17:34
MPMG-DCC-UFMG/C01
https://api.github.com/repos/MPMG-DCC-UFMG/C01
closed
Adicionar opções de navegadores
[2] Alta Prioridade [1] Requisito [0] Desenvolvimento [3] Processamento Dinâmico
## Comportamento Esperado Desejamos que seja possível escolher o navegador do webdriver na criação do coletor, mantendo, como default, o Chromium. ## Comportamento Atual Atualmente, temos apenas o Chromium como navegador. Usando o Playwright no Processamento Dinâmico, podemos adicionar outras opções. Apesar de ser uma demanda muito específica, uma opção como essa poderia resolver blockers relacionados a `user agents`, conforme documentado na #806. ## Passos para reproduzir o erro Não se aplica. ## Especificações da Coleta Não se aplica
1.0
Adicionar opções de navegadores - ## Comportamento Esperado Desejamos que seja possível escolher o navegador do webdriver na criação do coletor, mantendo, como default, o Chromium. ## Comportamento Atual Atualmente, temos apenas o Chromium como navegador. Usando o Playwright no Processamento Dinâmico, podemos adicionar outras opções. Apesar de ser uma demanda muito específica, uma opção como essa poderia resolver blockers relacionados a `user agents`, conforme documentado na #806. ## Passos para reproduzir o erro Não se aplica. ## Especificações da Coleta Não se aplica
process
adicionar opções de navegadores comportamento esperado desejamos que seja possível escolher o navegador do webdriver na criação do coletor mantendo como default o chromium comportamento atual atualmente temos apenas o chromium como navegador usando o playwright no processamento dinâmico podemos adicionar outras opções apesar de ser uma demanda muito específica uma opção como essa poderia resolver blockers relacionados a user agents conforme documentado na passos para reproduzir o erro não se aplica especificações da coleta não se aplica
1