Dataset columns, in the order the sample rows below follow:

| Column | Dtype | Summary |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 7 to 112 |
| repo_url | stringlengths | 36 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 to 744 |
| labels | stringlengths | 4 to 574 |
| body | stringlengths | 9 to 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 to 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 188k |
| binary_label | int64 | 0 to 1 |
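A quick way to sanity-check a local export against this schema is a few lines of pandas. This is a minimal sketch under stated assumptions: `github_issues.csv` is a placeholder file name, and the expected values in the comments come from the column summary above and the sample rows below.

```python
import pandas as pd

# "github_issues.csv" is a hypothetical name for a local export of this dataset.
df = pd.read_csv("github_issues.csv")

print(df.dtypes)                   # 15 columns, per the schema table above
print(df["type"].unique())         # expected: ['IssuesEvent'] (1 value)
print(df["action"].unique())       # expected: 'opened', 'closed', 'reopened'
print(df["label"].value_counts())  # the 2 classes: 'process' / 'non_process'
```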
Unnamed: 0: 163,100 · id: 12,703,900,301 · type: IssuesEvent · created_at: 2020-06-22 23:38:57
repo: M0nica/ambition-fund-website · repo_url: https://api.github.com/repos/M0nica/ambition-fund-website
action: closed · title: [Increase Test Coverage] Sign Up Form
labels: test-coverage
- jest/react-testing-library unit tests should be added to the signup component to confirm that the form fields appear and the form is functional. The request/request URL for the form service should be mocked.
index: 1.0
[Increase Test Coverage] Sign Up Form - - jest/react-testing-library unit tests should be added to the signup component to confirm that the form fields appear and the form is functional. The request/request URL for the form service should be mocked.
label: non_process
sign up form jest react testing library unit tests should be added to the signup component to confirm that the form fields appear and the form is functional the request request url for the form service should be mocked
binary_label: 0
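Comparing `text_combine` and `text` in the row above shows the cleaning step between them: lowercasing plus removal of URLs, markup, digits, and punctuation, with whitespace collapsed. The dataset's actual preprocessing code is not included in this dump, so the regex pipeline below is only a plausible reconstruction; it does not reproduce every detail (the real `text` keeps some non-ASCII letters and, in this row, also drops the bracketed title prefix).

```python
import re

def clean_text(text_combine: str) -> str:
    """Approximate the `text` column from `text_combine`.

    Reconstructed from the visible examples, not taken from the
    dataset's own preprocessing code.
    """
    s = text_combine.lower()
    s = re.sub(r"https?://\S+", " ", s)    # URLs vanish in `text`
    s = re.sub(r"<[^>]+>", " ", s)         # HTML tags
    s = re.sub(r"[^a-z\s]", " ", s)        # digits and punctuation
    return re.sub(r"\s+", " ", s).strip()  # collapse runs of whitespace

# "Doesn't Work" -> "doesn t work", matching the tokenization seen in `text`.
print(clean_text("Auto Close Functionality Doesn't Work"))
```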
Unnamed: 0: 12,283 · id: 5,183,379,350 · type: IssuesEvent · created_at: 2017-01-20 00:36:20
repo: dotnet/coreclr · repo_url: https://api.github.com/repos/dotnet/coreclr
action: closed · title: Win Arm64 and Arm32 crossgen fails due to JIT asserts
labels: area-CodeGen blocking-pipeline-build
https://devdiv.visualstudio.com/DefaultCollection/DevDiv/_build?_a=summary&buildId=525330&tab=details (arm) ``` Assert failure(PID 6352 [0x000018d0], Thread: 7604 [0x1db4]): Assertion failed '(block->bbFlags & BBF_FINALLY_TARGET) != 0' in 'System.Reflection.LoaderAllocatorScout:Finalize():this' (IL size 63) File: e:\a\_work\35\s\src\jit\flowgraph.cpp Line: 11661 Image: E:\A\_work\35\s\bin\Product\Windows_NT.arm.Checked\x86\crossgen.exe ``` https://devdiv.visualstudio.com/DefaultCollection/DevDiv/_build?_a=summary&buildId=525329&tab=details (arm64) ``` Assert failure(PID 780 [0x0000030c], Thread: 5808 [0x16b0]): Assertion failed '(lastConsumedNode == nullptr) || (node->gtUseNum == -1) || (node->gtUseNum > lastConsumedNode->gtUseNum)' in 'System.Globalization.HebrewCalendar:GetDatePart(long,int):int:this' (IL size 467) File: e:\a\_work\12\s\src\jit\codegenlinear.cpp Line: 1101 Image: E:\A\_work\12\s\bin\Product\Windows_NT.arm64.Checked\x64\crossgen.exe ``` Also reported in today's (1/18) build - https://devdiv.visualstudio.com/DevDiv/_build?buildId=526476.
index: 1.0
Win Arm64 and Arm32 crossgen fails due to JIT asserts - https://devdiv.visualstudio.com/DefaultCollection/DevDiv/_build?_a=summary&buildId=525330&tab=details (arm)   ``` Assert failure(PID 6352 [0x000018d0], Thread: 7604 [0x1db4]): Assertion failed '(block->bbFlags & BBF_FINALLY_TARGET) != 0' in 'System.Reflection.LoaderAllocatorScout:Finalize():this' (IL size 63)       File: e:\a\_work\35\s\src\jit\flowgraph.cpp Line: 11661     Image: E:\A\_work\35\s\bin\Product\Windows_NT.arm.Checked\x86\crossgen.exe ```   https://devdiv.visualstudio.com/DefaultCollection/DevDiv/_build?_a=summary&buildId=525329&tab=details (arm64)   ``` Assert failure(PID 780 [0x0000030c], Thread: 5808 [0x16b0]): Assertion failed '(lastConsumedNode == nullptr) || (node->gtUseNum == -1) || (node->gtUseNum > lastConsumedNode->gtUseNum)' in 'System.Globalization.HebrewCalendar:GetDatePart(long,int):int:this' (IL size 467)       File: e:\a\_work\12\s\src\jit\codegenlinear.cpp Line: 1101     Image: E:\A\_work\12\s\bin\Product\Windows_NT.arm64.Checked\x64\crossgen.exe ``` Also reported in today's (1/18) build - https://devdiv.visualstudio.com/DevDiv/_build?buildId=526476.
label: non_process
win and crossgen fails due to jit asserts arm   assert failure pid thread assertion failed block bbflags bbf finally target in system reflection loaderallocatorscout finalize this il size       file e a work s src jit flowgraph cpp line     image e a work s bin product windows nt arm checked crossgen exe     assert failure pid thread assertion failed lastconsumednode nullptr node gtusenum node gtusenum lastconsumednode gtusenum in system globalization hebrewcalendar getdatepart long int int this il size       file e a work s src jit codegenlinear cpp line     image e a work s bin product windows nt checked crossgen exe also reported in today s build
binary_label: 0
Unnamed: 0: 332,548 · id: 10,097,781,883 · type: IssuesEvent · created_at: 2019-07-28 09:25:52
repo: arya107/AskArya-Node.js-Vue.js · repo_url: https://api.github.com/repos/arya107/AskArya-Node.js-Vue.js
action: closed · title: Auto Close Functionality Doesn't Work Work on Dashboard Navbar
labels: Priority Fix Required
When a link is clicked, the page changes but the Navbar doesn't close like the Frontend Navbar. ![Screenshot (218)](https://user-images.githubusercontent.com/28385499/61574585-fdd79c80-aad6-11e9-8c74-8c909fdf80bb.png)
index: 1.0
Auto Close Functionality Doesn't Work Work on Dashboard Navbar - When a link is clicked, the page changes but the Navbar doesn't close like the Frontend Navbar. ![Screenshot (218)](https://user-images.githubusercontent.com/28385499/61574585-fdd79c80-aad6-11e9-8c74-8c909fdf80bb.png)
label: non_process
auto close functionality doesn t work work on dashboard navbar when a link is clicked the page changes but the navbar doesn t close like the frontend navbar
binary_label: 0
Unnamed: 0: 204 · id: 2,612,696,005 · type: IssuesEvent · created_at: 2015-02-27 16:09:36
repo: Graylog2/graylog2-server · repo_url: https://api.github.com/repos/Graylog2/graylog2-server
action: closed · title: Correlation of messages
labels: processing
Right now graylog2 has the streams where you could define rules to match messages, the thing is that you need to know the specifics of what you are looking for. There's on feature that I think would make graylog2 really powerful, the ability to correlate messages, example: Alert01: If field "user" has the same value over a period of N time and field "event=300" give me an alert. Alert02: If field "host" has the same value over a period of N time and field "event=500" give me an alert. Alert03: If field "host" has the same value and "event=150" is followed by "event=250" over a period of N time alert me. The idea is that you could do behavior searches and alerts and not have only alerts based on discrete values. I've used this feature extensively in splunk, the alert would send the messages that triggered the rule. I think that's the main missing piece to bridge the gap between them.
index: 1.0
Correlation of messages - Right now graylog2 has the streams where you could define rules to match messages, the thing is that you need to know the specifics of what you are looking for. There's on feature that I think would make graylog2 really powerful, the ability to correlate messages, example: Alert01: If field "user" has the same value over a period of N time and field "event=300" give me an alert. Alert02: If field "host" has the same value over a period of N time and field "event=500" give me an alert. Alert03: If field "host" has the same value and "event=150" is followed by "event=250" over a period of N time alert me. The idea is that you could do behavior searches and alerts and not have only alerts based on discrete values. I've used this feature extensively in splunk, the alert would send the messages that triggered the rule. I think that's the main missing piece to bridge the gap between them.
label: process
correlation of messages right now has the streams where you could define rules to match messages the thing is that you need to know the specifics of what you are looking for there s on feature that i think would make really powerful the ability to correlate messages example if field user has the same value over a period of n time and field event give me an alert if field host has the same value over a period of n time and field event give me an alert if field host has the same value and event is followed by event over a period of n time alert me the idea is that you could do behavior searches and alerts and not have only alerts based on discrete values i ve used this feature extensively in splunk the alert would send the messages that triggered the rule i think that s the main missing piece to bridge the gap between them
binary_label: 1
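This is the first row labeled `process`, and across every sample here `label` and `binary_label` move together: `process` pairs with 1 and `non_process` with 0. Assuming the `df` from the loading sketch above, that mapping can be checked directly:

```python
# Mapping observed in every sample row; assumed to hold dataset-wide.
LABEL_TO_BINARY = {"non_process": 0, "process": 1}

recomputed = df["label"].map(LABEL_TO_BINARY)
assert (recomputed == df["binary_label"]).all()
```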
Unnamed: 0: 7,853 · id: 11,027,681,369 · type: IssuesEvent · created_at: 2019-12-06 09:59:02
repo: qgis/QGIS · repo_url: https://api.github.com/repos/qgis/QGIS
action: closed · title: race condition affecting the TilesXYZ MBTiles export plugin
labels: Bug Processing
[i already reported the bug to original author (https://github.com/lutraconsulting/qgis-xyz-tiles/issues/30), but because the plugin is in the meanwhile a default component of QGIS and the issue could perhaps familiar to the developers of the main application, i try to report it here as well.] your TilesXYZ MBTiles export plugin (`qgis/python/plugins/processing/algs/qgis/TilesXYZ.py`) included in QGIS 3.10 for linux seems to be plagued by race conditions when it becomes executed in multiple threads. the issue doesn't appear in case of the DirectoryWriter, but the MBTilesWriter will hardly work at all and produce corrupt files with lots of missing tiles and other defects... as a symptom of this particular issue, you should often see error messages like this: ERROR 5: /tmp/test7.mbtiles: Access window out of range in RasterIO(). Requested (-569088,-368384) of size 256x256 on raster of 512x512. the strange negative numbers are a side effect of unexpected changed values of the `self._first_tile` variable resp. colliding zoom factors which are running in parallel and overwrite the expected state one another... i did try to fix the issue by replacing `self._first_tile` and `self._zoom_ds` with dictionaries indexed by the zoom factor to reduce the observable collisions. this had some positive effect, but it doesn't completely fix the issues. :( unfortunately i do not know, how to fix this kind of thread lock requirements in case of QGIS plugins in a more adequate manner. the versions of my affected debian linux installation: ``` QGIS-Version: 3.10.0-A Coruña QGIS-Codeversion: 6c816b4204 Qt-Version: 5.11.3 GDAL-Version: 2.4.2 GEOS-Version: 3.8.0-CAPI-1.13.1 PROJ-Version: Rel. 5.2.0, September 15th, 2018 Verarbeite Algorithmus… Algorithmus XYZ-Kacheln erzeugen (MBTiles) startet… Eingabeparameter: { 'BACKGROUND_COLOR' : QColor(0, 0, 0, 0), 'DPI' : 96, 'EXTENT' : '1716053.858612443,1721455.9642655044,5951485.136729757,5955838.399322829 [EPSG:3857]', 'METATILESIZE' : 4, 'OUTPUT_FILE' : '/tmp/test7.mbtiles', 'QUALITY' : 75, 'TILE_FORMAT' : 0, 'ZOOM_MAX' : 16, 'ZOOM_MIN' : 12 } Using 8 CPU Threads: Pushing all tiles at once: 11 tiles. Ausführung nach 2.34 Sekunden abgeschlossen Ergebnisse: {'OUTPUT_FILE': '/tmp/test7.mbtiles'} ```
index: 1.0
race condition affecting the TilesXYZ MBTiles export plugin - [i already reported the bug to original author (https://github.com/lutraconsulting/qgis-xyz-tiles/issues/30), but because the plugin is in the meanwhile a default component of QGIS and the issue could perhaps familiar to the developers of the main application, i try to report it here as well.] your TilesXYZ MBTiles export plugin (`qgis/python/plugins/processing/algs/qgis/TilesXYZ.py`) included in QGIS 3.10 for linux seems to be plagued by race conditions when it becomes executed in multiple threads. the issue doesn't appear in case of the DirectoryWriter, but the MBTilesWriter will hardly work at all and produce corrupt files with lots of missing tiles and other defects... as a symptom of this particular issue, you should often see error messages like this: ERROR 5: /tmp/test7.mbtiles: Access window out of range in RasterIO(). Requested (-569088,-368384) of size 256x256 on raster of 512x512. the strange negative numbers are a side effect of unexpected changed values of the `self._first_tile` variable resp. colliding zoom factors which are running in parallel and overwrite the expected state one another... i did try to fix the issue by replacing `self._first_tile` and `self._zoom_ds` with dictionaries indexed by the zoom factor to reduce the observable collisions. this had some positive effect, but it doesn't completely fix the issues. :( unfortunately i do not know, how to fix this kind of thread lock requirements in case of QGIS plugins in a more adequate manner. the versions of my affected debian linux installation: ``` QGIS-Version: 3.10.0-A Coruña QGIS-Codeversion: 6c816b4204 Qt-Version: 5.11.3 GDAL-Version: 2.4.2 GEOS-Version: 3.8.0-CAPI-1.13.1 PROJ-Version: Rel. 5.2.0, September 15th, 2018 Verarbeite Algorithmus… Algorithmus XYZ-Kacheln erzeugen (MBTiles) startet… Eingabeparameter: { 'BACKGROUND_COLOR' : QColor(0, 0, 0, 0), 'DPI' : 96, 'EXTENT' : '1716053.858612443,1721455.9642655044,5951485.136729757,5955838.399322829 [EPSG:3857]', 'METATILESIZE' : 4, 'OUTPUT_FILE' : '/tmp/test7.mbtiles', 'QUALITY' : 75, 'TILE_FORMAT' : 0, 'ZOOM_MAX' : 16, 'ZOOM_MIN' : 12 } Using 8 CPU Threads: Pushing all tiles at once: 11 tiles. Ausführung nach 2.34 Sekunden abgeschlossen Ergebnisse: {'OUTPUT_FILE': '/tmp/test7.mbtiles'} ```
label: process
race condition affecting the tilesxyz mbtiles export plugin your tilesxyz mbtiles export plugin qgis python plugins processing algs qgis tilesxyz py included in qgis for linux seems to be plagued by race conditions when it becomes executed in multiple threads the issue doesn t appear in case of the directorywriter but the mbtileswriter will hardly work at all and produce corrupt files with lots of missing tiles and other defects as a symptom of this particular issue you should often see error messages like this error tmp mbtiles access window out of range in rasterio requested of size on raster of the strange negative numbers are a side effect of unexpected changed values of the self first tile variable resp colliding zoom factors which are running in parallel and overwrite the expected state one another i did try to fix the issue by replacing self first tile and self zoom ds with dictionaries indexed by the zoom factor to reduce the observable collisions this had some positive effect but it doesn t completely fix the issues unfortunately i do not know how to fix this kind of thread lock requirements in case of qgis plugins in a more adequate manner the versions of my affected debian linux installation qgis version a coruña qgis codeversion qt version gdal version geos version capi proj version rel september verarbeite algorithmus… algorithmus xyz kacheln erzeugen mbtiles startet… eingabeparameter background color qcolor dpi extent metatilesize output file tmp mbtiles quality tile format zoom max zoom min using cpu threads pushing all tiles at once tiles ausführung nach sekunden abgeschlossen ergebnisse output file tmp mbtiles
binary_label: 1
Unnamed: 0: 24,952 · id: 6,608,877,143 · type: IssuesEvent · created_at: 2017-09-19 12:46:07
repo: joomla/joomla-cms · repo_url: https://api.github.com/repos/joomla/joomla-cms
action: closed · title: Duplicate article when change category on frontend with custom field
labels: No Code Attached Yet
### Steps to reproduce the issue 1. Edit any article on frontend 2. Make sure that article has custom field 3. Change category 4. Hit save 5. Go to articles management on backend and see the result ### Expected result - Article Category is changed ### Actual result - Article is duplicated with new category ### System information (as much as possible) - Tested with joomla 3.7.5, 3.8 rc - PHP 5.6 - MariaDB 10.2.3 ### Additional comments Video: https://www.youtube.com/watch?v=YM8McupJBow&feature=youtu.be
index: 1.0
Duplicate article when change category on frontend with custom field - ### Steps to reproduce the issue 1. Edit any article on frontend 2. Make sure that article has custom field 3. Change category 4. Hit save 5. Go to articles management on backend and see the result ### Expected result - Article Category is changed ### Actual result - Article is duplicated with new category ### System information (as much as possible) - Tested with joomla 3.7.5, 3.8 rc - PHP 5.6 - MariaDB 10.2.3 ### Additional comments Video: https://www.youtube.com/watch?v=YM8McupJBow&feature=youtu.be
label: non_process
duplicate article when change category on frontend with custom field steps to reproduce the issue edit any article on frontend make sure that article has custom field change category hit save go to articles management on backend and see the result expected result article category is changed actual result article is duplicated with new category system information as much as possible tested with joomla rc php mariadb additional comments video
binary_label: 0
Unnamed: 0: 57,883 · id: 8,212,567,639 · type: IssuesEvent · created_at: 2018-09-04 16:43:30
repo: mrdoob/three.js · repo_url: https://api.github.com/repos/mrdoob/three.js
action: closed · title: Documentation: Broken link in InstancedBufferGeometry doc page
labels: Bug Documentation
##### Description of the problem The bottom of the doc page of [InstancedBufferGeometry](https://threejs.org/docs/index.html#api/en/core/InstancedBufferGeometry) has a link to the source [src/en/core/InstancedBufferGeometry.js](https://github.com/mrdoob/three.js/blob/master/src/en/core/InstancedBufferGeometry.js) which is broken. ##### Three.js version - [x] r96 ##### Browser - [x] Opera - [x] Chrome - [x] and maybe all others ##### OS - [x] All of them ![untitled](https://user-images.githubusercontent.com/20371270/44948837-b9df9880-ae2d-11e8-9368-50cb98e9345d.png) »(◕‸◕)«
index: 1.0
Documentation: Broken link in InstancedBufferGeometry doc page - ##### Description of the problem The bottom of the doc page of [InstancedBufferGeometry](https://threejs.org/docs/index.html#api/en/core/InstancedBufferGeometry) has a link to the source [src/en/core/InstancedBufferGeometry.js](https://github.com/mrdoob/three.js/blob/master/src/en/core/InstancedBufferGeometry.js) which is broken. ##### Three.js version - [x] r96 ##### Browser - [x] Opera - [x] Chrome - [x] and maybe all others ##### OS - [x] All of them ![untitled](https://user-images.githubusercontent.com/20371270/44948837-b9df9880-ae2d-11e8-9368-50cb98e9345d.png) »(◕‸◕)«
label: non_process
documentation broken link in instancedbuffergeometry doc page description of the problem the bottom of the doc page of has a link to the source which is broken three js version browser opera chrome and maybe all others os all of them » ◕‸◕ «
binary_label: 0
Unnamed: 0: 49,579 · id: 13,454,372,727 · type: IssuesEvent · created_at: 2020-09-09 03:31:37
repo: ErezDasa/RB2 · repo_url: https://api.github.com/repos/ErezDasa/RB2
action: opened · title: CVE-2020-14060 (High) detected in jackson-databind-2.9.9.jar
labels: security vulnerability
## CVE-2020-14060 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.9.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /tmp/ws-scm/RB2/infra_github/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.9/jackson-databind-2.9.9.jar</p> <p> Dependency Hierarchy: - spring-cloud-starter-config-2.0.3.RELEASE.jar (Root Library) - :x: **jackson-databind-2.9.9.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ErezDasa/RB2/commit/76c9fcc27c447880699d2960c8688c59736444af">76c9fcc27c447880699d2960c8688c59736444af</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.5 mishandles the interaction between serialization gadgets and typing, related to oadd.org.apache.xalan.lib.sql.JNDIConnectionPool (aka apache/drill). <p>Publish Date: 2020-06-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14060>CVE-2020-14060</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14060">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14060</a></p> <p>Release Date: 2020-06-14</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.10.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
CVE-2020-14060 (High) detected in jackson-databind-2.9.9.jar - ## CVE-2020-14060 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.9.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /tmp/ws-scm/RB2/infra_github/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.9/jackson-databind-2.9.9.jar</p> <p> Dependency Hierarchy: - spring-cloud-starter-config-2.0.3.RELEASE.jar (Root Library) - :x: **jackson-databind-2.9.9.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ErezDasa/RB2/commit/76c9fcc27c447880699d2960c8688c59736444af">76c9fcc27c447880699d2960c8688c59736444af</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.5 mishandles the interaction between serialization gadgets and typing, related to oadd.org.apache.xalan.lib.sql.JNDIConnectionPool (aka apache/drill). <p>Publish Date: 2020-06-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14060>CVE-2020-14060</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14060">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14060</a></p> <p>Release Date: 2020-06-14</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.10.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws scm infra github pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring cloud starter config release jar root library x jackson databind jar vulnerable library found in head commit a href vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to oadd org apache xalan lib sql jndiconnectionpool aka apache drill publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with whitesource
binary_label: 0
Unnamed: 0: 35,605 · id: 7,787,795,432 · type: IssuesEvent · created_at: 2018-06-07 00:29:48
repo: google/sanitizers · repo_url: https://api.github.com/repos/google/sanitizers
action: closed · title: Wrong line number in MSan report
labels: Priority-Medium ProjectMemorySanitizer Status-Accepted Type-Defect
Originally reported on Google Code with ID 49 ``` https://code.google.com/p/chromium/issues/detail?id=338382 This chromium bug mentions a report in S32A_Opaque_BlitRow32_SSE2 with wrong line number in the top frame. Please make a reduced test case out of it. ``` Reported by `eugenis@google.com` on 2014-02-20 10:42:54
index: 1.0
Wrong line number in MSan report - Originally reported on Google Code with ID 49 ``` https://code.google.com/p/chromium/issues/detail?id=338382 This chromium bug mentions a report in S32A_Opaque_BlitRow32_SSE2 with wrong line number in the top frame. Please make a reduced test case out of it. ``` Reported by `eugenis@google.com` on 2014-02-20 10:42:54
label: non_process
wrong line number in msan report originally reported on google code with id this chromium bug mentions a report in opaque with wrong line number in the top frame please make a reduced test case out of it reported by eugenis google com on
binary_label: 0
Unnamed: 0: 5,336 · id: 8,154,988,163 · type: IssuesEvent · created_at: 2018-08-23 06:29:47
repo: zeebe-io/zeebe · repo_url: https://api.github.com/repos/zeebe-io/zeebe
action: opened · title: Concurrent reprocessing of WorkflowInstanceStreamProcessor and DeploymentProcessor is broken
labels: broker bug stream processor
## Scenario * deploy workflow * start workflow instance * stop broker * purge snapshots * start broker * reprocess is happening in parallel * reprocess workflow instance events * reprocess deployment events ## Problem Since deployment are not yet deployed again (not added to the WorkflowRepository) workflow reprocessing fails. Workflow reprocessing tries to fetch workflow, which returns `null`. ## Note Workflow repo api returns nothing/null if workflow is not deployed/added. Current code in BpmnStepProcessor does not expect that and continues without checking returned workflow. ## Possible solution We could integrate the deployment processor into the workflow instance processor. This would mean that the deployment events are read before and added to the cache and then the workflow events are reprocessed successfully.
index: 1.0
Concurrent reprocessing of WorkflowInstanceStreamProcessor and DeploymentProcessor is broken - ## Scenario * deploy workflow * start workflow instance * stop broker * purge snapshots * start broker * reprocess is happening in parallel * reprocess workflow instance events * reprocess deployment events ## Problem Since deployment are not yet deployed again (not added to the WorkflowRepository) workflow reprocessing fails. Workflow reprocessing tries to fetch workflow, which returns `null`. ## Note Workflow repo api returns nothing/null if workflow is not deployed/added. Current code in BpmnStepProcessor does not expect that and continues without checking returned workflow. ## Possible solution We could integrate the deployment processor into the workflow instance processor. This would mean that the deployment events are read before and added to the cache and then the workflow events are reprocessed successfully.
label: process
concurrent reprocessing of workflowinstancestreamprocessor and deploymentprocessor is broken scenario deploy workflow start workflow instance stop broker purge snapshots start broker reprocess is happening in parallel reprocess workflow instance events reprocess deployment events problem since deployment are not yet deployed again not added to the workflowrepository workflow reprocessing fails workflow reprocessing tries to fetch workflow which returns null note workflow repo api returns nothing null if workflow is not deployed added current code in bpmnstepprocessor does not expect that and continues without checking returned workflow possible solution we could integrate the deployment processor into the workflow instance processor this would mean that the deployment events are read before and added to the cache and then the workflow events are reprocessed successfully
binary_label: 1
Unnamed: 0: 18,860 · id: 24,781,222,765 · type: IssuesEvent · created_at: 2022-10-24 05:17:41
repo: prisma/prisma · repo_url: https://api.github.com/repos/prisma/prisma
action: closed · title: Error in migration engine: We should only be setting a changed default if there was one on the previous schema and in the next with the same enum.
labels: kind/bug process/candidate tech/engines/datamodel topic: error reporting team/schema topic: postgresql
<!-- If required, please update the title to be clear and descriptive --> Command: `prisma migrate dev` Version: `4.5.0` Binary Version: `0362da9eebca54d94c8ef5edd3b2e90af99ba452` Report: https://prisma-errors.netlify.app/report/14387 OS: `arm64 darwin 21.6.0` Rust Stacktrace: ``` Starting migration engine RPC server Analysis run in 26ms [migration-engine/connectors/sql-migration-connector/src/sql_renderer/postgres_renderer.rs:1011:22] We should only be setting a changed default if there was one on the previous schema and in the next with the same enum. ``` ## Context This error is happening because we're attempting to unwrap a `None` value into `default_str` via an `expect()` call: https://github.com/prisma/prisma-engines/blob/1efe6d372597cdb48067c5dd1f441702a1b0861e/migration-engine/connectors/sql-migration-connector/src/sql_renderer/postgres_renderer.rs#L1006-L1020 ## Potential Solution I think we should conditionally apply the `SET DEFAULT` expression in the pushed statement only when `default_str` is a `Some(_)`. I could easily do that as a side task.
index: 1.0
Error in migration engine: We should only be setting a changed default if there was one on the previous schema and in the next with the same enum. - <!-- If required, please update the title to be clear and descriptive --> Command: `prisma migrate dev` Version: `4.5.0` Binary Version: `0362da9eebca54d94c8ef5edd3b2e90af99ba452` Report: https://prisma-errors.netlify.app/report/14387 OS: `arm64 darwin 21.6.0` Rust Stacktrace: ``` Starting migration engine RPC server Analysis run in 26ms [migration-engine/connectors/sql-migration-connector/src/sql_renderer/postgres_renderer.rs:1011:22] We should only be setting a changed default if there was one on the previous schema and in the next with the same enum. ``` ## Context This error is happening because we're attempting to unwrap a `None` value into `default_str` via an `expect()` call: https://github.com/prisma/prisma-engines/blob/1efe6d372597cdb48067c5dd1f441702a1b0861e/migration-engine/connectors/sql-migration-connector/src/sql_renderer/postgres_renderer.rs#L1006-L1020 ## Potential Solution I think we should conditionally apply the `SET DEFAULT` expression in the pushed statement only when `default_str` is a `Some(_)`. I could easily do that as a side task.
label: process
error in migration engine we should only be setting a changed default if there was one on the previous schema and in the next with the same enum command prisma migrate dev version binary version report os darwin rust stacktrace starting migration engine rpc server analysis run in we should only be setting a changed default if there was one on the previous schema and in the next with the same enum context this error is happening because we re attempting to unwrap a none  value into default str via an expect call potential solution i think we should conditionally apply the set default expression in the pushed statement only when default str is a some i could easily do that as a side task
binary_label: 1
Unnamed: 0: 610,653 · id: 18,912,957,200 · type: IssuesEvent · created_at: 2021-11-16 15:46:23
repo: mozilla/addons-server · repo_url: https://api.github.com/repos/mozilla/addons-server
action: closed · title: Add an admin for ReviewActionReasonLog
labels: component: reviewer tools priority: p3
A feature request is to be able to change the reason assigned to a review action if the original reason selected was incorrect. Possibly the simplest way to address this is to add an admin for `ReviewActionReasonLog` which will allow a user to edit the `reason`. I did a quick test of this, and it seems to be doable, although we'd need a way for users to be able to search through the log entries to find the one they want.
index: 1.0
Add an admin for ReviewActionReasonLog - A feature request is to be able to change the reason assigned to a review action if the original reason selected was incorrect. Possibly the simplest way to address this is to add an admin for `ReviewActionReasonLog` which will allow a user to edit the `reason`. I did a quick test of this, and it seems to be doable, although we'd need a way for users to be able to search through the log entries to find the one they want.
label: non_process
add an admin for reviewactionreasonlog a feature request is to be able to change the reason assigned to a review action if the original reason selected was incorrect possibly the simplest way to address this is to add an admin for reviewactionreasonlog which will allow a user to edit the reason i did a quick test of this and it seems to be doable although we d need a way for users to be able to search through the log entries to find the one they want
binary_label: 0
Unnamed: 0: 12,184 · id: 14,742,116,400 · type: IssuesEvent · created_at: 2021-01-07 11:43:48
repo: kdjstudios/SABillingGitlab · repo_url: https://api.github.com/repos/kdjstudios/SABillingGitlab
action: closed · title: Add Keith Mayer
labels: anc-process anp-1 ant-support
In GitLab by @tim.traylor on Mar 14, 2019, 09:06 Hi, Can we please add Keith Mayer (keith.mayer@answernet.com) to this Gitlab? Thx
index: 1.0
Add Keith Mayer - In GitLab by @tim.traylor on Mar 14, 2019, 09:06 Hi, Can we please add Keith Mayer (keith.mayer@answernet.com) to this Gitlab? Thx
label: process
add keith mayer in gitlab by tim traylor on mar hi can we please add keith mayer keith mayer answernet com to this gitlab thx
binary_label: 1
Unnamed: 0: 230,607 · id: 25,482,735,099 · type: IssuesEvent · created_at: 2022-11-26 01:21:38
repo: Nivaskumark/kernel_v4.1.15 · repo_url: https://api.github.com/repos/Nivaskumark/kernel_v4.1.15
action: reopened · title: CVE-2017-17855 (High) detected in linuxlinux-4.6
labels: security vulnerability
## CVE-2017-17855 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/Nivaskumark/kernel_v4.1.15/commit/00db4e8795bcbec692fb60b19160bdd763ad42e3">00db4e8795bcbec692fb60b19160bdd763ad42e3</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/bpf/verifier.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/bpf/verifier.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> kernel/bpf/verifier.c in the Linux kernel through 4.14.8 allows local users to cause a denial of service (memory corruption) or possibly have unspecified other impact by leveraging improper use of pointers in place of scalars. <p>Publish Date: 2017-12-27 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-17855>CVE-2017-17855</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17855">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17855</a></p> <p>Release Date: 2017-12-27</p> <p>Fix Resolution: v4.15-rc5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
CVE-2017-17855 (High) detected in linuxlinux-4.6 - ## CVE-2017-17855 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/Nivaskumark/kernel_v4.1.15/commit/00db4e8795bcbec692fb60b19160bdd763ad42e3">00db4e8795bcbec692fb60b19160bdd763ad42e3</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/bpf/verifier.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/bpf/verifier.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> kernel/bpf/verifier.c in the Linux kernel through 4.14.8 allows local users to cause a denial of service (memory corruption) or possibly have unspecified other impact by leveraging improper use of pointers in place of scalars. <p>Publish Date: 2017-12-27 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-17855>CVE-2017-17855</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17855">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17855</a></p> <p>Release Date: 2017-12-27</p> <p>Fix Resolution: v4.15-rc5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
cve high detected in linuxlinux cve high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files kernel bpf verifier c kernel bpf verifier c vulnerability details kernel bpf verifier c in the linux kernel through allows local users to cause a denial of service memory corruption or possibly have unspecified other impact by leveraging improper use of pointers in place of scalars publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
binary_label: 0
Unnamed: 0: 80,799 · id: 3,574,631,841 · type: IssuesEvent · created_at: 2016-01-27 12:48:26
repo: leeensminger/OED_Wetlands · repo_url: https://api.github.com/repos/leeensminger/OED_Wetlands
action: closed · title: Mitigation Summary grid does not display edits.
labels: bug - high priority
After editing and saving the Mitigation Summary panel, edits are not displayed within the grid, and values return to zero. This feature had previously been working as designed. Reproduce: 1. In Projects tab, enable editing. Double click in the Mitigation Summary panel to open the form. 2. Enter a value for MITIGATION REQUIRED and click apply. 3. The edits are not displayed. I cannot confirm if edits are actually being written to the database, or not. Very likely this issue is highly correlated with #41 .
index: 1.0
Mitigation Summary grid does not display edits. - After editing and saving the Mitigation Summary panel, edits are not displayed within the grid, and values return to zero. This feature had previously been working as designed. Reproduce: 1. In Projects tab, enable editing. Double click in the Mitigation Summary panel to open the form. 2. Enter a value for MITIGATION REQUIRED and click apply. 3. The edits are not displayed. I cannot confirm if edits are actually being written to the database, or not. Very likely this issue is highly correlated with #41 .
label: non_process
mitigation summary grid does not display edits after editing and saving the mitigation summary panel edits are not displayed within the grid and values return to zero this feature had previously been working as designed reproduce in projects tab enable editing double click in the mitigation summary panel to open the form enter a value for mitigation required and click apply the edits are not displayed i cannot confirm if edits are actually being written to the database or not very likely this issue is highly correlated with
binary_label: 0
Unnamed: 0: 21,720 · id: 30,221,394,258 · type: IssuesEvent · created_at: 2023-07-05 19:42:50
repo: geneontology/go-ontology · repo_url: https://api.github.com/repos/geneontology/go-ontology
action: closed · title: Obsolete positive regulation by symbiont of host inflammatory response
labels: multi-species process
Single protein annotated, see https://github.com/geneontology/go-ontology/issues/25685
index: 1.0
Obsolete positive regulation by symbiont of host inflammatory response - Single protein annotated, see https://github.com/geneontology/go-ontology/issues/25685
label: process
obsolete positive regulation by symbiont of host inflammatory response single protein annotated see
binary_label: 1
Unnamed: 0: 465,721 · id: 13,390,937,494 · type: IssuesEvent · created_at: 2020-09-02 21:28:45
repo: edgi-govdata-archiving/Environmental-Enforcement-Watch · repo_url: https://api.github.com/repos/edgi-govdata-archiving/Environmental-Enforcement-Watch
action: closed · title: Develop Sunrise notebook
labels: [priority-★★☆]
See: https://github.com/edgi-govdata-archiving/EEW_Planning/issues/169 To do: - [x] Create new repo: https://github.com/edgi-govdata-archiving/ECHO-Sunrise - [x] Start this from existing Cross Program notebook - [x] Constrain to MA - [ ] Test relevant metrics (density, violations/facility, non-compliance rate, penalties/facility, etc.) - [x] Test additional geographies (MA House, municipalities)
index: 1.0
Develop Sunrise notebook - See: https://github.com/edgi-govdata-archiving/EEW_Planning/issues/169 To do: - [x] Create new repo: https://github.com/edgi-govdata-archiving/ECHO-Sunrise - [x] Start this from existing Cross Program notebook - [x] Constrain to MA - [ ] Test relevant metrics (density, violations/facility, non-compliance rate, penalties/facility, etc.) - [x] Test additional geographies (MA House, municipalities)
label: non_process
develop sunrise notebook see to do create new repo start this from existing cross program notebook constrain to ma test relevant metrics density violations facility non compliance rate penalties facility etc test additional geographies ma house municipalities
binary_label: 0
Unnamed: 0: 16,544 · id: 21,568,598,905 · type: IssuesEvent · created_at: 2022-05-02 04:17:55
repo: lynnandtonic/nestflix.fun · repo_url: https://api.github.com/repos/lynnandtonic/nestflix.fun
action: closed · title: Add Gina Tracker, FBI
labels: suggested title in process
Please add as much of the following info as you can: Title: Gina Tracker, FBI Type (film/tv show): Crime Show Film or show in which it appears: Central Park, Season 2 Episode 1 ([Apple TV+ link](https://tv.apple.com/us/episode/central-dark/umc.cmc.3n9rqp0k7hw231v87h614ndii)) Is the parent film/show streaming anywhere? Apple TV About when in the parent film/show does it appear? 5:36, 8:00, 24:28 Actual footage of the film/show can be seen (yes/no)? yes ![gina tracker, fbi](https://user-images.githubusercontent.com/1124321/129180534-b989dc42-8e8b-4efa-b49e-c4838d1f2726.png)
index: 1.0
Add Gina Tracker, FBI - Please add as much of the following info as you can: Title: Gina Tracker, FBI Type (film/tv show): Crime Show Film or show in which it appears: Central Park, Season 2 Episode 1 ([Apple TV+ link](https://tv.apple.com/us/episode/central-dark/umc.cmc.3n9rqp0k7hw231v87h614ndii)) Is the parent film/show streaming anywhere? Apple TV About when in the parent film/show does it appear? 5:36, 8:00, 24:28 Actual footage of the film/show can be seen (yes/no)? yes ![gina tracker, fbi](https://user-images.githubusercontent.com/1124321/129180534-b989dc42-8e8b-4efa-b49e-c4838d1f2726.png)
label: process
add gina tracker fbi please add as much of the following info as you can title gina tracker fbi type film tv show crime show film or show in which it appears central park season episode is the parent film show streaming anywhere apple tv about when in the parent film show does it appear actual footage of the film show can be seen yes no yes
binary_label: 1
Unnamed: 0: 102,380 · id: 11,296,779,785 · type: IssuesEvent · created_at: 2020-01-17 03:12:20
repo: vuetifyjs/vuetify · repo_url: https://api.github.com/repos/vuetifyjs/vuetify
action: closed · title: [Bug Report] V-Autocomplete selects wrong item
labels: T: documentation
### Environment **Vuetify Version:** 2.1.6 **Vue Version:** 2.5.22 **Browsers:** Firefox 70.0 **OS:** Windows 10 ### Steps to reproduce 1. Type 1009 into the autocomplete to search 2. Select the entry with 1009 in it 3. The entry with 6 is selected ### Expected Behavior 1. Search for entry 1009 2. Select entry 1009 3. Get entry 1009 ### Actual Behavior Selects the wrong entry. ### Reproduction Link <a href="https://codepen.io/auspex/pen/YzzVNqP?&editable=true&editors=101" target="_blank">https://codepen.io/auspex/pen/YzzVNqP?&editable=true&editors=101</a> <!-- generated by vuetify-issue-helper. DO NOT REMOVE -->
index: 1.0
[Bug Report] V-Autocomplete selects wrong item - ### Environment **Vuetify Version:** 2.1.6 **Vue Version:** 2.5.22 **Browsers:** Firefox 70.0 **OS:** Windows 10 ### Steps to reproduce 1. Type 1009 into the autocomplete to search 2. Select the entry with 1009 in it 3. The entry with 6 is selected ### Expected Behavior 1. Search for entry 1009 2. Select entry 1009 3. Get entry 1009 ### Actual Behavior Selects the wrong entry. ### Reproduction Link <a href="https://codepen.io/auspex/pen/YzzVNqP?&editable=true&editors=101" target="_blank">https://codepen.io/auspex/pen/YzzVNqP?&editable=true&editors=101</a> <!-- generated by vuetify-issue-helper. DO NOT REMOVE -->
label: non_process
v autocomplete selects wrong item environment vuetify version vue version browsers firefox os windows steps to reproduce type into the autocomplete to search select the entry with in it the entry with is selected expected behavior search for entry select entry get entry actual behavior selects the wrong entry reproduction link
binary_label: 0
Unnamed: 0: 10,381 · id: 7,174,588,366 · type: IssuesEvent · created_at: 2018-01-31 00:24:16
repo: Beep6581/RawTherapee · repo_url: https://api.github.com/repos/Beep6581/RawTherapee
action: closed · title: Improve time to start rt
labels: Performance enhancement
This issue is about to improve the time to start rt. First patch will follow soon.
index: True
Improve time to start rt - This issue is about to improve the time to start rt. First patch will follow soon.
label: non_process
improve time to start rt this issue is about to improve the time to start rt first patch will follow soon
binary_label: 0
Unnamed: 0: 123,527 · id: 17,772,253,525 · type: IssuesEvent · created_at: 2021-08-30 14:54:03
repo: kapseliboi/ac-web · repo_url: https://api.github.com/repos/kapseliboi/ac-web
action: opened · title: CVE-2019-20920 (High) detected in handlebars-4.4.5.tgz
labels: security vulnerability
## CVE-2019-20920 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.4.5.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.4.5.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.4.5.tgz</a></p> <p>Path to dependency file: ac-web/package.json</p> <p>Path to vulnerable library: ac-web/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - jest-cli-24.9.0.tgz (Root Library) - core-24.9.0.tgz - reporters-24.9.0.tgz - istanbul-reports-2.2.6.tgz - :x: **handlebars-4.4.5.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/kapseliboi/ac-web/commit/dfced36be0641d32ba1dbfcdd9969dd354b300c5">dfced36be0641d32ba1dbfcdd9969dd354b300c5</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Handlebars before 3.0.8 and 4.x before 4.5.3 is vulnerable to Arbitrary Code Execution. The lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript. This can be used to run arbitrary code on a server processing Handlebars templates or in a victim's browser (effectively serving as XSS). <p>Publish Date: 2020-09-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20920>CVE-2019-20920</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p> <p>Release Date: 2020-10-15</p> <p>Fix Resolution: handlebars - 4.5.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
CVE-2019-20920 (High) detected in handlebars-4.4.5.tgz - ## CVE-2019-20920 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.4.5.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.4.5.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.4.5.tgz</a></p> <p>Path to dependency file: ac-web/package.json</p> <p>Path to vulnerable library: ac-web/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - jest-cli-24.9.0.tgz (Root Library) - core-24.9.0.tgz - reporters-24.9.0.tgz - istanbul-reports-2.2.6.tgz - :x: **handlebars-4.4.5.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/kapseliboi/ac-web/commit/dfced36be0641d32ba1dbfcdd9969dd354b300c5">dfced36be0641d32ba1dbfcdd9969dd354b300c5</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Handlebars before 3.0.8 and 4.x before 4.5.3 is vulnerable to Arbitrary Code Execution. The lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript. This can be used to run arbitrary code on a server processing Handlebars templates or in a victim's browser (effectively serving as XSS). <p>Publish Date: 2020-09-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20920>CVE-2019-20920</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p> <p>Release Date: 2020-10-15</p> <p>Fix Resolution: handlebars - 4.5.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
cve high detected in handlebars tgz cve high severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file ac web package json path to vulnerable library ac web node modules handlebars package json dependency hierarchy jest cli tgz root library core tgz reporters tgz istanbul reports tgz x handlebars tgz vulnerable library found in head commit a href found in base branch master vulnerability details handlebars before and x before is vulnerable to arbitrary code execution the lookup helper fails to properly validate templates allowing attackers to submit templates that execute arbitrary javascript this can be used to run arbitrary code on a server processing handlebars templates or in a victim s browser effectively serving as xss publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope changed impact metrics confidentiality impact high integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution handlebars step up your open source security game with whitesource
binary_label: 0
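Three of the rows above (jackson-databind-2.9.9.jar, linuxlinux-4.6, handlebars-4.4.5.tgz) are near-identical bot-filed WhiteSource/Mend CVE reports. If template issues like these need separate handling, a title heuristic is enough to flag them. The pattern below is an assumption for illustration, not part of the dataset's own tooling, and it again reuses the `df` from the loading sketch.

```python
# Bot-filed CVE reports in this dataset share a rigid title template.
pattern = r"CVE-\d{4}-\d+ \((?:Low|Medium|High|Critical)\) detected in "

bot_like = df[df["title"].str.match(pattern, na=False)]
print(len(bot_like), "template-style CVE rows")
```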
Unnamed: 0: 4,237 · id: 7,187,096,583 · type: IssuesEvent · created_at: 2018-02-02 02:54:54
repo: Great-Hill-Corporation/quickBlocks · repo_url: https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
action: closed · title: Modeling for database samples and tokenomics
labels: monitors-all status-inprocess type-enhancement
This is a first pass, but I think I can do pretty much everything I want to do with this. Is this what you were looking for? Does it make sense? - A monitor is a list of Ethereum addresses - Each address can participate as either the ‘to’ or the ‘from’ address in a transaction. - A transaction either sends ether from one account to another or calls a function on one account from another - A block’s timestamp is the smallest granularity of time (usually 14 seconds), therefor all transactions a block have the same timestamp and as a result the same price at time of transaction - I used hashes as the relationship keys. Hashes are verbose and probably take up too much space, but they are perfect unique identifiers and therefore keys. - I’m not sure I have the modeling of the ‘to’ and ‘from’ correct, but I want to be able to query for all transactions from, all transactions to, and all transaction either in or out of an account. <img width="1751" alt="screen shot 2017-11-10 at 12 01 02 am" src="https://user-images.githubusercontent.com/5417918/32643596-6be808e0-c5aa-11e7-9c82-6e4117b9bec4.png"> I downloaded and used [mySQLWorkbench](https://www.mysql.com/products/workbench/) which is very nice and easy to use. I can share the file, if you need it. Let me know.
1.0
Modeling for database samples and tokenomics - This is a first pass, but I think I can do pretty much everything I want to do with this. Is this what you were looking for? Does it make sense? - A monitor is a list of Ethereum addresses - Each address can participate as either the ‘to’ or the ‘from’ address in a transaction. - A transaction either sends ether from one account to another or calls a function on one account from another - A block’s timestamp is the smallest granularity of time (usually 14 seconds), therefore all transactions in a block have the same timestamp and as a result the same price at time of transaction - I used hashes as the relationship keys. Hashes are verbose and probably take up too much space, but they are perfect unique identifiers and therefore keys. - I’m not sure I have the modeling of the ‘to’ and ‘from’ correct, but I want to be able to query for all transactions from, all transactions to, and all transactions either in or out of an account. <img width="1751" alt="screen shot 2017-11-10 at 12 01 02 am" src="https://user-images.githubusercontent.com/5417918/32643596-6be808e0-c5aa-11e7-9c82-6e4117b9bec4.png"> I downloaded and used [mySQLWorkbench](https://www.mysql.com/products/workbench/) which is very nice and easy to use. I can share the file, if you need it. Let me know.
process
modeling for database samples and tokenomics this is a first pass but i think i can do pretty much everything i want to do with this is this what you were looking for does it make sense a monitor is a list of ethereum addresses each address can participate as either the ‘to’ or the ‘from’ address in a transaction a transaction either sends ether from one account to another or calls a function on one account from another a block’s timestamp is the smallest granularity of time usually seconds therefore all transactions in a block have the same timestamp and as a result the same price at time of transaction i used hashes as the relationship keys hashes are verbose and probably take up too much space but they are perfect unique identifiers and therefore keys i’m not sure i have the modeling of the ‘to’ and ‘from’ correct but i want to be able to query for all transactions from all transactions to and all transactions either in or out of an account img width alt screen shot at am src i downloaded and used which is very nice and easy to use i can share the file if you need it let me know
1
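The schema this record sketches (a monitor as a list of addresses, transactions keyed by hash, one timestamp per block) is easy to make concrete. Below is a minimal sketch using Python's built-in sqlite3; the table and column names (`blocks`, `monitored_addresses`, `transactions`) are illustrative assumptions, not the actual quickBlocks schema.

```python
import sqlite3

# Illustrative schema only; names are assumptions, not quickBlocks' model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE blocks (
    block_hash TEXT PRIMARY KEY,  -- hashes as verbose but perfectly unique keys
    timestamp  INTEGER NOT NULL   -- one timestamp (and thus one price) per block
);
CREATE TABLE monitored_addresses (
    address TEXT PRIMARY KEY      -- a monitor is a list of Ethereum addresses
);
CREATE TABLE transactions (
    tx_hash    TEXT PRIMARY KEY,
    block_hash TEXT NOT NULL REFERENCES blocks(block_hash),
    from_addr  TEXT NOT NULL,     -- sender
    to_addr    TEXT NOT NULL,     -- recipient, or contract whose function is called
    value_wei  INTEGER NOT NULL
);
""")

# The three queries the reporter wants: from, to, and either direction.
addr = "0xabc"  # placeholder address
either_way = conn.execute(
    "SELECT tx_hash FROM transactions WHERE from_addr = ? OR to_addr = ?",
    (addr, addr),
).fetchall()
```

Keying on hashes keeps rows self-describing at the cost of index size, which matches the trade-off the reporter notes.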
7,295
10,441,693,583
IssuesEvent
2019-09-18 11:26:23
Altinn/altinn-studio
https://api.github.com/repos/Altinn/altinn-studio
closed
Create BPMN Engine
analysis process user-story
## Description App Backend needs to process the Process defined in BPMN for the app. This processing should be made as a separate "component" and not be an integrated part of the app backend. This is to make it reusable. ## Considerations - The process engine should not have any dependency on other Altinn Studio / Apps components ## Acceptance criteria - Can load BPMN Process. (Either directly from XML or from a util that loads and objectifies it) - Can process the BPMN process: move forward to the next step, or move forward to a given step ## Tasks - [ ] Create BPMN Parser functionality that can parse a BPMN 2.0 file (Something has already been done here; need to verify if it needs to be updated) - [ ] Create BPMN Engine to process a BPMN Process. ## Specification tasks - [ ] Test design / decide test need ## Development tasks - [ ] Documentation (if relevant) - [ ] Manual test (if needed) - [ ] Automated test (if needed)
1.0
Create BPMN Engine - ## Description App Backend needs to process the Process defined in BPMN for the app. This processing should be made as a separate "component" and not be an integrated part of the app backend. This is to make it reusable. ## Considerations - The process engine should not have any dependency on other Altinn Studio / Apps components ## Acceptance criteria - Can load BPMN Process. (Either directly from XML or from a util that loads and objectifies it) - Can process the BPMN process: move forward to the next step, or move forward to a given step ## Tasks - [ ] Create BPMN Parser functionality that can parse a BPMN 2.0 file (Something has already been done here; need to verify if it needs to be updated) - [ ] Create BPMN Engine to process a BPMN Process. ## Specification tasks - [ ] Test design / decide test need ## Development tasks - [ ] Documentation (if relevant) - [ ] Manual test (if needed) - [ ] Automated test (if needed)
process
create bpmn engine description app backend needs to process the process defined in bpmn for the app this processing should be made as a separate component and not be an integrated part of the app backend this is to make it reusable considerations the process engine should not have any dependency on other altinn studio apps components acceptance criteria can load bpmn process either directly from xml or from a util that loads and objectifies it can process the bpmn process move forward to the next step or move forward to a given step tasks create bpmn parser functionality that can parse a bpmn file something has already been done here need to verify if it needs to be updated create bpmn engine to process a bpmn process specification tasks test design decide test need development tasks documentation if relevant manual test if needed automated test if needed
1
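To make the "move forward to the next step / move forward to a given step" criteria concrete, here is a minimal sketch of loading a BPMN 2.0 definition from XML and following its sequence flows, using only Python's standard library. The namespace URI is the standard BPMN 2.0 MODEL namespace; the sample process and the `next_steps` helper are hypothetical illustrations, not the Altinn parser.

```python
import xml.etree.ElementTree as ET

# Standard BPMN 2.0 model namespace; a real engine would cover far more of the spec.
BPMN = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"

def next_steps(bpmn_xml, current_id):
    """Ids of elements reachable from current_id via a sequenceFlow."""
    root = ET.fromstring(bpmn_xml)
    return [flow.get("targetRef")
            for flow in root.iter(BPMN + "sequenceFlow")
            if flow.get("sourceRef") == current_id]

doc = """<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <process id="p1">
    <startEvent id="start"/>
    <task id="fillForm"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="fillForm"/>
  </process>
</definitions>"""

assert next_steps(doc, "start") == ["fillForm"]  # move forward to the next step
```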
360,159
10,684,895,596
IssuesEvent
2019-10-22 11:32:10
official-antistasi-community/A3-Antistasi
https://api.github.com/repos/official-antistasi-community/A3-Antistasi
closed
Fix "Flag capture" exploit
Priority bug
When a person is taking the flag on (for example) a captured outpost he can do it several times whilst effectively in the animation, and therefore exploit the system and get sweet money from it. If I'm correct the function being called is mrkWin. My idea: change it so the function can only be called once every 2 minutes. Either you have killed all the enemies in the area and capture the point, or you are still outnumbered and have to wait 2 minutes before you can call it again. This of course should be on the server side and not on the client side, to exclude the possibility of others also taking the flag at the same time. Maybe use a system like the one on the hear-and-repair box.
1.0
Fix "Flag capture" exploit - When a person is taking the flag on (for example) a captured outpost he can do it several times whilst effectively in the animation, and therefore exploit the system and get sweet money from it. If I'm correct the function being called is mrkWin. My idea: change it so the function can only be called once every 2 minutes. Either you have killed all the enemies in the area and capture the point, or you are still outnumbered and have to wait 2 minutes before you can call it again. This of course should be on the server side and not on the client side, to exclude the possibility of others also taking the flag at the same time. Maybe use a system like the one on the hear-and-repair box.
non_process
fix flag capture exploit when a person is taking the flag on for example a captured outpost he can do it several times whilst effectively in the animation and therefore exploit the system and get sweet money from it if i m correct the function being called is mrkwin my idea change it so the function can only be called once every minutes either you have killed all the enemies in the area and capture the point or you are still outnumbered and have to wait minutes before you can call it again this of course should be on the server side and not on the client side to exclude the possibility of others also taking the flag at the same time maybe use a system like the one on the hear and repair box
0
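Antistasi itself is written in SQF, so the following is only a language-agnostic sketch (here in Python) of the server-side cooldown the reporter proposes: one capture attempt per flag per 2-minute window, tracked server-side so simultaneous clients cannot bypass it. The names (`try_capture`, `marker_id`) are hypothetical.

```python
import time

COOLDOWN_SECONDS = 120  # the 2-minute window proposed above
_last_attempt = {}      # marker_id -> time of last server-side capture call

def try_capture(marker_id):
    """Allow one capture attempt per marker per cooldown window."""
    now = time.monotonic()
    last = _last_attempt.get(marker_id)
    if last is not None and now - last < COOLDOWN_SECONDS:
        return False  # still cooling down: reject repeated flag grabs
    _last_attempt[marker_id] = now
    return True       # caller proceeds with mrkWin-style capture logic
```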
51,912
13,211,336,615
IssuesEvent
2020-08-15 22:24:11
icecube-trac/tix4
https://api.github.com/repos/icecube-trac/tix4
opened
[clsim] run the python tests (Trac #1252)
Incomplete Migration Migrated from Trac combo simulation defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1252">https://code.icecube.wisc.edu/projects/icecube/ticket/1252</a>, reported by david.schultz and owned by claudio.kopper</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:11:45", "_ts": "1550067105393059", "description": "clsim has python tests under `resources/tests`, but cmake doesn't know about them. Perhaps add `i3_test_scripts(resources/tests/*.py)` in the right place?", "reporter": "david.schultz", "cc": "", "resolution": "fixed", "time": "2015-08-20T18:32:32", "component": "combo simulation", "summary": "[clsim] run the python tests", "priority": "blocker", "keywords": "", "milestone": "", "owner": "claudio.kopper", "type": "defect" } ``` </p> </details>
1.0
[clsim] run the python tests (Trac #1252) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1252">https://code.icecube.wisc.edu/projects/icecube/ticket/1252</a>, reported by david.schultz and owned by claudio.kopper</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:11:45", "_ts": "1550067105393059", "description": "clsim has python tests under `resources/tests`, but cmake doesn't know about them. Perhaps add `i3_test_scripts(resources/tests/*.py)` in the right place?", "reporter": "david.schultz", "cc": "", "resolution": "fixed", "time": "2015-08-20T18:32:32", "component": "combo simulation", "summary": "[clsim] run the python tests", "priority": "blocker", "keywords": "", "milestone": "", "owner": "claudio.kopper", "type": "defect" } ``` </p> </details>
non_process
run the python tests trac migrated from json status closed changetime ts description clsim has python tests under resources tests but cmake doesn t know about them perhaps add test scripts resources tests py in the right place reporter david schultz cc resolution fixed time component combo simulation summary run the python tests priority blocker keywords milestone owner claudio kopper type defect
0
298,870
25,862,596,865
IssuesEvent
2022-12-13 18:05:03
quarkusio/quarkus
https://api.github.com/repos/quarkusio/quarkus
closed
Rename `quarkus.test.native-image-profile` to reflect its use for integration tests
area/testing area/housekeeping
### Description Since specific native image testing has been replaced by integration testing, which may or may not be a native image, this property name should reflect its new usage. ### Implementation ideas A name such as `quarkus.test.integration-profile` or similar would better reflect its current usage.
1.0
Rename `quarkus.test.native-image-profile` to reflect its use for integration tests - ### Description Since specific native image testing has been replaced by integration testing, which may or may not be a native image, this property name should reflect its new usage. ### Implementation ideas A name such as `quarkus.test.integration-profile` or similar would better reflect its current usage.
non_process
rename quarkus test native image profile to reflect its use for integration tests description since specific native image testing has been replaced by integration testing which may or may not be a native image this property name should reflect its new usage implementation ideas a name such as quarkus test integration profile or similar would better reflect its current usage
0
221,537
7,389,578,159
IssuesEvent
2018-03-16 09:14:49
Wozza365/GameDevelopment
https://api.github.com/repos/Wozza365/GameDevelopment
opened
Key Must Trigger Door to Open When Key is Pressed Within Area
enhancement high priority
This work should only be a small enhancement of existing door work. The door must open when the player presses the button within the trigger area. Must trigger the unlocking sound and remove the key. Should transition the key into the keyhole and then twist it as the sound is played.
1.0
Key Must Trigger Door to Open When Key is Pressed Within Area - This work should only be a small enhancement of existing door work. The door must open when the player presses the button within the trigger area. Must trigger the unlocking sound and remove the key. Should transition the key into the keyhole and then twist it as the sound is played.
non_process
key must trigger door to open when key is pressed within area this work should only be a small enhancement of existing door work the door must open when the player presses the button within the trigger area must trigger the unlocking sound and remove the key should transition the key into the keyhole and then twist it as the sound is played
0
9,195
12,230,722,159
IssuesEvent
2020-05-04 05:51:59
zotero/zotero
https://api.github.com/repos/zotero/zotero
opened
Multi-section bibliographies
Word Processor Integration
As noted in https://github.com/citation-style-language/documentation/issues/71 this is a frequently requested feature, which requires both a technical and a GUI solution in CSL/citeproc clients. Since the GUI is likely to inform the technical implementation, I wanted to discuss it separately in more detail. (1) If we were to do this I think we should split the Add/Edit Bibliography button into two separate buttons: Add and Edit; where Add would add a full bibliography of all items in the document at the cursor location, and you could add multiple bibliographies, while Edit would open the Bibliography editor for the bibliography at the cursor and allow you to uncite items. This would be enough to make fully customizable bibliography sections. The Edit Bibliography dialog can be further enhanced per user requests based on the use cases of multi-section bibliography, to filter for certain item types, tags, etc. (2) We could also allow users to give names to bibliography sections and allow them to mark which sections an item should belong to when inserting/editing items via the citation dialog. For (1) we would probably just need to update uncited items for each bibliography generation separately. For (2) we might want to use the citeproc bibliography generation [filter functionality](https://citeproc-js.readthedocs.io/en/latest/running.html#selective-output-with-makebibliography) and would need to store some additional information in bibliography field codes.
1.0
Multi-section bibliographies - As noted in https://github.com/citation-style-language/documentation/issues/71 this is a frequently requested feature, which requires both a technical and a GUI solution in CSL/citeproc clients. Since the GUI is likely to inform the technical implementation, I wanted to discuss it separately in more detail. (1) If we were to do this I think we should split the Add/Edit Bibliography button into two separate buttons: Add and Edit; where Add would add a full bibliography of all items in the document at the cursor location, and you could add multiple bibliographies, while Edit would open the Bibliography editor for the bibliography at the cursor and allow you to uncite items. This would be enough to make fully customizable bibliography sections. The Edit Bibliography dialog can be further enhanced per user requests based on the use cases of multi-section bibliography, to filter for certain item types, tags, etc. (2) We could also allow users to give names to bibliography sections and allow them to mark which sections an item should belong to when inserting/editing items via the citation dialog. For (1) we would probably just need to update uncited items for each bibliography generation separately. For (2) we might want to use the citeproc bibliography generation [filter functionality](https://citeproc-js.readthedocs.io/en/latest/running.html#selective-output-with-makebibliography) and would need to store some additional information in bibliography field codes.
process
multi section bibliographies as noted in this is a frequently requested feature which requires both a technical and a gui solution in csl citeproc clients since the gui is likely to inform the technical implementation i wanted to discuss it separately in more detail if we were to do this i think we should split the add edit bibliography button into two separate buttons add and edit where add would add a full bibliography of all items in the document at the cursor location and you could add multiple bibliographies while edit would open the bibliography editor for the bibliography at the cursor and allow you to uncite items this would be enough to make fully customizable bibliography sections the edit bibliography dialog can be further enhanced per user requests based on the use cases of multi section bibliography to filter for certain item types tags etc we could also allow users to give names to bibliography sections and allow them to mark which sections an item should belong to when inserting editing items via the citation dialog for we would probably just need to update uncited items for each bibliography generation separately for we might want to use the citeproc bibliography generation and would need to store some additional information in bibliography field codes
1
10,764
13,551,960,105
IssuesEvent
2020-09-17 11:55:03
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
Object Detail on field with very large integer fails
Database/Postgres Priority:P2 Querying/Processor Type:Bug
**Describe the bug** Object Detail on field with large (#5816) `bigint`/`bigserial` fails on Postgres, but works on MariaDB. **To Reproduce** 1. Create table and sync: ``` CREATE TABLE sampledata.x5816 ( id bigserial NOT NULL, biggie bigint NULL, CONSTRAINT idx_pk PRIMARY KEY (id) ); INSERT INTO sampledata.x5816 (id,biggie) VALUES (201911260319345481, 201911260319345481), (202006010211003419, 202006010211003419); ``` 2. Simple question > (postgres) > X5816 ![image](https://user-images.githubusercontent.com/1447303/93196439-bab0a200-f74a-11ea-8d38-d11cf4bd7e67.png) 3. Click on one of the ID column rows to view Object Details - `ERROR: operator does not exist: bigint = character varying` <details><summary>Full stacktrace</summary> ``` 09-15 12:01:21 ERROR middleware.catch-exceptions :: Error processing query: null {:database_id 22, :started_at #t "2020-09-15T12:01:20.987779+02:00[Europe/Copenhagen]", :state "42883", :json_query {:query {:source-table 1288, :filter ["=" ["field-id" 19847] "201911260319345481"]}, :type "query", :database 22, :parameters [], :middleware {:js-int-to-string? true, :add-default-userland-constraints? true}}, :native {:query "SELECT \"sampledata\".\"x5816\".\"id\" AS \"id\", \"sampledata\".\"x5816\".\"biggie\" AS \"biggie\" FROM \"sampledata\".\"x5816\" WHERE \"sampledata\".\"x5816\".\"id\" = ? LIMIT 2000", :params ("201911260319345481")}, :status :failed, :class org.postgresql.util.PSQLException, :stacktrace ["org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2497)" "org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2233)" "org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:310)" "org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:446)" "org.postgresql.jdbc.PgStatement.execute(PgStatement.java:370)" "org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:149)" "org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:108)" "com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeQuery(NewProxyPreparedStatement.java:431)" "--> driver.sql_jdbc.execute$fn__72618.invokeStatic(execute.clj:267)" "driver.sql_jdbc.execute$fn__72618.invoke(execute.clj:265)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:389)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:377)" "driver.sql_jdbc$fn__73933.invokeStatic(sql_jdbc.clj:49)" "driver.sql_jdbc$fn__73933.invoke(sql_jdbc.clj:47)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:69)" "query_processor.context.default$default_runf.invoke(default.clj:67)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__45635.invoke(mbql_to_native.clj:26)" "query_processor.middleware.check_features$check_features$fn__44911.invoke(check_features.clj:42)" "query_processor.middleware.optimize_datetime_filters$optimize_datetime_filters$fn__45800.invoke(optimize_datetime_filters.clj:133)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__47328.invoke(wrap_value_literals.clj:137)" 
"query_processor.middleware.annotate$add_column_info$fn__43532.invoke(annotate.clj:574)" "query_processor.middleware.permissions$check_query_permissions$fn__44786.invoke(permissions.clj:64)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__46318.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__44984.invoke(cumulative_aggregations.clj:61)" "query_processor.middleware.resolve_joins$resolve_joins$fn__46850.invoke(resolve_joins.clj:183)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__39262.invoke(add_implicit_joins.clj:245)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__45596.invoke(large_int_id.clj:44)" "query_processor.middleware.limit$limit$fn__45621.invoke(limit.clj:38)" "query_processor.middleware.format_rows$format_rows$fn__45576.invoke(format_rows.clj:81)" "query_processor.middleware.desugar$desugar$fn__45050.invoke(desugar.clj:22)" "query_processor.middleware.binning$update_binning_strategy$fn__44076.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__44592.invoke(resolve_fields.clj:24)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__38811.invoke(add_dimension_projections.clj:311)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__39018.invoke(add_implicit_clauses.clj:141)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__39411.invoke(add_source_metadata.clj:105)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__46515.invoke(reconcile_breakout_and_order_by_bucketing.clj:98)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__43717.invoke(auto_bucket_datetimes.clj:125)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__44639.invoke(resolve_source_table.clj:46)" "query_processor.middleware.parameters$substitute_parameters$fn__46300.invoke(parameters.clj:114)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__44691.invoke(resolve_referenced.clj:80)" "query_processor.middleware.expand_macros$expand_macros$fn__45306.invoke(expand_macros.clj:158)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__39442.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__47212.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__46526$fn__46530.invoke(resolve_database_and_driver.clj:33)" "driver$do_with_driver.invokeStatic(driver.clj:61)" "driver$do_with_driver.invoke(driver.clj:57)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__46526.invoke(resolve_database_and_driver.clj:27)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__45524.invoke(fetch_source_query.clj:267)" "query_processor.middleware.store$initialize_store$fn__47221$fn__47222.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:46)" "query_processor.store$do_with_store.invoke(store.clj:40)" "query_processor.middleware.store$initialize_store$fn__47221.invoke(store.clj:10)" "query_processor.middleware.cache$maybe_return_cached_results$fn__44568.invoke(cache.clj:209)" "query_processor.middleware.validate$validate_query$fn__47230.invoke(validate.clj:10)" 
"query_processor.middleware.normalize_query$normalize$fn__45648.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__39280.invoke(add_rows_truncated.clj:36)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__47197.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__44927.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__46389.invoke(process_userland_query.clj:136)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__44870.invoke(catch_exceptions.clj:174)" "query_processor.reducible$async_qp$qp_STAR___38074$thunk__38075.invoke(reducible.clj:101)" "query_processor.reducible$async_qp$qp_STAR___38074.invoke(reducible.clj:107)" "query_processor.reducible$sync_qp$qp_STAR___38083$fn__38086.invoke(reducible.clj:133)" "query_processor.reducible$sync_qp$qp_STAR___38083.invoke(reducible.clj:132)" "query_processor$process_userland_query.invokeStatic(query_processor.clj:215)" "query_processor$process_userland_query.doInvoke(query_processor.clj:211)" "query_processor$fn__47372$process_query_and_save_execution_BANG___47381$fn__47384.invoke(query_processor.clj:227)" "query_processor$fn__47372$process_query_and_save_execution_BANG___47381.invoke(query_processor.clj:219)" "query_processor$fn__47416$process_query_and_save_with_max_results_constraints_BANG___47425$fn__47428.invoke(query_processor.clj:239)" "query_processor$fn__47416$process_query_and_save_with_max_results_constraints_BANG___47425.invoke(query_processor.clj:232)" "api.dataset$fn__50707$fn__50710.invoke(dataset.clj:55)" "query_processor.streaming$streaming_response_STAR_$fn__35496$fn__35497.invoke(streaming.clj:73)" "query_processor.streaming$streaming_response_STAR_$fn__35496.invoke(streaming.clj:72)" "async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:66)" "async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:64)" "async.streaming_response$do_f_async$fn__23282.invoke(streaming_response.clj:85)"], :context :ad-hoc, :error "ERROR: operator does not exist: bigint = character varying\n Hint: No operator matches the given name and argument types. You might need to add explicit type casts.\n Position: 259", :row_count 0, :running_time 0, :preprocessed {:query {:source-table 1288, :filter [:= [:field-id 19847] [:value "201911260319345481" {:base_type :type/BigInteger, :special_type :type/PK, :database_type "bigserial", :name "id"}]], :fields [[:field-id 19847] [:field-id 19846]], :limit 2000}, :type :query, :database 22, :middleware {:js-int-to-string? true, :add-default-userland-constraints? true}, :info {:executed-by 1, :context :ad-hoc, :nested? false, :query-hash [-19, 36, 16, 124, 29, 123, -90, 125, -78, -14, -37, 35, 80, -83, -108, 11, 85, 125, -49, -69, -106, 67, 6, 63, 24, 59, 48, 100, -42, -15, -50, 40]}, :constraints {:max-results 10000, :max-results-bare-rows 2000}}, :data {:rows [], :cols []}} 09-15 12:01:21 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 333.2 ms (12 DB calls) App DB connections: 0/10 Jetty threads: 3/50 (3 idle, 0 queued) (130 total active threads) Queries in flight: 1 (0 queued) ``` </details> **Information about your Metabase Installation:** Metabase 0.36.5.1 querying Postgres 12.4 and MariaDB 10.4
1.0
Object Detail on field with very large integer fails - **Describe the bug** Object Detail on field with large (#5816) `bigint`/`bigserial` fails on Postgres, but works on MariaDB. **To Reproduce** 1. Create table and sync: ``` CREATE TABLE sampledata.x5816 ( id bigserial NOT NULL, biggie bigint NULL, CONSTRAINT idx_pk PRIMARY KEY (id) ); INSERT INTO sampledata.x5816 (id,biggie) VALUES (201911260319345481, 201911260319345481), (202006010211003419, 202006010211003419); ``` 2. Simple question > (postgres) > X5816 ![image](https://user-images.githubusercontent.com/1447303/93196439-bab0a200-f74a-11ea-8d38-d11cf4bd7e67.png) 3. Click on one of the ID column rows to view Object Details - `ERROR: operator does not exist: bigint = character varying` <details><summary>Full stacktrace</summary> ``` 09-15 12:01:21 ERROR middleware.catch-exceptions :: Error processing query: null {:database_id 22, :started_at #t "2020-09-15T12:01:20.987779+02:00[Europe/Copenhagen]", :state "42883", :json_query {:query {:source-table 1288, :filter ["=" ["field-id" 19847] "201911260319345481"]}, :type "query", :database 22, :parameters [], :middleware {:js-int-to-string? true, :add-default-userland-constraints? true}}, :native {:query "SELECT \"sampledata\".\"x5816\".\"id\" AS \"id\", \"sampledata\".\"x5816\".\"biggie\" AS \"biggie\" FROM \"sampledata\".\"x5816\" WHERE \"sampledata\".\"x5816\".\"id\" = ? LIMIT 2000", :params ("201911260319345481")}, :status :failed, :class org.postgresql.util.PSQLException, :stacktrace ["org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2497)" "org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2233)" "org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:310)" "org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:446)" "org.postgresql.jdbc.PgStatement.execute(PgStatement.java:370)" "org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:149)" "org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:108)" "com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeQuery(NewProxyPreparedStatement.java:431)" "--> driver.sql_jdbc.execute$fn__72618.invokeStatic(execute.clj:267)" "driver.sql_jdbc.execute$fn__72618.invoke(execute.clj:265)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:389)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:377)" "driver.sql_jdbc$fn__73933.invokeStatic(sql_jdbc.clj:49)" "driver.sql_jdbc$fn__73933.invoke(sql_jdbc.clj:47)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:69)" "query_processor.context.default$default_runf.invoke(default.clj:67)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__45635.invoke(mbql_to_native.clj:26)" "query_processor.middleware.check_features$check_features$fn__44911.invoke(check_features.clj:42)" "query_processor.middleware.optimize_datetime_filters$optimize_datetime_filters$fn__45800.invoke(optimize_datetime_filters.clj:133)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__47328.invoke(wrap_value_literals.clj:137)" 
"query_processor.middleware.annotate$add_column_info$fn__43532.invoke(annotate.clj:574)" "query_processor.middleware.permissions$check_query_permissions$fn__44786.invoke(permissions.clj:64)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__46318.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__44984.invoke(cumulative_aggregations.clj:61)" "query_processor.middleware.resolve_joins$resolve_joins$fn__46850.invoke(resolve_joins.clj:183)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__39262.invoke(add_implicit_joins.clj:245)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__45596.invoke(large_int_id.clj:44)" "query_processor.middleware.limit$limit$fn__45621.invoke(limit.clj:38)" "query_processor.middleware.format_rows$format_rows$fn__45576.invoke(format_rows.clj:81)" "query_processor.middleware.desugar$desugar$fn__45050.invoke(desugar.clj:22)" "query_processor.middleware.binning$update_binning_strategy$fn__44076.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__44592.invoke(resolve_fields.clj:24)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__38811.invoke(add_dimension_projections.clj:311)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__39018.invoke(add_implicit_clauses.clj:141)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__39411.invoke(add_source_metadata.clj:105)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__46515.invoke(reconcile_breakout_and_order_by_bucketing.clj:98)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__43717.invoke(auto_bucket_datetimes.clj:125)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__44639.invoke(resolve_source_table.clj:46)" "query_processor.middleware.parameters$substitute_parameters$fn__46300.invoke(parameters.clj:114)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__44691.invoke(resolve_referenced.clj:80)" "query_processor.middleware.expand_macros$expand_macros$fn__45306.invoke(expand_macros.clj:158)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__39442.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__47212.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__46526$fn__46530.invoke(resolve_database_and_driver.clj:33)" "driver$do_with_driver.invokeStatic(driver.clj:61)" "driver$do_with_driver.invoke(driver.clj:57)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__46526.invoke(resolve_database_and_driver.clj:27)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__45524.invoke(fetch_source_query.clj:267)" "query_processor.middleware.store$initialize_store$fn__47221$fn__47222.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:46)" "query_processor.store$do_with_store.invoke(store.clj:40)" "query_processor.middleware.store$initialize_store$fn__47221.invoke(store.clj:10)" "query_processor.middleware.cache$maybe_return_cached_results$fn__44568.invoke(cache.clj:209)" "query_processor.middleware.validate$validate_query$fn__47230.invoke(validate.clj:10)" 
"query_processor.middleware.normalize_query$normalize$fn__45648.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__39280.invoke(add_rows_truncated.clj:36)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__47197.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__44927.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__46389.invoke(process_userland_query.clj:136)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__44870.invoke(catch_exceptions.clj:174)" "query_processor.reducible$async_qp$qp_STAR___38074$thunk__38075.invoke(reducible.clj:101)" "query_processor.reducible$async_qp$qp_STAR___38074.invoke(reducible.clj:107)" "query_processor.reducible$sync_qp$qp_STAR___38083$fn__38086.invoke(reducible.clj:133)" "query_processor.reducible$sync_qp$qp_STAR___38083.invoke(reducible.clj:132)" "query_processor$process_userland_query.invokeStatic(query_processor.clj:215)" "query_processor$process_userland_query.doInvoke(query_processor.clj:211)" "query_processor$fn__47372$process_query_and_save_execution_BANG___47381$fn__47384.invoke(query_processor.clj:227)" "query_processor$fn__47372$process_query_and_save_execution_BANG___47381.invoke(query_processor.clj:219)" "query_processor$fn__47416$process_query_and_save_with_max_results_constraints_BANG___47425$fn__47428.invoke(query_processor.clj:239)" "query_processor$fn__47416$process_query_and_save_with_max_results_constraints_BANG___47425.invoke(query_processor.clj:232)" "api.dataset$fn__50707$fn__50710.invoke(dataset.clj:55)" "query_processor.streaming$streaming_response_STAR_$fn__35496$fn__35497.invoke(streaming.clj:73)" "query_processor.streaming$streaming_response_STAR_$fn__35496.invoke(streaming.clj:72)" "async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:66)" "async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:64)" "async.streaming_response$do_f_async$fn__23282.invoke(streaming_response.clj:85)"], :context :ad-hoc, :error "ERROR: operator does not exist: bigint = character varying\n Hint: No operator matches the given name and argument types. You might need to add explicit type casts.\n Position: 259", :row_count 0, :running_time 0, :preprocessed {:query {:source-table 1288, :filter [:= [:field-id 19847] [:value "201911260319345481" {:base_type :type/BigInteger, :special_type :type/PK, :database_type "bigserial", :name "id"}]], :fields [[:field-id 19847] [:field-id 19846]], :limit 2000}, :type :query, :database 22, :middleware {:js-int-to-string? true, :add-default-userland-constraints? true}, :info {:executed-by 1, :context :ad-hoc, :nested? false, :query-hash [-19, 36, 16, 124, 29, 123, -90, 125, -78, -14, -37, 35, 80, -83, -108, 11, 85, 125, -49, -69, -106, 67, 6, 63, 24, 59, 48, 100, -42, -15, -50, 40]}, :constraints {:max-results 10000, :max-results-bare-rows 2000}}, :data {:rows [], :cols []}} 09-15 12:01:21 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 333.2 ms (12 DB calls) App DB connections: 0/10 Jetty threads: 3/50 (3 idle, 0 queued) (130 total active threads) Queries in flight: 1 (0 queued) ``` </details> **Information about your Metabase Installation:** Metabase 0.36.5.1 querying Postgres 12.4 and MariaDB 10.4
process
object detail on field with very large integer fails describe the bug object detail on field with large bigint bigserial fails on postgres but works on mariadb to reproduce create table and sync create table sampledata id bigserial not null biggie bigint null constraint idx pk primary key id insert into sampledata id biggie values simple question postgres click on one of the id column rows to view object details error operator does not exist bigint character varying full stacktrace error middleware catch exceptions error processing query null database id started at t state json query query source table filter type query database parameters middleware js int to string true add default userland constraints true native query select sampledata id as id sampledata biggie as biggie from sampledata where sampledata id limit params status failed class org postgresql util psqlexception stacktrace org postgresql core queryexecutorimpl receiveerrorresponse queryexecutorimpl java org postgresql core queryexecutorimpl processresults queryexecutorimpl java org postgresql core queryexecutorimpl execute queryexecutorimpl java org postgresql jdbc pgstatement executeinternal pgstatement java org postgresql jdbc pgstatement execute pgstatement java org postgresql jdbc pgpreparedstatement executewithflags pgpreparedstatement java org postgresql jdbc pgpreparedstatement executequery pgpreparedstatement java com mchange impl newproxypreparedstatement executequery newproxypreparedstatement java driver sql jdbc execute fn invokestatic execute clj driver sql jdbc execute fn invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc fn invokestatic sql jdbc clj driver sql jdbc fn invoke sql jdbc clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware optimize datetime filters optimize datetime filters fn invoke optimize datetime filters clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware limit limit fn invoke limit clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware desugar desugar fn invoke desugar clj query processor 
middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints 
bang invoke query processor clj api dataset fn fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj context ad hoc error error operator does not exist bigint character varying n hint no operator matches the given name and argument types you might need to add explicit type casts n position row count running time preprocessed query source table filter fields limit type query database middleware js int to string true add default userland constraints true info executed by context ad hoc nested false query hash constraints max results max results bare rows data rows cols debug middleware log post api dataset ms db calls app db connections jetty threads idle queued total active threads queries in flight queued information about your metabase installation metabase querying postgres and mariadb
1
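The trace shows the id bound as the string "201911260319345481" against a `bigint` column, which Postgres rejects because no `bigint = character varying` operator exists. Below is a minimal sketch of the two obvious remedies, using psycopg2 as an assumed driver (Metabase itself goes through JDBC) and a placeholder connection string: bind a real integer, or keep the string and add the explicit cast the Postgres hint suggests.

```python
import psycopg2  # assumed driver; connection details are placeholders

conn = psycopg2.connect("dbname=sampledata")
cur = conn.cursor()

# Remedy 1: bind a real Python int so the value arrives bigint-compatible.
cur.execute("SELECT * FROM sampledata.x5816 WHERE id = %s",
            (201911260319345481,))

# Remedy 2: keep the string parameter (as the query processor produced it)
# but add the explicit cast that the Postgres hint suggests.
cur.execute("SELECT * FROM sampledata.x5816 WHERE id = %s::bigint",
            ("201911260319345481",))
print(cur.fetchone())
```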
20,600
27,265,745,060
IssuesEvent
2023-02-22 17:53:29
tikv/tikv
https://api.github.com/repos/tikv/tikv
opened
suspended-time should be passed through to TiDB to avoid confusion
type/bug sig/coprocessor
## Bug Report In PR #9257, we introduced the suspended time in the tracker. However, this information is included in process time on the TiDB side, which misleads the investigation, since slow process time typically suggests low IO or CPU resources. It should be exposed as suspended time itself in TiDB. <!-- Thanks for your bug report! Don't worry if you can't fill out all the sections. --> ### What version of TiKV are you using? <!-- You can run `tikv-server --version` --> 6.5 or newer ### What operating system and CPU are you using? <!-- If you're using Linux, you can run `cat /proc/cpuinfo` --> ### Steps to reproduce <!-- If possible, provide a recipe for reproducing the error. A complete runnable program is good. --> Check out the TiDB dashboard and view the slow query's detail information. The process time includes the suspended time. ### What did you expect? There should be a standalone item called suspended time in TiDB. ### What actually happened? The process time includes the suspended time.
1.0
suspended-time should be passed through to TiDB to avoid confusion - ## Bug Report In PR #9257, we introduced the suspended time in the tracker. However, this information is included in process time on the TiDB side, which misleads the investigation, since slow process time typically suggests low IO or CPU resources. It should be exposed as suspended time itself in TiDB. <!-- Thanks for your bug report! Don't worry if you can't fill out all the sections. --> ### What version of TiKV are you using? <!-- You can run `tikv-server --version` --> 6.5 or newer ### What operating system and CPU are you using? <!-- If you're using Linux, you can run `cat /proc/cpuinfo` --> ### Steps to reproduce <!-- If possible, provide a recipe for reproducing the error. A complete runnable program is good. --> Check out the TiDB dashboard and view the slow query's detail information. The process time includes the suspended time. ### What did you expect? There should be a standalone item called suspended time in TiDB. ### What actually happened? The process time includes the suspended time.
process
suspended time should be passed through to tidb to avoid confusion bug report in pr we introduced the suspended time in the tracker however this information is included in process time on the tidb side which misleads the investigation since slow process time typically suggests low io or cpu resources it should be exposed as suspended time itself in tidb what version of tikv are you using or newer what operating system and cpu are you using steps to reproduce check out the tidb dashboard and view the slow query s detail information the process time includes the suspended time what did you expect there should be a standalone item called suspended time in tidb what actually happened the process time includes the suspended time
1
355,941
25,176,068,512
IssuesEvent
2022-11-11 09:22:26
jwdavis0200/pe
https://api.github.com/repos/jwdavis0200/pe
opened
Images used in the UG for features examples are redundant/ineffective for value adding.
severity.Low type.DocumentationBug
The Features in the UG were explained with many examples and result-of-command lists, but fail to show the difference between the before-command state of the list and the after-command state of the list; hence they are not useful for the user to learn how the commands change the state of the HealthContact lists. They may even add to the clutter of the UG rather than guide users, since they can't tell the difference anyway. ![image.png](https://raw.githubusercontent.com/jwdavis0200/pe/main/files/714ecdff-d202-4eb9-ae71-b1992fd752b7.png) Above Figure: A user viewing the image can't tell the difference between before-command and after-command, since only after-command is shown. <!--session: 1668154592408-aa02e40a-16e9-443e-95e5-0d6565bc2029--> <!--Version: Web v3.4.4-->
1.0
Images used in the UG for features examples are redundant/ineffective for value adding. - The Features in the UG were explained with many examples and result-of-command lists, but fail to show the difference between the before-command state of the list and the after-command state of the list; hence they are not useful for the user to learn how the commands change the state of the HealthContact lists. They may even add to the clutter of the UG rather than guide users, since they can't tell the difference anyway. ![image.png](https://raw.githubusercontent.com/jwdavis0200/pe/main/files/714ecdff-d202-4eb9-ae71-b1992fd752b7.png) Above Figure: A user viewing the image can't tell the difference between before-command and after-command, since only after-command is shown. <!--session: 1668154592408-aa02e40a-16e9-443e-95e5-0d6565bc2029--> <!--Version: Web v3.4.4-->
non_process
images used in the ug for features examples are redundant ineffective for value adding the features in the ug were explained with many examples and result of command lists but fail to show the difference between the before command state of the list and the after command state of the list hence they are not useful for the user to learn how the commands change the state of the healthcontact lists they may even add to the clutter of the ug rather than guide users since they can t tell the difference anyway above figure a user viewing the image can t tell the difference between before command and after command since only after command is shown
0
62,651
14,656,554,120
IssuesEvent
2020-12-28 13:41:03
fu1771695yongxie/Rocket.Chat
https://api.github.com/repos/fu1771695yongxie/Rocket.Chat
opened
CVE-2020-7661 (High) detected in url-regex-5.0.0.tgz, url-regex-3.2.0.tgz
security vulnerability
## CVE-2020-7661 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>url-regex-5.0.0.tgz</b>, <b>url-regex-3.2.0.tgz</b></p></summary> <p> <details><summary><b>url-regex-5.0.0.tgz</b></p></summary> <p>Regular expression for matching URLs</p> <p>Library home page: <a href="https://registry.npmjs.org/url-regex/-/url-regex-5.0.0.tgz">https://registry.npmjs.org/url-regex/-/url-regex-5.0.0.tgz</a></p> <p>Path to dependency file: Rocket.Chat/package.json</p> <p>Path to vulnerable library: Rocket.Chat/node_modules/postcss-values-parser/node_modules/url-regex/package.json</p> <p> Dependency Hierarchy: - postcss-custom-properties-9.1.1.tgz (Root Library) - postcss-values-parser-3.2.1.tgz - :x: **url-regex-5.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>url-regex-3.2.0.tgz</b></p></summary> <p>Regular expression for matching URLs</p> <p>Library home page: <a href="https://registry.npmjs.org/url-regex/-/url-regex-3.2.0.tgz">https://registry.npmjs.org/url-regex/-/url-regex-3.2.0.tgz</a></p> <p>Path to dependency file: Rocket.Chat/package.json</p> <p>Path to vulnerable library: Rocket.Chat/node_modules/url-regex/package.json</p> <p> Dependency Hierarchy: - node-sprite-generator-0.10.2.tgz (Root Library) - jimp-0.2.21.tgz - :x: **url-regex-3.2.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/Rocket.Chat/commit/60c2c8d370f1dbf301090daf20046b9ecd2435f4">60c2c8d370f1dbf301090daf20046b9ecd2435f4</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> all versions of url-regex are vulnerable to Regular Expression Denial of Service. An attacker providing a very long string in String.test can cause a Denial of Service. <p>Publish Date: 2020-06-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7661>CVE-2020-7661</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7661 (High) detected in url-regex-5.0.0.tgz, url-regex-3.2.0.tgz - ## CVE-2020-7661 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>url-regex-5.0.0.tgz</b>, <b>url-regex-3.2.0.tgz</b></p></summary> <p> <details><summary><b>url-regex-5.0.0.tgz</b></p></summary> <p>Regular expression for matching URLs</p> <p>Library home page: <a href="https://registry.npmjs.org/url-regex/-/url-regex-5.0.0.tgz">https://registry.npmjs.org/url-regex/-/url-regex-5.0.0.tgz</a></p> <p>Path to dependency file: Rocket.Chat/package.json</p> <p>Path to vulnerable library: Rocket.Chat/node_modules/postcss-values-parser/node_modules/url-regex/package.json</p> <p> Dependency Hierarchy: - postcss-custom-properties-9.1.1.tgz (Root Library) - postcss-values-parser-3.2.1.tgz - :x: **url-regex-5.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>url-regex-3.2.0.tgz</b></p></summary> <p>Regular expression for matching URLs</p> <p>Library home page: <a href="https://registry.npmjs.org/url-regex/-/url-regex-3.2.0.tgz">https://registry.npmjs.org/url-regex/-/url-regex-3.2.0.tgz</a></p> <p>Path to dependency file: Rocket.Chat/package.json</p> <p>Path to vulnerable library: Rocket.Chat/node_modules/url-regex/package.json</p> <p> Dependency Hierarchy: - node-sprite-generator-0.10.2.tgz (Root Library) - jimp-0.2.21.tgz - :x: **url-regex-3.2.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/Rocket.Chat/commit/60c2c8d370f1dbf301090daf20046b9ecd2435f4">60c2c8d370f1dbf301090daf20046b9ecd2435f4</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> all versions of url-regex are vulnerable to Regular Expression Denial of Service. An attacker providing a very long string in String.test can cause a Denial of Service. <p>Publish Date: 2020-06-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7661>CVE-2020-7661</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in url regex tgz url regex tgz cve high severity vulnerability vulnerable libraries url regex tgz url regex tgz url regex tgz regular expression for matching urls library home page a href path to dependency file rocket chat package json path to vulnerable library rocket chat node modules postcss values parser node modules url regex package json dependency hierarchy postcss custom properties tgz root library postcss values parser tgz x url regex tgz vulnerable library url regex tgz regular expression for matching urls library home page a href path to dependency file rocket chat package json path to vulnerable library rocket chat node modules url regex package json dependency hierarchy node sprite generator tgz root library jimp tgz x url regex tgz vulnerable library found in head commit a href found in base branch develop vulnerability details all versions of url regex are vulnerable to regular expression denial of service an attacker providing a very long string in string test can cause a denial of service publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href step up your open source security game with whitesource
0
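The advisory in the record above describes a classic ReDoS: a regex with nested quantifiers backtracks exponentially on a long non-matching input. Below is a minimal Python sketch of that failure class; the pattern is a deliberately pathological stand-in, not url-regex's actual expression.

```python
import re
import time

# Deliberately pathological pattern (nested quantifiers) -- a stand-in
# to illustrate catastrophic backtracking, NOT the url-regex pattern.
evil = re.compile(r"^(a+)+$")

# A long input that almost matches forces the engine to try
# exponentially many ways to split the 'a's before failing.
payload = "a" * 28 + "!"

start = time.time()
assert evil.match(payload) is None
print(f"rejected after {time.time() - start:.1f}s")  # seconds, not microseconds
```

The usual remedy in such libraries is to bound repetition or restructure the pattern so matching time stays roughly linear in the input length.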
652,330
21,528,497,419
IssuesEvent
2022-04-28 21:06:21
kubernetes/minikube
https://api.github.com/repos/kubernetes/minikube
closed
GUI: Add cluster status to tray menu
priority/important-soon kind/gui
Cluster status should be added to the tray context menu so the user knows the status of the cluster without having to open the GUI.
1.0
GUI: Add cluster status to tray menu - Cluster status should be added to the tray context menu so the user knows the status of the cluster without having to open the GUI.
non_process
gui add cluster status to tray menu cluster status should be added to the tray context menu so the user knows the status of the cluster without having to open the gui
0
83,844
16,376,858,628
IssuesEvent
2021-05-16 09:34:33
qutip/qutip
https://api.github.com/repos/qutip/qutip
closed
ffmpeg command from User Guide gives an error
code good first issue unitaryhack
When I run `ffmpeg -r 20 -b 1800 -i bloch_%01d.png bloch.mp4` from the User Guide's [Generating Images for Animation ](http://qutip.org/docs/4.1/guide/guide-bloch.html#generating-images-for-animation) section I get the following error: ``` Option b (video bitrate (please use -b:v)) cannot be applied to input url %04d.png -- you are trying to apply an input option to an output file or vice versa. Move this option before the file it belongs to. Error parsing options for input file bloch_%01d.png. Error opening input files: Invalid argument ``` What works for me instead is this: `ffmpeg -r 20 -i bloch_%01d.png -pix_fmt yuv420p bloch.mp4` Note: personally I print files as zero-padded 4-digit numbers, so I have %04d.png instead of bloch_%01d.png.
1.0
ffmpeg command from User Guide gives an error - When I run `ffmpeg -r 20 -b 1800 -i bloch_%01d.png bloch.mp4` from the User Guide's [Generating Images for Animation ](http://qutip.org/docs/4.1/guide/guide-bloch.html#generating-images-for-animation) section I get the following error: ``` Option b (video bitrate (please use -b:v)) cannot be applied to input url %04d.png -- you are trying to apply an input option to an output file or vice versa. Move this option before the file it belongs to. Error parsing options for input file bloch_%01d.png. Error opening input files: Invalid argument ``` What works for me instead is this: `ffmpeg -r 20 -i bloch_%01d.png -pix_fmt yuv420p bloch.mp4` Note: personally I print files as zero-padded 4-digit numbers, so I have %04d.png instead of bloch_%01d.png.
non_process
ffmpeg command from user guide gives an error when i run ffmpeg r b i bloch png bloch from the user guide s generating images for animation section i get the following error option b video bitrate please use b v cannot be applied to input url png you are trying to apply an input option to an output file or vice versa move this option before the file it belongs to error parsing options for input file bloch png error opening input files invalid argument what works for me instead is this ffmpeg r i bloch png pix fmt bloch note personally i print files as zero padded digit numbers so i have png instead of bloch png
0
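For scripting the render step, here is a small Python wrapper around the corrected command from the record above. It assumes `ffmpeg` is on `PATH` and the `bloch_*.png` frames exist in the working directory.

```python
import subprocess

# Corrected invocation from the issue: input options (-r, -i) precede
# the input file, and -pix_fmt yuv420p is given as an output option.
cmd = [
    "ffmpeg",
    "-r", "20",              # input frame rate
    "-i", "bloch_%01d.png",  # numbered frames: bloch_0.png, bloch_1.png, ...
    "-pix_fmt", "yuv420p",   # pixel format most players accept
    "bloch.mp4",
]
subprocess.run(cmd, check=True)  # raises CalledProcessError on failure
```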
13,088
15,436,475,898
IssuesEvent
2021-03-07 13:11:55
ismail-yilmaz/upp-components
https://api.github.com/repos/ismail-yilmaz/upp-components
closed
PtyProcess: Winpty should be made into a U++ plugin and statically linked.
PtyProcess Terminal enhancement
I have already managed to compile and **statically** link winpty library with stock Upp and the bundled clang compiler with very little wrestling. Not to mention that winpty has MIT license. This means we can simply provide the library with the PtyProcess package, and preferably make it the default backend, for easy maintenance and development. This shouldn't take more than a week or so.
1.0
PtyProcess: Winpty should be made into a U++ plugin and statically linked. - I have already managed to compile and **statically** link winpty library with stock Upp and the bundled clang compiler with very little wrestling. Not to mention that winpty has MIT license. This means we can simply provide the library with the PtyProcess package, and preferably make it the default backend, for easy maintenance and development. This shouldn't take more than a week or so.
process
ptyprocess winpty should be made into a u plugin and statically linked i have already managed to compile and statically link winpty library with stock upp and the bundled clang compiler with very little wrestling not to mention that winpty has mit license this means we can simply provide the library with the ptyprocess package and preferably make it the default backend for easy maintenance and development this shouldn t take more than a week or so
1
201,364
22,948,589,997
IssuesEvent
2022-07-19 04:22:06
snowdensb/wildfly
https://api.github.com/repos/snowdensb/wildfly
opened
CVE-2015-2575 (Medium) detected in mysql-connector-java-5.1.15.jar
security vulnerability
## CVE-2015-2575 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mysql-connector-java-5.1.15.jar</b></p></summary> <p>MySQL JDBC Type 4 driver</p> <p>Library home page: <a href="http://dev.mysql.com/doc/connector-j/en/">http://dev.mysql.com/doc/connector-j/en/</a></p> <p>Path to vulnerable library: /testsuite/integration/smoke/src/test/resources/mysql-connector-java-5.1.15.jar</p> <p> Dependency Hierarchy: - :x: **mysql-connector-java-5.1.15.jar** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Unspecified vulnerability in the MySQL Connectors component in Oracle MySQL 5.1.34 and earlier allows remote authenticated users to affect confidentiality and integrity via unknown vectors related to Connector/J. <p>Publish Date: 2015-04-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-2575>CVE-2015-2575</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-gc43-g62c-99g2">https://github.com/advisories/GHSA-gc43-g62c-99g2</a></p> <p>Release Date: 2015-04-16</p> <p>Fix Resolution: 5.1.35</p> </p> </details> <p></p>
True
CVE-2015-2575 (Medium) detected in mysql-connector-java-5.1.15.jar - ## CVE-2015-2575 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mysql-connector-java-5.1.15.jar</b></p></summary> <p>MySQL JDBC Type 4 driver</p> <p>Library home page: <a href="http://dev.mysql.com/doc/connector-j/en/">http://dev.mysql.com/doc/connector-j/en/</a></p> <p>Path to vulnerable library: /testsuite/integration/smoke/src/test/resources/mysql-connector-java-5.1.15.jar</p> <p> Dependency Hierarchy: - :x: **mysql-connector-java-5.1.15.jar** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Unspecified vulnerability in the MySQL Connectors component in Oracle MySQL 5.1.34 and earlier allows remote authenticated users to affect confidentiality and integrity via unknown vectors related to Connector/J. <p>Publish Date: 2015-04-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-2575>CVE-2015-2575</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-gc43-g62c-99g2">https://github.com/advisories/GHSA-gc43-g62c-99g2</a></p> <p>Release Date: 2015-04-16</p> <p>Fix Resolution: 5.1.35</p> </p> </details> <p></p>
non_process
cve medium detected in mysql connector java jar cve medium severity vulnerability vulnerable library mysql connector java jar mysql jdbc type driver library home page a href path to vulnerable library testsuite integration smoke src test resources mysql connector java jar dependency hierarchy x mysql connector java jar vulnerable library found in base branch main vulnerability details unspecified vulnerability in the mysql connectors component in oracle mysql and earlier allows remote authenticated users to affect confidentiality and integrity via unknown vectors related to connector j publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
15,322
19,433,139,004
IssuesEvent
2021-12-21 14:15:59
threefoldtech/tfchain
https://api.github.com/repos/threefoldtech/tfchain
closed
Staking: increase GRANDPA delay
process_wontfix
The current GRANDPA delay is only 2 blocks. This is workable for AURA, but since staking uses BABE, which might incur bigger forks, we will need to increase this delay. Failure to do so might result in unrecoverable forks (since a chain with finalized block B1, and a different chain with finalized block B2, which are both direct children of some common ancestor A, will never be compatible, as a finalized block can never be reversed).
1.0
Staking: increase GRANDPA delay - The current GRANDPA delay is only 2 blocks. This is workable for AURA, but since staking uses BABE, which might incur bigger forks, we will need to increase this delay. Failure to do so might result in unrecoverable forks (since a chain with finalized block B1, and a different chain with finalized block B2, which are both direct children of some common ancestor A, will never be compatible, as a finalized block can never be reversed).
process
staking increase grandpa delay the current grandpa delay is only blocks this is workable for aura but since staking uses babe which might incur bigger forks we will need to increase this delay failure to do so might result in unrecoverable forks since a chain with finalized block and a different chain with finalized block which are both direct children of some common ancestor a will never be compatible as a finalized block can never be reversed
1
258,360
27,563,923,197
IssuesEvent
2023-03-08 01:16:16
Abhi347/vid-to-speech-api-json
https://api.github.com/repos/Abhi347/vid-to-speech-api-json
opened
CVE-2017-20165 (High) detected in debug-2.6.8.tgz
Mend: dependency security vulnerability
## CVE-2017-20165 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-2.6.8.tgz</b></p></summary> <p>small debugging utility</p> <p>Library home page: <a href="https://registry.npmjs.org/debug/-/debug-2.6.8.tgz">https://registry.npmjs.org/debug/-/debug-2.6.8.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/grpc/node_modules/debug/package.json</p> <p> Dependency Hierarchy: - speech-1.1.0.tgz (Root Library) - google-gax-0.14.5.tgz - grpc-1.7.3.tgz - node-pre-gyp-0.6.39.tgz - tar-pack-3.4.1.tgz - :x: **debug-2.6.8.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Abhi347/vid-to-speech-api-json/commit/f982a2ac7e2b2fffce4b4bc02af9d8eebfaf953b">f982a2ac7e2b2fffce4b4bc02af9d8eebfaf953b</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A vulnerability classified as problematic has been found in debug-js debug up to 3.0.x. This affects the function useColors of the file src/node.js. The manipulation of the argument str leads to inefficient regular expression complexity. Upgrading to version 3.1.0 is able to address this issue. The name of the patch is c38a0166c266a679c8de012d4eaccec3f944e685. It is recommended to upgrade the affected component. The identifier VDB-217665 was assigned to this vulnerability. <p>Publish Date: 2023-01-09 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-20165>CVE-2017-20165</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-9vvw-cc9w-f27h">https://github.com/advisories/GHSA-9vvw-cc9w-f27h</a></p> <p>Release Date: 2023-01-09</p> <p>Fix Resolution (debug): 2.6.9</p> <p>Direct dependency fix Resolution (@google-cloud/speech): 1.2.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2017-20165 (High) detected in debug-2.6.8.tgz - ## CVE-2017-20165 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-2.6.8.tgz</b></p></summary> <p>small debugging utility</p> <p>Library home page: <a href="https://registry.npmjs.org/debug/-/debug-2.6.8.tgz">https://registry.npmjs.org/debug/-/debug-2.6.8.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/grpc/node_modules/debug/package.json</p> <p> Dependency Hierarchy: - speech-1.1.0.tgz (Root Library) - google-gax-0.14.5.tgz - grpc-1.7.3.tgz - node-pre-gyp-0.6.39.tgz - tar-pack-3.4.1.tgz - :x: **debug-2.6.8.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Abhi347/vid-to-speech-api-json/commit/f982a2ac7e2b2fffce4b4bc02af9d8eebfaf953b">f982a2ac7e2b2fffce4b4bc02af9d8eebfaf953b</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A vulnerability classified as problematic has been found in debug-js debug up to 3.0.x. This affects the function useColors of the file src/node.js. The manipulation of the argument str leads to inefficient regular expression complexity. Upgrading to version 3.1.0 is able to address this issue. The name of the patch is c38a0166c266a679c8de012d4eaccec3f944e685. It is recommended to upgrade the affected component. The identifier VDB-217665 was assigned to this vulnerability. <p>Publish Date: 2023-01-09 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-20165>CVE-2017-20165</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-9vvw-cc9w-f27h">https://github.com/advisories/GHSA-9vvw-cc9w-f27h</a></p> <p>Release Date: 2023-01-09</p> <p>Fix Resolution (debug): 2.6.9</p> <p>Direct dependency fix Resolution (@google-cloud/speech): 1.2.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in debug tgz cve high severity vulnerability vulnerable library debug tgz small debugging utility library home page a href path to dependency file package json path to vulnerable library node modules grpc node modules debug package json dependency hierarchy speech tgz root library google gax tgz grpc tgz node pre gyp tgz tar pack tgz x debug tgz vulnerable library found in head commit a href vulnerability details a vulnerability classified as problematic has been found in debug js debug up to x this affects the function usecolors of the file src node js the manipulation of the argument str leads to inefficient regular expression complexity upgrading to version is able to address this issue the name of the patch is it is recommended to upgrade the affected component the identifier vdb was assigned to this vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution debug direct dependency fix resolution google cloud speech step up your open source security game with mend
0
323,627
27,741,238,394
IssuesEvent
2023-03-15 14:25:10
ntop/ntopng
https://api.github.com/repos/ntop/ntopng
closed
modification of max char length in Client Server column - IPv6 address length cut and replaced with … - for better search possibility
Ready to Test
We suggest changing the Client or Server column width to accommodate the length of IPv6 addresses, so they don't get cut and are searchable by the browser search function. ![image](https://user-images.githubusercontent.com/25390140/223690566-0debeb17-b3ff-41dc-a75a-c70fae9c868c.png)
1.0
modification of max char length in Client Server column - IPv6 address length cut and replaced with … - for better search possibility - We suggest changing the Client or Server column width to accommodate the length of IPv6 addresses, so they don't get cut and are searchable by the browser search function. ![image](https://user-images.githubusercontent.com/25390140/223690566-0debeb17-b3ff-41dc-a75a-c70fae9c868c.png)
non_process
modification of max char length in client server column address length cut and replaced with … for better search possibility we suggest changing the client or server column width to accommodate the length of addresses so they don t get cut and are searchable by the browser search function
0
18,966
24,931,236,007
IssuesEvent
2022-10-31 11:48:45
deepset-ai/haystack
https://api.github.com/repos/deepset-ai/haystack
closed
`PDFToTextOCRConverter.convert()` writes temp files into the package folder
type:bug topic:preprocessing journey:first steps topic:indexing
**Describe the bug** - `PDFToTextOCRConverter.convert()` writes temp files into the package folder: https://github.com/deepset-ai/haystack/blob/5ca96357ff526bf11aefa7fe4aa24fd11135cd0c/haystack/nodes/file_converter/pdf.py#L255 - This breaks the node in some contexts, e.g. conda on Windows, non-editable install. - See https://discord.com/channels/993534733298450452/1034328904829247568/1034740296128409670 **Error message** ``` ERROR:haystack.nodes.file_converter.pdf:File Machine_Learning.pdf has an error: [Errno 13] Permission denied: 'C:\Users\bhati\anaconda3\envs\env_3\Lib\site-packages\haystack\nodes\file_converter\tmp4os8nw11.jpeg' ``` **Expected behavior** - The file should be generated into a writable temp directory. `tempfile` does that by default. - Forcing the path to be in the package brings no benefit **Additional context** n/a **To Reproduce** - Install Haystack in non-editable mode with conda on Windows. - Try to convert a file with PDFToTextOCRConverter **FAQ Check** - [X] Have you had a look at [our new FAQ page](https://haystack.deepset.ai/overview/faq)?
1.0
`PDFToTextOCRConverter.convert()` writes temp files into the package folder - **Describe the bug** - `PDFToTextOCRConverter.convert()` writes temp files into the package folder: https://github.com/deepset-ai/haystack/blob/5ca96357ff526bf11aefa7fe4aa24fd11135cd0c/haystack/nodes/file_converter/pdf.py#L255 - This breaks the node in some contexts, e.g. conda on Windows, non-editable install. - See https://discord.com/channels/993534733298450452/1034328904829247568/1034740296128409670 **Error message** ``` ERROR:haystack.nodes.file_converter.pdf:File Machine_Learning.pdf has an error: [Errno 13] Permission denied: 'C:\Users\bhati\anaconda3\envs\env_3\Lib\site-packages\haystack\nodes\file_converter\tmp4os8nw11.jpeg' ``` **Expected behavior** - The file should be generated into a writable temp directory. `tempfile` does that by default. - Forcing the path to be in the package brings no benefit **Additional context** n/a **To Reproduce** - Install Haystack in non-editable mode with conda on Windows. - Try to convert a file with PDFToTextOCRConverter **FAQ Check** - [X] Have you had a look at [our new FAQ page](https://haystack.deepset.ai/overview/faq)?
process
pdftotextocrconverter convert writes temp files into the package folder describe the bug pdftotextocrconverter convert writes temp files into the package folder this breaks the node in some contexts e g conda on windows non editable install see error message error haystack nodes file converter pdf file machine learning pdf has an error permission denied c users bhati envs env lib site packages haystack nodes file converter jpeg expected behavior the file should be generated into a writable temp directory tempfile does that by default forcing the path to be in the package brings no benefit additional context n a to reproduce install haystack in non editable mode with conda on windows try to convert a file with pdftotextocrconverter faq check have you had a look at
1
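A minimal sketch of the behaviour this record asks for, assuming the converter only needs a scratch file it later deletes; this is not the actual Haystack patch, just the `tempfile` default it relies on.

```python
import tempfile

# tempfile picks a writable system temp directory (e.g. /tmp or %TEMP%),
# so nothing is written next to the installed package.
with tempfile.NamedTemporaryFile(suffix=".jpeg", delete=False) as tmp:
    temp_path = tmp.name  # e.g. /tmp/tmpabc123.jpeg on Linux

print("intermediate OCR image goes to", temp_path)
```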
3,715
6,732,609,371
IssuesEvent
2017-10-18 12:12:12
lockedata/rcms
https://api.github.com/repos/lockedata/rcms
opened
Manage attendees
conference team odoo processes
## Detailed task - Monitor sales - Modify a registration e.g. issue a refund - Send an email to attendees ## Assessing the task Try to perform the task. Use google and the system documentation to help - part of what we're trying to assess how easy it is for people to work out how to do tasks. Use a 👍 (`:+1:`) reaction to this task if you were able to perform the task. Use a 👎 (`:-1:`) reaction to the task if you could not complete it. Add a reply with any comments or feedback. ## Extra Info - Site: [odoo](//http://188.166.159.192:8069) - System documentation: [odoo docs](https://www.odoo.com/page/docs) - Role: Conference team - Area: Processes
1.0
Manage attendees - ## Detailed task - Monitor sales - Modify a registration e.g. issue a refund - Send an email to attendees ## Assessing the task Try to perform the task. Use google and the system documentation to help - part of what we're trying to assess how easy it is for people to work out how to do tasks. Use a 👍 (`:+1:`) reaction to this task if you were able to perform the task. Use a 👎 (`:-1:`) reaction to the task if you could not complete it. Add a reply with any comments or feedback. ## Extra Info - Site: [odoo](//http://188.166.159.192:8069) - System documentation: [odoo docs](https://www.odoo.com/page/docs) - Role: Conference team - Area: Processes
process
manage attendees detailed task monitor sales modify a registration e g issue a refund send an email to attendees assessing the task try to perform the task use google and the system documentation to help part of what we re trying to assess how easy it is for people to work out how to do tasks use a 👍 reaction to this task if you were able to perform the task use a 👎 reaction to the task if you could not complete it add a reply with any comments or feedback extra info site system documentation role conference team area processes
1
128,283
17,470,026,949
IssuesEvent
2021-08-07 01:08:37
RecordReplay/devtools
https://api.github.com/repos/RecordReplay/devtools
closed
July Onboarding Refresh
design-complete
Here's a link to the Figma file: https://www.figma.com/file/SSr1ljyzF0lCXTL7BX1Whh/Replay-Design-Doc?node-id=5210%3A12083 **Here are the steps people will walk through, in order:** 1. Big bold welcome screen 2. Download links 3. Nicer Google signin 4. Create your first Replay screen 5. Glitch demo with circles? 6. Console.logs in dev From a priority standpoint, let's do it like this: - [x] [New onboarding demo with great console logs](https://github.com/RecordReplay/devtools/issues/3109) - [x] [New welcome and download screens (for people just invited)](https://github.com/RecordReplay/devtools/issues/3046) - [x] [Nicer Google login](https://github.com/RecordReplay/devtools/issues/3111)
1.0
July Onboarding Refresh - Here's a link to the Figma file: https://www.figma.com/file/SSr1ljyzF0lCXTL7BX1Whh/Replay-Design-Doc?node-id=5210%3A12083 **Here are the steps people will walk through, in order:** 1. Big bold welcome screen 2. Download links 3. Nicer Google signin 4. Create your first Replay screen 5. Glitch demo with circles? 6. Console.logs in dev From a priority standpoint, let's do it like this: - [x] [New onboarding demo with great console logs](https://github.com/RecordReplay/devtools/issues/3109) - [x] [New welcome and download screens (for people just invited)](https://github.com/RecordReplay/devtools/issues/3046) - [x] [Nicer Google login](https://github.com/RecordReplay/devtools/issues/3111)
non_process
july onboarding refresh here s a link to the figma file here are the steps people will walk through in order big bold welcome screen download links nicer google signin create your first replay screen glitch demo with circles console logs in dev from a priority standpoint let s do it like this
0
428,997
30,019,840,309
IssuesEvent
2023-06-26 22:01:54
VEMULA-MOUNITHA/ProjectBoard0501
https://api.github.com/repos/VEMULA-MOUNITHA/ProjectBoard0501
opened
Final Document:
documentation
Need to complete all the project-related work and verify that the website is working properly.
1.0
Final Document: - Need to complete all the project-related work and verify that the website is working properly.
non_process
final document need to complete all the project related work and verify that the website is working properly
0
499,999
14,484,054,062
IssuesEvent
2020-12-10 15:52:47
ccmbioinfo/ST2020
https://api.github.com/repos/ccmbioinfo/ST2020
closed
Epic: /api/groups
backend priority:high
Create a new file `groups.py` for these endpoints - [x] **GET /api/groups** (list_groups) Requires authentication. Returns a list of all group codes and display names, e.g. `[{ "group_code": "ACH", "group_name": "Alberta" }]` - [x] **GET /api/groups/:group_code** (get_group) Returns `{ "group_code": "", "group_name": "", "users": [] }` where each field is filled out accordingly. The `users` array should be an array of usernames of the users in the group and can be obtained from the `users` relation on the Group model as a backref from User. Admins can use this endpoint for all groups. Regulars can only use this for groups they belong to or they get a 403 Forbidden. - [x] **POST /api/groups** (create_group) Admin-only. Creates a new group with the specified codename, display name, and users (same format as the get_group endpoint return result, but the users array is optional). This also creates a corresponding MinIO group (via madmin.MinioAdmin) and bucket (via MinioClient). The group should have an access policy that gives it complete access to its bucket. The corresponding MinIO users should be added to this group. Fail with 422 if the bucket already exists (do not create a group in the database in this case) or the code/name are already in use. Fail with 404 if a user to be added doesn't exist. The return result should be of the same form as `get_group` and the URL to the created resource should be in the Location header. - [x] **PATCH /api/groups** (update_groups) Admin-only. Updates requested fields accordingly. The group code cannot be changed but the display name can be. Users should be added and removed accordingly and the error codes are the same as create_group. - [x] **DELETE /api/groups/:group_code** (delete_group) Admin-only. Deletes the group and the MinIO group if they are empty but retains the MinIO bucket, otherwise fail with 422.
1.0
Epic: /api/groups - Create a new file `groups.py` for these endpoints - [x] **GET /api/groups** (list_groups) Requires authentication. Returns a list of all group codes and display names, e.g. `[{ "group_code": "ACH", "group_name": "Alberta" }]` - [x] **GET /api/groups/:group_code** (get_group) Returns `{ "group_code": "", "group_name": "", "users": [] }` where each field is filled out accordingly. The `users` array should be an array of usernames of the users in the group and can be obtained from the `users` relation on the Group model as a backref from User. Admins can use this endpoint for all groups. Regulars can only use this for groups they belong to or they get a 403 Forbidden. - [x] **POST /api/groups** (create_group) Admin-only. Creates a new group with the specified codename, display name, and users (same format as the get_group endpoint return result, but the users array is optional). This also creates a corresponding MinIO group (via madmin.MinioAdmin) and bucket (via MinioClient). The group should have an access policy that gives it complete access to its bucket. The corresponding MinIO users should be added to this group. Fail with 422 if the bucket already exists (do not create a group in the database in this case) or the code/name are already in use. Fail with 404 if a user to be added doesn't exist. The return result should be of the same form as `get_group` and the URL to the created resource should be in the Location header. - [x] **PATCH /api/groups** (update_groups) Admin-only. Updates requested fields accordingly. The group code cannot be changed but the display name can be. Users should be added and removed accordingly and the error codes are the same as create_group. - [x] **DELETE /api/groups/:group_code** (delete_group) Admin-only. Deletes the group and the MinIO group if they are empty but retains the MinIO bucket, otherwise fail with 422.
non_process
epic api groups create a new file groups py for these endpoints get api groups list groups requires authentication returns a list of all group codes and display names e g get api groups group code get group returns group code group name users where each field is filled out accordingly the users array should be an array of usernames of the users in the group and can be obtained from the users relation on the group model as a backref from user admins can use this endpoint for all groups regulars can only use this for groups they belong to or they get a forbidden post api groups create group admin only creates a new group with the specified codename display name and users same format as the get group endpoint return result but the users array is optional this also creates a corresponding minio group via madmin minioadmin and bucket via minioclient the group should have an access policy that gives it complete access to its bucket the corresponding minio users should be added to this group fail with if the bucket already exists do not create a group in the database in this case or the code name are already in use fail with if a user to be added doesn t exist the return result should be of the same form as get group and the url to the created resource should be in the location header patch api groups update groups admin only updates requested fields accordingly the group code cannot be changed but the display name can be users should be added and removed accordingly and the error codes are the same as create group delete api groups group code delete group admin only deletes the group and the minio group if they are empty but retains the minio bucket otherwise fail with
0
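A hypothetical Flask sketch of the first endpoint described in the epic above (`list_groups`); the in-memory `GROUPS` list and the absent auth check are illustrative placeholders, not the project's real models or decorators.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Placeholder data standing in for the Group model described in the epic.
GROUPS = [{"group_code": "ACH", "group_name": "Alberta"}]

@app.route("/api/groups", methods=["GET"])
def list_groups():
    # The real endpoint requires authentication; omitted in this sketch.
    return jsonify(GROUPS)

if __name__ == "__main__":
    app.run()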
19,396
25,539,287,900
IssuesEvent
2022-11-29 14:16:30
ESMValGroup/ESMValCore
https://api.github.com/repos/ESMValGroup/ESMValCore
closed
Preprocessor `multimodel_statistics` fails when data have no horizontal dimension
bug preprocessor
**Describe the bug** Hi all, I'm developing some tests for the multimodel statistics using real data (#856), and I'm coming across this bug: Running multimodel statistics with a list of cubes with **no horizontal dimension**, will result in `iris.exceptions.CoordinateNotFoundError: 'Expected to find exactly 1 depth coordinate, but found none.'` The bug can be reproduced by: ```python cubes = [cube[:, :, 0, 0] for cube in cubes] multi_model_statistics(cubes, span='full', statistics=['mean']) ``` <details> <summary>See the full stack trace below.</summary> ``` timeseries_cubes_month = [<iris 'Cube' of air_temperature / (K) (time: 14; air_pressure: 2; latitude: 3; longitude: 2)>, <iris 'Cube' of air_te... 3; longitude: 2)>, <iris 'Cube' of air_temperature / (K) (time: 14; air_pressure: 2; latitude: 3; longitude: 2)>, ...] @pytest.mark.functional # @pytest.mark.xfail('iris.exceptions.CoordinateNotFoundError') def test_multimodel_no_horizontal_dimension(timeseries_cubes_month): """Test statistic without horizontal dimension using monthly data.""" span = 'full' cubes = timeseries_cubes_month cubes = [cube[:, :, 0, 0] for cube in cubes] # Coordinate not found error # iris.exceptions.CoordinateNotFoundError: # 'Expected to find exactly 1 depth coordinate, but found none.' > multimodel_test(cubes, span=span, statistic='mean') tests/functional/test_multimodel.py:207: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/functional/test_multimodel.py:125: in multimodel_test output = multi_model_statistics(cubes, span=span, statistics=statistics) esmvalcore/preprocessor/_multimodel.py:439: in multi_model_statistics statistic_cube = _assemble_full_data(cubes, statistic) esmvalcore/preprocessor/_multimodel.py:355: in _assemble_full_data stats_cube = _put_in_cube(cubes[0], stats_dats, statistic, time_axis) esmvalcore/preprocessor/_multimodel.py:180: in _put_in_cube plev = template_cube.coord('depth') _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <iris 'Cube' of air_temperature / (K) (time: 14; air_pressure: 2)>, name_or_coord = 'depth', standard_name = None, long_name = None var_name = None, attributes = None, axis = None, contains_dimension = None, dimensions = None, coord_system = None, dim_coords = None def coord(self, name_or_coord=None, standard_name=None, long_name=None, var_name=None, attributes=None, axis=None, contains_dimension=None, dimensions=None, coord_system=None, dim_coords=None): """ Return a single coord given the same arguments as :meth:`Cube.coords`. .. note:: If the arguments given do not result in precisely 1 coordinate being matched, an :class:`iris.exceptions.CoordinateNotFoundError` is raised. .. seealso:: :meth:`Cube.coords()<iris.cube.Cube.coords>` for full keyword documentation. """ coords = self.coords(name_or_coord=name_or_coord, standard_name=standard_name, long_name=long_name, var_name=var_name, attributes=attributes, axis=axis, contains_dimension=contains_dimension, dimensions=dimensions, coord_system=coord_system, dim_coords=dim_coords) if len(coords) > 1: msg = 'Expected to find exactly 1 coordinate, but found %s. ' \ 'They were: %s.' % (len(coords), ', '.join(coord.name() for coord in coords)) raise iris.exceptions.CoordinateNotFoundError(msg) elif len(coords) == 0: _name = name_or_coord if name_or_coord is not None: if not isinstance(name_or_coord, six.string_types): _name = name_or_coord.name() bad_name = _name or standard_name or long_name or '' msg = 'Expected to find exactly 1 %s coordinate, but found ' \ 'none.' % bad_name > raise iris.exceptions.CoordinateNotFoundError(msg) E iris.exceptions.CoordinateNotFoundError: 'Expected to find exactly 1 depth coordinate, but found none.' ../../miniconda3/envs/esmvaltool/lib/python3.8/site-packages/iris/cube.py:1497: CoordinateNotFoundError ``` </details>
1.0
Preprocessor `multimodel_statistics` fails when data have no horizontal dimension - **Describe the bug** Hi all, I'm developing some tests for the multimodel statistics using real data (#856), and I'm coming across this bug: Running multimodel statistics with a list of cubes with **no horizontal dimension**, will result in `iris.exceptions.CoordinateNotFoundError: 'Expected to find exactly 1 depth coordinate, but found none.'` The bug can be reproduced by: ```python cubes = [cube[:, :, 0, 0] for cube in cubes] multi_model_statistics(cubes, span='full', statistics=['mean']) ``` <details> <summary>See the full stack trace below.</summary> ``` timeseries_cubes_month = [<iris 'Cube' of air_temperature / (K) (time: 14; air_pressure: 2; latitude: 3; longitude: 2)>, <iris 'Cube' of air_te... 3; longitude: 2)>, <iris 'Cube' of air_temperature / (K) (time: 14; air_pressure: 2; latitude: 3; longitude: 2)>, ...] @pytest.mark.functional # @pytest.mark.xfail('iris.exceptions.CoordinateNotFoundError') def test_multimodel_no_horizontal_dimension(timeseries_cubes_month): """Test statistic without horizontal dimension using monthly data.""" span = 'full' cubes = timeseries_cubes_month cubes = [cube[:, :, 0, 0] for cube in cubes] # Coordinate not found error # iris.exceptions.CoordinateNotFoundError: # 'Expected to find exactly 1 depth coordinate, but found none.' > multimodel_test(cubes, span=span, statistic='mean') tests/functional/test_multimodel.py:207: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/functional/test_multimodel.py:125: in multimodel_test output = multi_model_statistics(cubes, span=span, statistics=statistics) esmvalcore/preprocessor/_multimodel.py:439: in multi_model_statistics statistic_cube = _assemble_full_data(cubes, statistic) esmvalcore/preprocessor/_multimodel.py:355: in _assemble_full_data stats_cube = _put_in_cube(cubes[0], stats_dats, statistic, time_axis) esmvalcore/preprocessor/_multimodel.py:180: in _put_in_cube plev = template_cube.coord('depth') _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <iris 'Cube' of air_temperature / (K) (time: 14; air_pressure: 2)>, name_or_coord = 'depth', standard_name = None, long_name = None var_name = None, attributes = None, axis = None, contains_dimension = None, dimensions = None, coord_system = None, dim_coords = None def coord(self, name_or_coord=None, standard_name=None, long_name=None, var_name=None, attributes=None, axis=None, contains_dimension=None, dimensions=None, coord_system=None, dim_coords=None): """ Return a single coord given the same arguments as :meth:`Cube.coords`. .. note:: If the arguments given do not result in precisely 1 coordinate being matched, an :class:`iris.exceptions.CoordinateNotFoundError` is raised. .. seealso:: :meth:`Cube.coords()<iris.cube.Cube.coords>` for full keyword documentation. """ coords = self.coords(name_or_coord=name_or_coord, standard_name=standard_name, long_name=long_name, var_name=var_name, attributes=attributes, axis=axis, contains_dimension=contains_dimension, dimensions=dimensions, coord_system=coord_system, dim_coords=dim_coords) if len(coords) > 1: msg = 'Expected to find exactly 1 coordinate, but found %s. ' \ 'They were: %s.' % (len(coords), ', '.join(coord.name() for coord in coords)) raise iris.exceptions.CoordinateNotFoundError(msg) elif len(coords) == 0: _name = name_or_coord if name_or_coord is not None: if not isinstance(name_or_coord, six.string_types): _name = name_or_coord.name() bad_name = _name or standard_name or long_name or '' msg = 'Expected to find exactly 1 %s coordinate, but found ' \ 'none.' % bad_name > raise iris.exceptions.CoordinateNotFoundError(msg) E iris.exceptions.CoordinateNotFoundError: 'Expected to find exactly 1 depth coordinate, but found none.' ../../miniconda3/envs/esmvaltool/lib/python3.8/site-packages/iris/cube.py:1497: CoordinateNotFoundError ``` </details>
process
preprocessor multimodel statistics fails when data have no horizontal dimension describe the bug hi all i m developing some tests for the multimodel statistics using real data and i m coming across this bug running multimodel statistics with a list of cubes with no horizontal dimension will result in iris exceptions coordinatenotfounderror expected to find exactly depth coordinate but found none the bug can be reproduced by python cubes for cube in cubes multi model statistics cubes span full statistics see the full stack trace below timeseries cubes month pytest mark functional pytest mark xfail iris exceptions coordinatenotfounderror def test multimodel no horizontal dimension timeseries cubes month test statistic without horizontal dimension using monthly data span full cubes timeseries cubes month cubes for cube in cubes coordinate not found error iris exceptions coordinatenotfounderror expected to find exactly depth coordinate but found none multimodel test cubes span span statistic mean tests functional test multimodel py tests functional test multimodel py in multimodel test output multi model statistics cubes span span statistics statistics esmvalcore preprocessor multimodel py in multi model statistics statistic cube assemble full data cubes statistic esmvalcore preprocessor multimodel py in assemble full data stats cube put in cube cubes stats dats statistic time axis esmvalcore preprocessor multimodel py in put in cube plev template cube coord depth self name or coord depth standard name none long name none var name none attributes none axis none contains dimension none dimensions none coord system none dim coords none def coord self name or coord none standard name none long name none var name none attributes none axis none contains dimension none dimensions none coord system none dim coords none return a single coord given the same arguments as meth cube coords note if the arguments given do not result in precisely coordinate being matched an class iris exceptions coordinatenotfounderror is raised seealso meth cube coords for full keyword documentation coords self coords name or coord name or coord standard name standard name long name long name var name var name attributes attributes axis axis contains dimension contains dimension dimensions dimensions coord system coord system dim coords dim coords if len coords msg expected to find exactly coordinate but found s they were s len coords join coord name for coord in coords raise iris exceptions coordinatenotfounderror msg elif len coords name name or coord if name or coord is not none if not isinstance name or coord six string types name name or coord name bad name name or standard name or long name or msg expected to find exactly s coordinate but found none bad name raise iris exceptions coordinatenotfounderror msg e iris exceptions coordinatenotfounderror expected to find exactly depth coordinate but found none envs esmvaltool lib site packages iris cube py coordinatenotfounderror
1
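One possible guard for the failing lookup in the record above, using only the iris calls visible in the traceback: `cube.coords()` returns a (possibly empty) list, whereas `cube.coord()` raises `CoordinateNotFoundError` when nothing matches. This is a sketch under those assumptions, not the fix that eventually landed.

```python
def vertical_coord_or_none(cube, names=("air_pressure", "depth")):
    # cube.coords(name) returns [] instead of raising, so we can probe
    # for a vertical coordinate and fall back to None when there is none.
    for name in names:
        coords = cube.coords(name)
        if coords:
            return coords[0]
    return None  # data has no vertical coordinate; skip that axis
```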
20,479
27,138,053,950
IssuesEvent
2023-02-16 14:33:51
benthosdev/benthos
https://api.github.com/repos/benthosdev/benthos
closed
Mapping processors are missing or skipped in tracers
bug processors
When we replace mapping with bloblang we could see its corresponding entry in tracers. So I am guessing the issue is only with the mapping processor.
1.0
Mapping processors are missing or skipped in tracers - When we replace mapping with bloblang we could see its corresponding entry in tracers. So I am guessing the issue is only with the mapping processor.
process
mapping processors are missing or skipped in tracers when we replace mapping with bloblang we could see its corresponding entry in tracers so i am guessing the issue is only with the mapping processor
1
446,411
31,474,466,300
IssuesEvent
2023-08-30 09:43:56
PocketRelay/Website
https://api.github.com/repos/PocketRelay/Website
opened
Update docker config example
documentation enhancement
## Description Current docker example doesn't show how to easily include config variables. This should be updated to include volume binding examples so that it's easier to use (including Docker Compose).
1.0
Update docker config example - ## Description Current docker example doesn't show how to easily include config variables. This should be updated to include volume binding examples so that it's easier to use (including Docker Compose).
non_process
update docker config example description current docker example doesn t show how to easily include config variables this should be updated to include volume binding examples so that it s easier to use including docker compose
0
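For the volume-binding part of that request, here is a hedged sketch using the docker-py SDK; the image name and host path are placeholders, not Pocket Relay's actual values.

```python
import docker  # docker-py SDK: pip install docker

client = docker.from_env()

# Placeholder image and paths -- illustrative only. The volumes mapping
# binds a host directory into the container read-write, which is what a
# config-file example in the docs would demonstrate.
client.containers.run(
    "pocket-relay/server:latest",
    detach=True,
    volumes={"/srv/pocket-relay": {"bind": "/app/data", "mode": "rw"}},
)
```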
560,520
16,598,730,723
IssuesEvent
2021-06-01 16:21:40
wp-media/wp-rocket
https://api.github.com/repos/wp-media/wp-rocket
closed
Remove "Image Optimization" menu when WP_ROCKET_WHITE_LABEL_ACCOUNT is used
Module: dashboard community effort: [XS] good first issue priority: low type: enhancement
**Describe the solution you'd like** The customer wants to be able to disable Imagify ads in WP Rocket. We could add this to the WP_ROCKET_WHITE_LABEL_ACCOUNT constant. **Additional context** From productboard: https://wp-media.productboard.com/insights/shared-inbox/notes/9688386
1.0
Remove "Image Optimization" menu when WP_ROCKET_WHITE_LABEL_ACCOUNT is used - **Describe the solution you'd like** The customer wants to be able to disable Imagify ads in WP Rocket. We could add this to the WP_ROCKET_WHITE_LABEL_ACCOUNT constant. **Additional context** From productboard: https://wp-media.productboard.com/insights/shared-inbox/notes/9688386
non_process
remove image optimization menu when wp rocket white label account is used describe the solution you d like the customer wants to be able to disable imagify ads in wp rocket we could add this to the wp rocket white label account constant additional context from productboard
0
298,144
9,196,330,488
IssuesEvent
2019-03-07 06:39:38
wso2/product-is
https://api.github.com/repos/wso2/product-is
closed
[Doc] Update the CorrettoJDK 8 and AdoptOpenJDK 8 compatibility with WSO2 IS 5.7
5.7.0 Priority/Highest Resolution/Done Severity/Critical Type/Docs
**Background** Update the [Tested Operating Systems and JDKs](https://docs.wso2.com/display/compatibility/Tested+Operating+Systems+and+JDKs) to indicate the compatibility of CorrettoJDK 8 and AdoptOpenJDK 8 with WSO2 Identity Server 5.7.
1.0
[Doc] Update the CorrettoJDK 8 and AdoptOpenJDK 8 compatibility with WSO2 IS 5.7 - **Background** Update the [Tested Operating Systems and JDKs](https://docs.wso2.com/display/compatibility/Tested+Operating+Systems+and+JDKs) to indicate the compatibility of CorrettoJDK 8 and AdoptOpenJDK 8 with WSO2 Identity Server 5.7.
non_process
update the correttojdk and adoptopenjdk compatibility with is background update the to indicate the compatibility of correttojdk and adoptopenjdk with identity server
0
92,396
18,848,274,772
IssuesEvent
2021-11-11 17:20:19
sourcegraph/sourcegraph
https://api.github.com/repos/sourcegraph/sourcegraph
closed
executors: Write docs for experimental auto-indexer configuration
team/code-intelligence server-side auto-index-on-prem
We currently have no user-facing docs on how to deploy, configure, or use auto-indexers.
1.0
executors: Write docs for experimental auto-indexer configuration - We currently have no user-facing docs on how to deploy, configure, or use auto-indexers.
non_process
executors write docs for experimental auto indexer configuration we currently have no user facing docs on how to deploy configure or use auto indexers
0
74,133
24,962,771,900
IssuesEvent
2022-11-01 16:49:07
jOOQ/jOOQ
https://api.github.com/repos/jOOQ/jOOQ
closed
Wrong transformation for transformPatternsTrivialPredicates when DISTINCT predicate operand is NULL
T: Defect C: Functionality P: Medium E: Professional Edition E: Enterprise Edition
The `QOM.IsDistinctFrom` and `QOM.IsNotDistinctFrom` predicates extend `CompareCondition`, which is the only check done for `Settings.transformPatternsTrivialPredicates` to decide whether a condition (e.g. `a = null`) is reduced to a `nullCondition()`. This means the following wrong transformation is made: ```sql -- Input SELECT a IS DISTINCT FROM NULL, a IS NOT DISTINCT FROM NULL -- Output SELECT NULL, NULL ```
1.0
Wrong transformation for transformPatternsTrivialPredicates when DISTINCT predicate operand is NULL - The `QOM.IsDistinctFrom` and `QOM.IsNotDistinctFrom` predicates extend `CompareCondition`, which is the only check done for `Settings.transformPatternsTrivialPredicates` to decide whether a condition (e.g. `a = null`) is reduced to a `nullCondition()`. This means the following wrong transformation is made: ```sql -- Input SELECT a IS DISTINCT FROM NULL, a IS NOT DISTINCT FROM NULL -- Output SELECT NULL, NULL ```
non_process
wrong transformation for transformpatternstrivialpredicates when distinct predicate operand is null the qom isdistinctfrom and qom isnotdistinctfrom predicates extend comparecondition which is the only check done for settings transformpatternstrivialpredicates to decide whether a condition e g a null is reduced to a nullcondition this means the following wrong transformation is made sql input select a is distinct from null a is not distinct from null output select null null
0
420,549
28,289,766,526
IssuesEvent
2023-04-09 03:29:00
abluenautilus/SeasideModularVCV
https://api.github.com/repos/abluenautilus/SeasideModularVCV
closed
Some scales left out
documentation Proteus
Documentation mentioned a new scale was added but this did not show up in the latest release.
1.0
Some scales left out - Documentation mentioned a new scale was added but this did not show up in the latest release.
non_process
some scales left out documentation mentioned a new scale was added but this did not show up in the latest release
0
1,755
4,460,997,489
IssuesEvent
2016-08-24 02:37:38
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
System.Diagnostics.ProcessThread.StartAddress reports incorrect address on Linux
3 - Ready For Review System.Diagnostics.Process X-Plat
We get the value of `ProcessThread.StartAddress` by looking at `/proc/[pid]/task/[tid]/stat`, parsing out the `startstack` field from that file. However, this gives the address of the *stack*, not the ["...address of the function that the operating system called that started this thread."](https://msdn.microsoft.com/en-us/library/system.diagnostics.processthread.startaddress(v=vs.110).aspx) It doesn't look to me like the value we need here is actually available on Linux or OSX. From the test code, it looks like we already expect `null` on OSX. Maybe we should change the Linux implementation to return `null` as well, or change both to throw `PlatformNotSupportedException`.
1.0
System.Diagnostics.ProcessThread.StartAddress reports incorrect address on Linux - We get the value of `ProcessThread.StartAddress` by looking at `/proc/[pid]/task/[tid]/stat`, parsing out the `startstack` field from that file. However, this gives the address of the *stack*, not the ["...address of the function that the operating system called that started this thread."](https://msdn.microsoft.com/en-us/library/system.diagnostics.processthread.startaddress(v=vs.110).aspx) It doesn't look to me like the value we need here is actually available on Linux or OSX. From the test code, it looks like we already expect `null` on OSX. Maybe we should change the Linux implementation to return `null` as well, or change both to throw `PlatformNotSupportedException`.
process
system diagnostics processthread startaddress reports incorrect address on linux we get the value of processthread startaddress by looking at proc task stat parsing out the startstack field from that file however this gives the address of the stack not the it doesn t look to me like the value we need here is actually available on linux or osx from the test code it looks like we already expect null on osx maybe we should change the linux implementation to return null as well or change both to throw platformnotsupportedexception
1
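A Python sketch of the parse described in the record above, following the proc(5) field numbering; splitting after the last ')' sidesteps spaces inside the comm field. Linux-only, and illustrative rather than a drop-in for the corefx code.

```python
def read_startstack(pid: int, tid: int) -> int:
    # startstack is field 28 of /proc/[pid]/task/[tid]/stat (see proc(5)).
    with open(f"/proc/{pid}/task/{tid}/stat") as f:
        raw = f.read()
    # Everything after the final ')' starts at field 3 (state),
    # so field 28 sits at index 25 of the remaining whitespace split.
    fields = raw.rsplit(")", 1)[1].split()
    return int(fields[25])
```

As the record notes, this value is the address of the thread's stack, not the start function, which is why it cannot back the documented `ProcessThread.StartAddress` semantics on Linux.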
111,744
14,142,315,604
IssuesEvent
2020-11-10 13:54:58
patternfly/patternfly-org
https://api.github.com/repos/patternfly/patternfly-org
closed
UX writing style guide: Add CCS terms & conventions to "Terminology"
Content PF4 design Guidelines UX writing style guide
Link CCS terms and conventions in PatternFly's terminology page. Specify that it's a resource for Red Hat-specific terms. Link to CCS page: https://redhat-documentation.github.io/supplementary-style-guide/#introduction Link to PF terminology: https://www.patternfly.org/v4/ux-writing/terminology
1.0
UX writing style guide: Add CCS terms & conventions to "Terminology" - Link CCS terms and conventions in PatternFly's terminology page. Specify that it's a resource for Red Hat-specific terms. Link to CCS page: https://redhat-documentation.github.io/supplementary-style-guide/#introduction Link to PF terminology: https://www.patternfly.org/v4/ux-writing/terminology
non_process
ux writing style guide add ccs terms conventions to terminology link ccs terms and conventions in patternfly s terminology page specify that it s a resource for red hat specific terms link to ccs page link to pf terminology
0
347,868
31,281,486,058
IssuesEvent
2023-08-22 09:51:57
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
opened
[CI] RestEsqlIT testWarningHeadersOnFailedConversions failing
:Search/Search >test-failure
**Build scan:** https://gradle-enterprise.elastic.co/s/3iejlt6s5z2gq/tests/:x-pack:plugin:esql:qa:server:single-node:javaRestTest/org.elasticsearch.xpack.esql.qa.single_node.RestEsqlIT/testWarningHeadersOnFailedConversions **Reproduction line:** ``` ./gradlew ':x-pack:plugin:esql:qa:server:single-node:javaRestTest' --tests "org.elasticsearch.xpack.esql.qa.single_node.RestEsqlIT.testWarningHeadersOnFailedConversions" -Dtests.seed=849E06D41F81C775 -Dtests.locale=id-ID -Dtests.timezone=Pacific/Funafuti -Druntime.java=20 ``` **Applicable branches:** main **Reproduces locally?:** Didn't try **Failure history:** https://gradle-enterprise.elastic.co/scans/tests?tests.container=org.elasticsearch.xpack.esql.qa.single_node.RestEsqlIT&tests.test=testWarningHeadersOnFailedConversions **Failure excerpt:** ``` java.lang.AssertionError: Expected: is <21> but: was <0> at __randomizedtesting.SeedInfo.seed([849E06D41F81C775:98FDFE231D426DAD]:0) at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18) at org.junit.Assert.assertThat(Assert.java:956) at org.junit.Assert.assertThat(Assert.java:923) at org.elasticsearch.xpack.esql.qa.rest.RestEsqlTestCase.testWarningHeadersOnFailedConversions(RestEsqlTestCase.java:256) at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) at java.lang.reflect.Method.invoke(Method.java:578) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.lang.Thread.run(Thread.java:1623) ```
1.0
[CI] RestEsqlIT testWarningHeadersOnFailedConversions failing - **Build scan:** https://gradle-enterprise.elastic.co/s/3iejlt6s5z2gq/tests/:x-pack:plugin:esql:qa:server:single-node:javaRestTest/org.elasticsearch.xpack.esql.qa.single_node.RestEsqlIT/testWarningHeadersOnFailedConversions **Reproduction line:** ``` ./gradlew ':x-pack:plugin:esql:qa:server:single-node:javaRestTest' --tests "org.elasticsearch.xpack.esql.qa.single_node.RestEsqlIT.testWarningHeadersOnFailedConversions" -Dtests.seed=849E06D41F81C775 -Dtests.locale=id-ID -Dtests.timezone=Pacific/Funafuti -Druntime.java=20 ``` **Applicable branches:** main **Reproduces locally?:** Didn't try **Failure history:** https://gradle-enterprise.elastic.co/scans/tests?tests.container=org.elasticsearch.xpack.esql.qa.single_node.RestEsqlIT&tests.test=testWarningHeadersOnFailedConversions **Failure excerpt:** ``` java.lang.AssertionError: Expected: is <21> but: was <0> at __randomizedtesting.SeedInfo.seed([849E06D41F81C775:98FDFE231D426DAD]:0) at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18) at org.junit.Assert.assertThat(Assert.java:956) at org.junit.Assert.assertThat(Assert.java:923) at org.elasticsearch.xpack.esql.qa.rest.RestEsqlTestCase.testWarningHeadersOnFailedConversions(RestEsqlTestCase.java:256) at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) at java.lang.reflect.Method.invoke(Method.java:578) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.lang.Thread.run(Thread.java:1623) ```
non_process
restesqlit testwarningheadersonfailedconversions failing build scan reproduction line gradlew x pack plugin esql qa server single node javaresttest tests org elasticsearch xpack esql qa single node restesqlit testwarningheadersonfailedconversions dtests seed dtests locale id id dtests timezone pacific funafuti druntime java applicable branches main reproduces locally didn t try failure history failure excerpt java lang assertionerror expected is but was at randomizedtesting seedinfo seed at org hamcrest matcherassert assertthat matcherassert java at org junit assert assertthat assert java at org junit assert assertthat assert java at org elasticsearch xpack esql qa rest restesqltestcase testwarningheadersonfailedconversions restesqltestcase java at jdk internal reflect directmethodhandleaccessor invoke directmethodhandleaccessor java at java lang reflect method invoke method java at com carrotsearch randomizedtesting randomizedrunner invoke randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene tests util testrulesetupteardownchained evaluate testrulesetupteardownchained java at org apache lucene tests util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene tests util testrulethreadandtestname evaluate testrulethreadandtestname java at org apache lucene tests util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene tests util testrulemarkfailure evaluate testrulemarkfailure java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol forktimeoutingtask threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol evaluate threadleakcontrol java at com carrotsearch randomizedtesting randomizedrunner runsingletest randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at org apache lucene tests util abstractbeforeafterrule evaluate abstractbeforeafterrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene tests util testrulestoreclassname evaluate testrulestoreclassname java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene tests util testruleassertionsrequired evaluate testruleassertionsrequired java at org apache lucene tests util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene tests util testrulemarkfailure evaluate testrulemarkfailure java at org 
apache lucene tests util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene tests util testruleignoretestsuites evaluate testruleignoretestsuites java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol lambda forktimeoutingtask threadleakcontrol java at java lang thread run thread java
0
21,918
30,444,992,706
IssuesEvent
2023-07-15 14:48:44
bitfocus/companion-module-requests
https://api.github.com/repos/bitfocus/companion-module-requests
opened
Lightkey OSC with feedback
NOT YET PROCESSED
- [ ] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested** The name of the device, hardware, or software you would like to control: Lightkey What you would like to be able to make it do from Companion: Control the Lightkey software through OSC or MIDI, and provide feedback. I'm using Companion with Lightkey today using OSC, but I'm missing feedback options. Direct links or attachments to the ethernet control protocol or API: https://lightkeyapp.com
1.0
Lightkey OSC with feedback - - [ ] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested** The name of the device, hardware, or software you would like to control: Lightkey What you would like to be able to make it do from Companion: Control the Lightkey software through OSC or MIDI, and provide feedback. I'm using Companion with Lightkey today using OSC, but I'm missing feedback options. Direct links or attachments to the ethernet control protocol or API: https://lightkeyapp.com
process
lightkey osc with feedback i have researched the list of existing companion modules and requests and have determined this has not yet been requested the name of the device hardware or software you would like to control lightkey what you would like to be able to make it do from companion control the lightkey sw through osc or midi and provide feedback i m using companion with lightkey today using osc but i m missing feedback options direct links or attachments to the ethernet control protocol or api
1
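A hedged sketch of what OSC control with feedback could look like for the request above, using the third-party `python-osc` package; the ports and OSC addresses below are placeholders, not Lightkey's documented API:

```python
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Send a control message (address and port are hypothetical).
client = SimpleUDPClient("127.0.0.1", 21600)
client.send_message("/cue/1/fire", 1)

# Listen for feedback on a second, hypothetical port.
def on_feedback(address, *args):
    print(f"feedback {address}: {args}")

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_feedback)
BlockingOSCUDPServer(("0.0.0.0", 21601), dispatcher).serve_forever()
```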
6,461
9,546,580,744
IssuesEvent
2019-05-01 20:21:57
openopps/openopps-platform
https://api.github.com/repos/openopps/openopps-platform
closed
Department of State: Right Rail of Submission Confirmation Page
Apply Process Requirements Ready State Dept.
Who: Student What: Notification that the user now has a USAJOBS profile and what they can do about it. Why: As a student I want to be informed that I now have a USAJOBS profile and information about federal hiring. A/C - There will be a box in the right rail "You now have a USAJOBS profile" - There will be content under the header With a USAJOBS profile, you can apply to jobs, upload a resume and make it searchable, save job searches and sign up for email notifications. - There will be 4 drawers with a + in the left margin that will open the drawer to expose content 1) Find a job This will link the user to USAJOBS search 2) Learn how to make your resume searchable Making your resume searchable adds it to the USAJOBS Resume Mining collection. HR specialists and hiring managers from federal agencies use resume mining to look for people (with a searchable resume in their USAJOBS profile) to fill their job vacancies. Plus, when your resume is searchable and you have completed your profile, your profile is searchable too. https://www.usajobs.gov/Help/how-to/account/profile/searchable/ (will open in a new window) 3) Learn how to save a job search You can save a search to help you look for jobs in your area of interest. When you save a search, we’ll automatically look for jobs that match the keywords and other criteria in your search. Learn how to create a saved search and sign up for email notifications. https://www.usajobs.gov/Help/how-to/search/save/ (will open in a new window) 4) Learn more about working in the government (this is the same as USAJOBS) - When the drawer is opened it will have 4 links: (these links will open in a new window) - How is finding a job in the government different than the private sector? (https://www.usajobs.gov/Help/working-in-government/appointments/difference-from-private-sector/) (will open in a new window) - How long does it take to get a federal job? (https://www.usajobs.gov/Help/faq/application/how-long/) (will open in a new window) - What's the difference between eligibility and qualifications? (https://www.usajobs.gov/Help/faq/application/eligibility/difference-from-qualifications/) (will open in a new window) - How does the application work? (https://www.usajobs.gov/Help/faq/application/process/) (will open in a new window) Related Issue #2927 InVision Mock: https://opm.invisionapp.com/d/main#/console/15360465/344480823/preview Public Link: https://opm.invisionapp.com/share/ZEPNZR09Q54
1.0
Department of State: Right Rail of Submission Confirmation Page - Who: Student What: Notification that the user now has a USAJOBS profile and what they can do about it. Why: As a student I want to be informed that I now have a USAJOBS profile and information about federal hiring. A/C - There will be a box in the right rail "You now have a USAJOBS profile" - There will be content under the header With a USAJOBS profile, you can apply to jobs, upload a resume and make it searchable, save job searches and sign up for email notifications. - There will be 4 drawers with a + in the left margin that will open the drawer to expose content 1) Find a job This will link the user to USAJOBS search 2) Learn how to make your resume searchable Making your resume searchable adds it to the USAJOBS Resume Mining collection. HR specialists and hiring managers from federal agencies use resume mining to look for people (with a searchable resume in their USAJOBS profile) to fill their job vacancies. Plus, when your resume is searchable and you have completed your profile, your profile is searchable too. https://www.usajobs.gov/Help/how-to/account/profile/searchable/ (will open in a new window) 3) Learn how to save a job search You can save a search to help you look for jobs in your area of interest. When you save a search, we’ll automatically look for jobs that match the keywords and other criteria in your search. Learn how to create a saved search and sign up for email notifications. https://www.usajobs.gov/Help/how-to/search/save/ (will open in a new window) 4) Learn more about working in the government (this is the same as USAJOBS) - When the drawer is opened it will have 4 links: (these links will open in a new window) - How is finding a job in the government different than the private sector? (https://www.usajobs.gov/Help/working-in-government/appointments/difference-from-private-sector/) (will open in a new window) - How long does it take to get a federal job? (https://www.usajobs.gov/Help/faq/application/how-long/) (will open in a new window) - What's the difference between eligibility and qualifications? (https://www.usajobs.gov/Help/faq/application/eligibility/difference-from-qualifications/) (will open in a new window) - How does the application work? (https://www.usajobs.gov/Help/faq/application/process/) (will open in a new window) Related Issue #2927 InVision Mock: https://opm.invisionapp.com/d/main#/console/15360465/344480823/preview Public Link: https://opm.invisionapp.com/share/ZEPNZR09Q54
process
department of state right rail of submission confirmation page who student what notification that the user now has a usajobs profile and what they can do about it why as a student i want to be informed that i now have a usajobs profile and information about federal hiring a c there will be a box in the right rail you now have a usajobs profile there will be content under the header with a usajobs profile you can apply to jobs upload a resume and make it searchable save job searches and sign up for email notifications there will be drawers with a in the left margin that will open the drawer to expose content find a job this will link the user to usajobs search learn how to make your resume searchable making your resume searchable adds it to the usajobs resume mining collection hr specialists and hiring managers from federal agencies use resume mining to look for people with a searchable resume in their usajobs profile to fill their job vacancies plus when your resume is searchable and you have completed your profile your profile is searchable too will open in a new window learn how to save a job search you can save a search to help you look for jobs in your area of interest when you save a search we’ll automatically look for jobs that match the keywords and other criteria in your search learn how to create a saved search and sign up for email notifications will open in a new window learn more about working in the government this is the same as usajobs when the drawer is opened it will have links these links will open in a new window how is finding a job in the government different that the private sector will open in a new window how long does it take to get a federal job will open in a new window what s the difference between eligibility and qualifications will open in a new window how does the application work will open in a new window related issue invision mock public link public link
1
6,837
9,979,468,918
IssuesEvent
2019-07-09 23:00:31
osquery/foundation
https://api.github.com/repos/osquery/foundation
closed
Proposal: Require squash commits for osquery
process proposal
I propose we require squash commits for osquery. My interest comes from working in large monorepos, and allowing people to iterate with a lot of little commits. These days, GitHub has toggles for this. We currently allow rebase and squash. I propose we only allow squash. Please vote, thumbs up or thumbs down. Or discuss if there's something substantial.
1.0
Proposal: Require squash commits for osquery - I propose we require squash commits for osquery. My interest comes from working in large monorepos, and allowing people to iterate with a lot of little commits. These days, GitHub has toggles for this. We currently allow rebase and squash. I propose we only allow squash. Please vote, thumbs up or thumbs down. Or discuss if there's something substantial.
process
proposal require squash commits for osquery i propose we require squash commits for osquery my interest comes from working in large monorepos and allowing people to iterate with a lot of little commits these days github has a toggles we currently allow rebase and squash i propose we only allow squash please vote thumbs up or thumbs down or discuss if there s something substantial
1
15,638
19,822,613,467
IssuesEvent
2022-01-20 00:25:12
Jeffail/benthos
https://api.github.com/repos/Jeffail/benthos
closed
Include metadata with specific prefixes as headers to http_client output request
enhancement processors inputs outputs effort: lower
I would like to provide a list of prefixes and have all metadata with these prefixes included as headers in the http_client request.
1.0
Include metadata with specific prefixes as headers to http_client output request - I would like to provide a list of prefixes and have all metadata with these prefixes included as headers in the http_client request.
process
include metadata with specific prefixes as headers to http client output request i would like to provide a list of prefixes and all metadata with these prefixes to be included as headers in the http client request
1
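The behavior requested above is essentially a prefix filter over message metadata. A minimal Python sketch of that selection logic, assuming a plain dict of metadata (names are illustrative; this is not Benthos code):

```python
def headers_from_metadata(metadata: dict, prefixes: list) -> dict:
    """Keep only metadata whose key starts with one of the prefixes."""
    return {key: value for key, value in metadata.items()
            if any(key.startswith(prefix) for prefix in prefixes)}

# Example: only kafka_* metadata become HTTP headers.
headers_from_metadata(
    {"kafka_key": "k1", "kafka_topic": "orders", "internal_id": "x"},
    ["kafka_"],
)  # -> {"kafka_key": "k1", "kafka_topic": "orders"}
```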
398,723
11,742,255,693
IssuesEvent
2020-03-12 00:08:23
grpc/grpc
https://api.github.com/repos/grpc/grpc
closed
Skip SSL Verification?
area/security kind/question lang/go priority/P2
I'm re-implementing a Go app (in Rust) that uses grpc-go, which uses the tls.Config{InsecureSkipVerify: true} option in the Go TLS library. Example usage: https://github.com/grpc/grpc-go/blob/4abb3622b0fb97df222670b189869b44d0c2454d/credentials/credentials_test.go#L178 I'm using the Rust wrapper for the C implementation of gRPC and it's come down to the SslCredentialsOptions. Is there a way to get the same Skip-Verify behavior from the C library? I'm trying to connect to a gRPC server that is using TLS with a self-signed cert.
1.0
Skip SSL Verification? - I'm re-implementing a Go app (in Rust) that uses grpc-go, which uses the tls.Config{InsecureSkipVerify: true} option in the Go TLS library. Example usage: https://github.com/grpc/grpc-go/blob/4abb3622b0fb97df222670b189869b44d0c2454d/credentials/credentials_test.go#L178 I'm using the Rust wrapper for the C implementation of gRPC and it's come down to the SslCredentialsOptions. Is there a way to get the same Skip-Verify behavior from the C library? I'm trying to connect to a gRPC server that is using TLS with a self-signed cert.
non_process
skip ssl verification i m re implementing a go app in rust that uses grpc go which uses the tls config insecureskipverify true option in the go tls library example usage i m using the rust wrapper for the c implementation for grpc and its come down to the sslcredentialsoptions is there a way to get the same skip verify behavior from the c library i m trying to connect to a grpc server that is using tls with a self signed cert
0
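A commonly suggested alternative to skip-verify with the C core, shown here with its Python binding (the same core the Rust wrapper binds): trust the self-signed certificate explicitly as a root. The file path and target address are placeholders:

```python
import grpc

# Trust the server's self-signed certificate as a root certificate,
# instead of disabling verification (which the C core does not expose).
with open("server-selfsigned.pem", "rb") as cert_file:
    credentials = grpc.ssl_channel_credentials(
        root_certificates=cert_file.read())

channel = grpc.secure_channel("myhost:50051", credentials)
```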
20,978
27,831,412,902
IssuesEvent
2023-03-20 05:24:29
serai-dex/serai
https://api.github.com/repos/serai-dex/serai
opened
Consider using a coordinated model for signature shares
cryptography processor
Since FROST isn't robust, any signer failing is a fault. Accordingly, the likelihood of failure doesn't fall under a coordinated model. Under Seraphis, Monero's future transaction protocol, any multisig member will be able to independently produce membership proofs. This includes a membership proof of [1 ..= 127, actual], which is effectively null. We'd be unable to blame any individual multisig member for doing this *unless we have a coordinated model*, in which case only one signer can publish the transaction, so if the on-chain transaction is wonky, only one person is possibly at fault. There are also efficiency improvements from not needing to send signature shares from everyone to everyone, yet the designated coordinator needs to send the eventuality hash to everyone else, who then runs the eventuality protocol. This is already needed for the non-participating signers though, so it should work out? Possible relation to #163, which would void this idea if implemented with shares on chain, yet that may be appreciated in order to achieve a synchronous view of share participation which can be better modeled over.
1.0
Consider using a coordinated model for signature shares - Since FROST isn't robust, any signer failing is a fault. Accordingly, the likelihood of failure doesn't fall under a coordinated model. Under Seraphis, Monero's future transaction protocol, any multisig member will be able to independently produce membership proofs. This includes a membership proof of [1 ..= 127, actual], which is effectively null. We'd be unable to blame any individual multisig member for doing this *unless we have a coordinated model*, in which case only one signer can publish the transaction, so if the on-chain transaction is wonky, only one person is possibly at fault. There are also efficiency improvements from not needing to send signature shares from everyone to everyone, yet the designated coordinator needs to send the eventuality hash to everyone else, who then runs the eventuality protocol. This is already needed for the non-participating signers though, so it should work out? Possible relation to #163, which would void this idea if implemented with shares on chain, yet that may be appreciated in order to achieve a synchronous view of share participation which can be better modeled over.
process
consider using a coordinated model for signature shares since frost isn t robust any signer failing it a fault accordingly the likelihood of failure doesn t fall under a coordinated model under seraphis monero s future transaction protocol any multisig member will be able to independently produce membership proofs this include a membership proof of which is effectively null we d be unable to blame any individual multisig member for doing this unless we have a coordinated model in which case only one signer can publish the transaction so if the on chain transaction is wonky only one person is possibly at fault there s also efficiency improvements by not needing to send signature shares from everyone to everyone yet the designed coordinator needs to send the eventuality hash to everyone else who then runs the eventuality protocol this is already needed for the non participating signers though so it should work out possible relation to which void this idea if implemented with shares on chain yet that may be appreciated in order to achieve a synchronous view of share participation which can be better modeled over
1
20,533
27,190,744,727
IssuesEvent
2023-02-19 19:18:03
NationalSecurityAgency/ghidra
https://api.github.com/repos/NationalSecurityAgency/ghidra
closed
Bug: 65C02 TRB/TSB instructions incorrectly analysed
Type: Bug Feature: Processor/6502 Status: Internal
**Describe the bug** On the 65C02 processor core, the TRB and TSB instructions are not implemented correctly. The decompiler incorrectly suggests that they have no effect -- yet they should write the updated value (with bits cleared/set respectively) back to OP1. This is documented at http://www.6502.org/tutorials/65c02opcodes.html **To Reproduce** Steps to reproduce the behavior: 1. Load example code below and decompile it. 2. Note that the decompiler shows the function returning 0x20, which is correct, but the TRB instruction causes a load into uVar1 instead of the correct operation (`DAT_0C46 &= ~0x20`) **Expected behavior** TRB instruction results in bits being cleared in the target memory location. (`OP1 &= ~accumulator`) TSB instruction results in bits being set in the target memory location. (`OP1 |= accumulator`) **Screenshots** Code dump: ``` ********************************************************************************************************* * FUNCTION * ********************************************************************************************************* undefined FUN_eb52() undefined A:1 <RETURN> FUN_eb52 XREF[2]: FUN_886a:887a(c), FUN_e99b:ea0f(c) eb52 a9 80 LDA #0x80 eb54 8d af 1d STA DAT_1daf = ?? LAB_eb57 XREF[1]: eb5c(j) eb57 20 fe 8c JSR FUN_8cfe undefined FUN_8cfe() eb5a c9 80 CMP #0x80 eb5c d0 f9 BNE LAB_eb57 LAB_eb5e XREF[1]: FUN_eb75:eb7a(j) eb5e ad b0 1d LDA DAT_1db0 = ?? eb61 cd 00 80 CMP DAT_8000 = 77h eb64 d0 09 BNE LAB_eb6f eb66 ad b1 1d LDA DAT_1db1 = ?? eb69 cd 01 80 CMP DAT_8001 = C1h eb6c d0 01 BNE LAB_eb6f eb6e 60 RTS LAB_eb6f XREF[2]: eb64(j), eb6c(j) eb6f a9 20 LDA #0x20 eb71 1c 46 0c TRB DAT_0c46 = ?? eb74 60 RTS ``` Incorrect decompilation result: ``` char FUN_eb52(void) { undefined uVar1; char cVar2; cVar2 = -0x80; DAT_1daf = 0x80; do { cVar2 = FUN_8cfe(cVar2); } while (cVar2 != -0x80); cVar2 = DAT_1db0; if (cVar2 == DAT_8000) { cVar2 = DAT_1db1; if (cVar2 == DAT_8001) { return cVar2; } } uVar1 = DAT_0c46; return ' '; } ``` **Environment (please complete the following information):** - OS: Ubuntu 20.04.5 LTS - Java Version: 17.0.5 - Ghidra Version: Own build from git master - Ghidra Origin: Locally built from git master, 12th Dec 2022
1.0
Bug: 65C02 TRB/TSB instructions incorrectly analysed - **Describe the bug** On the 65C02 processor core, the TRB and TSB instructions are not implemented correctly. The decompiler incorrectly suggests that they have no effect -- yet they should write the updated value (with bits cleared/set respectively) back to OP1. This is documented at http://www.6502.org/tutorials/65c02opcodes.html **To Reproduce** Steps to reproduce the behavior: 1. Load example code below and decompile it. 2. Note that the decompiler shows the function returning 0x20, which is correct, but the TRB instruction causes a load into uVar1 instead of the correct operation (`DAT_0C46 &= ~0x20`) **Expected behavior** TRB instruction results in bits being cleared in the target memory location. (`OP1 &= ~accumulator`) TSB instruction results in bits being set in the target memory location. (`OP1 |= accumulator`) **Screenshots** Code dump: ``` ********************************************************************************************************* * FUNCTION * ********************************************************************************************************* undefined FUN_eb52() undefined A:1 <RETURN> FUN_eb52 XREF[2]: FUN_886a:887a(c), FUN_e99b:ea0f(c) eb52 a9 80 LDA #0x80 eb54 8d af 1d STA DAT_1daf = ?? LAB_eb57 XREF[1]: eb5c(j) eb57 20 fe 8c JSR FUN_8cfe undefined FUN_8cfe() eb5a c9 80 CMP #0x80 eb5c d0 f9 BNE LAB_eb57 LAB_eb5e XREF[1]: FUN_eb75:eb7a(j) eb5e ad b0 1d LDA DAT_1db0 = ?? eb61 cd 00 80 CMP DAT_8000 = 77h eb64 d0 09 BNE LAB_eb6f eb66 ad b1 1d LDA DAT_1db1 = ?? eb69 cd 01 80 CMP DAT_8001 = C1h eb6c d0 01 BNE LAB_eb6f eb6e 60 RTS LAB_eb6f XREF[2]: eb64(j), eb6c(j) eb6f a9 20 LDA #0x20 eb71 1c 46 0c TRB DAT_0c46 = ?? eb74 60 RTS ``` Incorrect decompilation result: ``` char FUN_eb52(void) { undefined uVar1; char cVar2; cVar2 = -0x80; DAT_1daf = 0x80; do { cVar2 = FUN_8cfe(cVar2); } while (cVar2 != -0x80); cVar2 = DAT_1db0; if (cVar2 == DAT_8000) { cVar2 = DAT_1db1; if (cVar2 == DAT_8001) { return cVar2; } } uVar1 = DAT_0c46; return ' '; } ``` **Environment (please complete the following information):** - OS: Ubuntu 20.04.5 LTS - Java Version: 17.0.5 - Ghidra Version: Own build from git master - Ghidra Origin: Locally built from git master, 12th Dec 2022
process
bug trb tsb instructions incorrectly analysed describe the bug on the processor core the trb and tsb instructions are not implemented correctly the decompiler incorrectly suggests that they have no effect yet they should write the updated value with bits cleared set respectively back to this is documented at to reproduce steps to reproduce the behavior load example code below and decompile it note that the decompiler shows the function returning which is correct but the trb instruction causes a load into instead of the correct operation dat expected behavior trb instruction results in bits being cleared in the target memory location accumulator tsb instruction results in bits being set in the target memory location accumulator screenshots code dump function undefined fun undefined a fun xref fun c fun c lda af sta dat lab xref j fe jsr fun undefined fun cmp bne lab lab xref fun j ad lda dat cd cmp dat bne lab ad lda dat cd cmp dat bne lab rts lab xref j j lda trb dat rts incorrect decompilation result char fun void undefined char dat do fun while dat if dat dat if dat return dat return environment please complete the following information os ubuntu lts java version ghidra version own build from git master ghidra origin locally built from git master dec
1
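The expected semantics stated in the bug report above can be transcribed directly. A small Python model of both instructions, with Z-flag handling per the 6502.org page the report links:

```python
def trb(mem: bytearray, a: int, op1: int) -> bool:
    """Test and Reset Bits: Z is set from A AND mem, then bits clear."""
    zero = (a & mem[op1]) == 0
    mem[op1] &= ~a & 0xFF   # OP1 &= ~accumulator
    return zero

def tsb(mem: bytearray, a: int, op1: int) -> bool:
    """Test and Set Bits: Z is set from A AND mem, then bits set."""
    zero = (a & mem[op1]) == 0
    mem[op1] |= a           # OP1 |= accumulator
    return zero
```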
16,445
4,053,937,855
IssuesEvent
2016-05-24 10:23:38
ES-DOC/esdoc-docs
https://api.github.com/repos/ES-DOC/esdoc-docs
opened
External review for ScenarioMIP experiment documentation
CMIP6 Documentation
Review the ScenarioMIP experiments with the PIs.
1.0
External review for ScenarioMIP experiment documentation - Review the ScenarioMIP experiments with the PIs.
non_process
external review for scenariomip experiment documentation review the scenariomip experiments with the pis
0
167,778
13,042,165,206
IssuesEvent
2020-07-28 21:51:34
thefrontside/bigtest
https://api.github.com/repos/thefrontside/bigtest
opened
Should we dispose of each test's JavaScript context before the next lane is run?
@bigtest/agent question
We currently have two JavaScript contexts at play when running tests: the `Agent` JS context and the `Harness` JS context. The `Agent` JS context is initialized once when the agent itself is connected to the orchestrator, whereas the `Harness` JS context is created once _per test_ by clearing the contents of the test iframe. The following pseudo-html demonstrates this relationship. ```html <html> <body>This is the Agent frame which is loaded once</body> <iframe src="app.src.html">This is the Harness Frame into which the app is loaded</iframe> </html> ``` The harness JS context is only responsible for housing the application under test, whereas the agent context is the JS context where the tests actually run. So when you write: ```ts test("homepage") .step("load the app", App.visit("/")) .step("click on a button", () => Button("menu").click()) ``` The interactor code is evaluated in the _agent_ frame, but because the agent frame and harness frame are loaded from the same origin, JavaScript in the agent frame has permission to access the `documentElement` of the harness frame. The rub is that the scope of the agent frame can encompass not only multiple lanes in a single test, but in the case of a long-lived server process used during development, it can span many, many test runs. This is a problem because it [violates the principle of isolation and reliability][1] which is one of the pillars of BigTest. Anything that any test case writes to the global scope will persist across every other test case. That's bad. We don't want to depend on library code or test code being "well-behaved" and cleaning up after itself. A much safer strategy is to discard it altogether after all side-effects have been run, and then start fresh again from the top. > Note: this isn't an abstract concern; we encountered this when starting services that track global state like [miragejs][2] and [msw][3] One thing we could do is to introduce a test frame alongside the harness frame. In pseudo-html this would look like this: ```html <html> <body>This is the Agent frame which is loaded once</body> <iframe src="test.src.html">This frame is responsible for evaluating the test code</iframe> <iframe src="app.src.html">This is the Harness Frame into which the app is loaded</iframe> </html> ``` Then, before each test case, the agent frame would tell the "runner" frame to clear, load the test bundle, and run the lane. The runner frame would be passed the document element of the harness frame so that all of the interactors would target it. Drawbacks: 1. Instead of loading the test bundle once before the test run inside the agent frame, we'd have to load it once per lane. This could add a performance overhead. 2. Setting breakpoints and doing inspections on the test suite would now have to happen on the evaluation frame, and not the agent frame, so it might become more awkward. [1]: https://frontside.com/blog/2020-triple-threat-to-testing-part-2-reliability/#environment-and-preconditions [2]: https://miragejs.org [3]: https://mswjs.io
1.0
Should we dispose of each test's JavaScript context before the next lane is run? - We currently have two JavaScript contexts at play when running tests: the `Agent` JS context and the `Harness` JS context. The `Agent` JS context is initialized once when the agent itself is connected to the orchestrator, whereas the `Harness` JS context is created once _per test_ by clearing the contents of the test iframe. The following pseudo-html demonstrates this relationship. ```html <html> <body>This is the Agent frame which is loaded once</body> <iframe src="app.src.html">This is the Harness Frame into which the app is loaded</iframe> </html> ``` The harness JS context is only responsible for housing the application under test, whereas the agent context is the JS context where the tests actually run. So when you write: ```ts test("homepage") .step("load the app", App.visit("/")) .step("click on a button", () => Button("menu").click()) ``` The interactor code is evaluated in the _agent_ frame, but because the agent frame and harness frame are loaded from the same origin, JavaScript in the agent frame has permission to access the `documentElement` of the harness frame. The rub is that the scope of the agent frame can encompass not only multiple lanes in a single test, but in the case of a long-lived server process used during development, it can span many, many test runs. This is a problem because it [violates the principle of isolation and reliability][1] which is one of the pillars of BigTest. Anything that any test case writes to the global scope will persist across every other test case. That's bad. We don't want to depend on library code or test code being "well-behaved" and cleaning up after itself. A much safer strategy is to discard it altogether after all side-effects have been run, and then start fresh again from the top. > Note: this isn't an abstract concern; we encountered this when starting services that track global state like [miragejs][2] and [msw][3] One thing we could do is to introduce a test frame alongside the harness frame. In pseudo-html this would look like this: ```html <html> <body>This is the Agent frame which is loaded once</body> <iframe src="test.src.html">This frame is responsible for evaluating the test code</iframe> <iframe src="app.src.html">This is the Harness Frame into which the app is loaded</iframe> </html> ``` Then, before each test case, the agent frame would tell the "runner" frame to clear, load the test bundle, and run the lane. The runner frame would be passed the document element of the harness frame so that all of the interactors would target it. Drawbacks: 1. Instead of loading the test bundle once before the test run inside the agent frame, we'd have to load it once per lane. This could add a performance overhead. 2. Setting breakpoints and doing inspections on the test suite would now have to happen on the evaluation frame, and not the agent frame, so it might become more awkward. [1]: https://frontside.com/blog/2020-triple-threat-to-testing-part-2-reliability/#environment-and-preconditions [2]: https://miragejs.org [3]: https://mswjs.io
non_process
should we dispose of each test s javascript context before the next lane is run we currently have two javascript context s at play when running tests the agent js context and the harness js context the agent js context is initialized once when the agent itself is connected to the orchestrator whereas the harness js context is created once per test by clearing the contents of the test iframe the following psedo html demonstrates this relationship html this is the agent frame which is loaded once this is the harness frame into which the app is loaded the harness js context is only responsible for housing the application under test whereas the agent context is the js context where the tests actually run so when you write ts test homepage step load the app app visit step click on a button button menu click the interactor code is evaluated in the agent frame but because the agent frame and harness frame are loaded from the same origin javascript in the agent frame has permission to access the documentelement of the harness frame the rub is that the scope of the agent frame can encompass not only multiple lanes in a single test but in the case of a long lived server process used during development it can span many many test runs this is a problem because it which is one of the pillars of bigtest anything that any test cases writes to the global scope will persist across every other test case that s bad we don t want to depend on library code or test code being well behaved and ensuring to clean up after itself a much safer strategy is to discard it altogether after all side effects have been run and then start fresh again from the top note this isn t an abstract concern we encountered this when starting services that track global state like and one thing we could do is to introduce a test frame along side the harness frame in pseudo html this would look like this html this is the agent frame which is loaded once this frame is responsible for evaluating the test code this is the harness frame into which the app is loaded then before each test case the agent frame would tell the runner frame to clear load the test bundle and run the lane the runner frame would be passed the document element of the harness frame so that all of the interactors would target it drawbacks instead of loading the test bundle once before the test run inside the agent frame we d have to load it once per lane this could add a performance overhead setting breakpoints and doing inspections on the test suite would now have to happen on the evaluation frame and not the agent frame so it might become more awkward
0
12,333
14,882,568,540
IssuesEvent
2021-01-20 12:05:33
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[Android] Unable to submit location based questionnaires
Android Bug P1 Process: Tested dev
Unable to submit the location-based questionnaires in Android
1.0
[Android] Unable to submit location based questionnaires - Unable to submit the location-based questionnaires in Android
process
unable to submit location based questionnaires unable to submit the location based questionnaires in android
1
19,137
25,196,148,199
IssuesEvent
2022-11-12 14:42:11
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Need better design for "skip" properties
priority/medium preprocess enhancement
## Description Many of our Ant targets in the preprocess code have "skip" properties that allow you to skip running part of our build pipeline. These are considered internal properties and subject to change, but available to advanced users who understand the code. There are several problems: * They are not documented, even for advanced users (this has come up in Slack a few times). * The naming is inconsistent, with most based on Ant target names rather than on function. You can disable the code that cascades map metadata, but you have to know the Ant target is `move-meta-entries` making the property name `preprocess.move-meta-entries.skip`. Some do not use the Ant target name, such as `preprocess.clean-map-check.skip`. Where preprocess and preprocess2 have overlapping targets, preprocess2 uses the name from preprocess; where they do not, it creates a property with the Ant target name. An odd one is `clean-temp.skip` which is outside of preprocess. * Property names based on Ant target name make it hard to split or combine steps * You have to set the property to disable something, which is not intuitive; setting `preprocess.keyref.skip=true` disables keyref resolution. * Because of how Ant works, setting the variable to *anything* disables the target; it just checks if the property is set. So, setting `preprocess.keyref.skip=false` also disables keyref. This is likely familiar to people with a lot of Ant experience, but many of those trying to tweak DITA-OT builds do not have that background. List of current skip properties in base plugin: - `clean-temp.skip` - `preprocess.branch-filter.skip` - `preprocess.chunk.skip` - `preprocess.clean-map-check.skip` - `preprocess.clean-preprocess.skip` - `preprocess.coderef.skip` - `preprocess.conref.skip` - `preprocess.conrefpush.skip` - `preprocess.copy-files.skip` - `preprocess.copy-flag.skip` - `preprocess.copy-html.skip` - `preprocess.copy-image.skip` - `preprocess.debug-filter.skip` - `preprocess.gen-list.skip` - `preprocess.keyref.skip` - `preprocess.map-profile.skip` - `preprocess.maplink.skip` - `preprocess.mapref.skip` - `preprocess.move-meta-entries.skip` - `preprocess.normalize-codeblock.skip` - `preprocess.profile.skip` - `preprocess.topic-profile.skip` - `preprocess.topicpull.skip` Note that not all are used in `target/@unless`. Some are used, e.g. ```xml <filter class="org.dita.dost.writer.NormalizeCodeblock" unless:set="preprocess.normalize-codeblock.skip"/> ``` ## Possible Solution Ideally, in a 4.0 major release we could: * Create a consistent pattern for all of the current "skip" properties. * The property names should be based entirely on function, not connected to the Ant target, so that even if we split a single `chunk` target into 5 targets, the "process chunking" property does not change and works to disable all 5. * The property should be set to "false" to disable a step. Ant has an `istrue` condition that maps case-insensitive "true", "yes", and "on" to `true`; otherwise `false`. * The actual Ant property used to skip a target is internal-only, and our build should fail if someone tries to set it; going forward this ensures we do not break anyone if those properties have to change * All of this makes it possible to document the properties in a straightforward manner - "Set X to false to disable conref resolution" - without going into detail about why it's "enable this property to disable this feature", or why "disable this property" also disables it, etc.
I'm not sure of the best pattern for these or I'd have started with a PR that updated one step. Adding some ideas, using conref as an example; I like having `run` or `buildstep` or `build-step` in there, but am not attached to anything. I do like having the feature name last so all of these have the same prefix: * `run.conref.step=true` * `run.step.conref=true` * `run.buildstep.conref=true` * `build-step.conref=true` ## Potential Alternatives We could simply document the properties that exist today, but we've intentionally avoided this: * It locks in the properties like an external API, making them hard to change * The issues above all make it hard to document, requiring an explanation of Ant and of the "enable to disable" naming ## Additional Context This has come up twice in Slack over the last few months, both times about the `preprocess.clean-map-check.skip` property as a way to skip that step and retain information added by preprocessing.
1.0
Need better design for "skip" properties - ## Description Many of our Ant targets in the preprocess code have "skip" properties that allow you to skip running part of our build pipeline. These are considered internal properties and subject to change, but available to advanced users who understand the code. There are several problems: * They are not documented, even for advanced users (this has come up in Slack a few times). * The naming is inconsistent, with most based on Ant target names rather than on function. You can disable the code that cascades map metadata, but you have to know the Ant target is `move-meta-entries` making the property name `preprocess.move-meta-entries.skip`. Some do not use the Ant target name, such as `preprocess.clean-map-check.skip`. Where preprocess and preprocess2 have overlapping targets, preprocess2 uses the name from preprocess; where they do not, it creates a property with the Ant target name. An odd one is `clean-temp.skip` which is outside of preprocess. * Property names based on Ant target name make it hard to split or combine steps * You have to set the property to disable something, which is not intuitive; setting `preprocess.keyref.skip=true` disables keyref resolution. * Because of how Ant works, setting the variable to *anything* disables the target; it just checks if the property is set. So, setting `preprocess.keyref.skip=false` also disables keyref. This is likely familiar to people with a lot of Ant experience, but many of those trying to tweak DITA-OT builds do not have that background. List of current skip properties in base plugin: - `clean-temp.skip` - `preprocess.branch-filter.skip` - `preprocess.chunk.skip` - `preprocess.clean-map-check.skip` - `preprocess.clean-preprocess.skip` - `preprocess.coderef.skip` - `preprocess.conref.skip` - `preprocess.conrefpush.skip` - `preprocess.copy-files.skip` - `preprocess.copy-flag.skip` - `preprocess.copy-html.skip` - `preprocess.copy-image.skip` - `preprocess.debug-filter.skip` - `preprocess.gen-list.skip` - `preprocess.keyref.skip` - `preprocess.map-profile.skip` - `preprocess.maplink.skip` - `preprocess.mapref.skip` - `preprocess.move-meta-entries.skip` - `preprocess.normalize-codeblock.skip` - `preprocess.profile.skip` - `preprocess.topic-profile.skip` - `preprocess.topicpull.skip` Note that not all are used in `target/@unless`. Some are used, e.g. ```xml <filter class="org.dita.dost.writer.NormalizeCodeblock" unless:set="preprocess.normalize-codeblock.skip"/> ``` ## Possible Solution Ideally, in a 4.0 major release we could: * Create a consistent pattern for all of the current "skip" properties. * The property names should be based entirely on function, not connected to the Ant target, so that even if we split a single `chunk` target into 5 targets, the "process chunking" property does not change and works to disable all 5. * The property should be set to "false" to disable a step. Ant has an `istrue` condition that maps case-insensitive "true", "yes", and "on" to `true`; otherwise `false`. * The actual Ant property used to skip a target is internal-only, and our build should fail if someone tries to set it; going forward this ensures we do not break anyone if those properties have to change * All of this makes it possible to document the properties in a straightforward manner - "Set X to false to disable conref resolution" - without going into detail about why it's "enable this property to disable this feature", or why "disable this property" also disables it, etc.
I'm not sure of the best pattern for these or I'd have started with a PR that updated one step. Adding some ideas, using conref as an example; I like having `run` or `buildstep` or `build-step` in there, but am not attached to anything. I do like having the feature name last so all of these have the same prefix: * `run.conref.step=true` * `run.step.conref=true` * `run.buildstep.conref=true` * `build-step.conref=true` ## Potential Alternatives We could simply document the properties that exist today, but we've intentionally avoided this: * It locks in the properties like an external API, making them hard to change * The issues above all make it hard to document, requiring an explanation of Ant and of the "enable to disable" naming ## Additional Context This has come up twice in Slack over the last few months, both times about the `preprocess.clean-map-check.skip` property as a way to skip that step and retain information added by preprocessing.
process
need better design for skip properties description many of our ant targets in the preprocess code have skip properties that allow you to skip running part of our build pipeline these are considered internal properties and subject to change but available to advanced users who understand the code there are several problems they are not documented even for advanced users this has come up in slack a few times the naming is inconsistent with most based on ant target names rather than on function you can disable the code that cascades map metadata but you have to know the ant target is move meta entries making the property name preprocess move meta entries skip some do not use the ant target name such as preprocess clean map check skip where preprocess and have overlapping targets uses the name from preprocess where they do not it creates a property with the ant target name an odd one is clean temp skip which is outside of preprocess property names based on ant target name makes it hard to split or combine steps you have to set the property to disable something which is not intuitive setting preprocess keyref skip true disables keyref resolution because of how ant works setting the variable to anything disables the target it just checks if the property is set so setting preprocess keyref skip false also disables keyref this is likely familiar to people with a lot of ant experience but many of those trying to tweak dita ot builds do not have that background list of current skip properties in base plugin clean temp skip preprocess branch filter skip preprocess chunk skip preprocess clean map check skip preprocess clean preprocess skip preprocess coderef skip preprocess conref skip preprocess conrefpush skip preprocess copy files skip preprocess copy flag skip preprocess copy html skip preprocess copy image skip preprocess debug filter skip preprocess gen list skip preprocess keyref skip preprocess map profile skip preprocess maplink skip preprocess mapref skip preprocess move meta entries skip preprocess normalize codeblock skip preprocess profile skip preprocess topic profile skip preprocess topicpull skip note that not all are used in target unless some are used e g xml filter class org dita dost writer normalizecodeblock unless set preprocess normalize codeblock skip possible solution ideally in a major release we could create a consistent pattern for all of the current skip properties the property names should be based entirely on function not connected to the ant target so that even if we split a single chunk target into targets the process chunking property does not change and works to disable all the property be set to false to disable a step ant has istrue condition that maps case insensitive true yes and on to true otherwise false the actual ant property used to skip a target is internal only and our build should fail if someone tries to set it going forward this ensures we do not break anyone if those properties have to change all of this makes it possible to document the properties in a straightforward manner set x to false to disable conref resolution without going into detail about why it s enable this property to disable this feature or why disable this property also disables it etc i m not sure of the best pattern for these or i d have started with a pr that updated one step adding some ideas using conref as an example i like having run or buildstep or build step in there but am not attached to anything i do like having the feature name last so all of these have the same prefix run 
conref step true run step conref true run buildstep conref true build step conref true potential alternatives we could simply document the properties that exist today but we ve intentionally avoided this it locks in the properties like an external api making them hard to change the issues above all make it hard to document requiring an explanation of ant and of the enable to disable naming additional context this has come up twice in slack over the last few months both times about the preprocess clean map check skip property as a way to skip that step and retain information added by preprocessing
1
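The Ant semantics described in the skip-properties record above are easy to get wrong, so here is a minimal Python sketch (illustrative only, not DITA-OT code; the `run.buildstep.*` prefix is just one of the candidate names floated in the proposal) contrasting the legacy "property set at all means skip" behaviour with the proposed istrue-style "false means skip" behaviour.

```python
# Minimal sketch, not DITA-OT code. Contrasts Ant's legacy "property set at
# all means skip" semantics with the proposed istrue-style semantics, where
# a step runs unless its property is explicitly falsy.

TRUTHY = {"true", "yes", "on"}  # mirrors Ant's <istrue> mapping, case-insensitive

def step_runs_legacy(props: dict, target: str) -> bool:
    """Legacy: any value at all, even 'false', disables the target."""
    return f"preprocess.{target}.skip" not in props

def step_runs_proposed(props: dict, step: str) -> bool:
    """Proposed: the step runs unless run.buildstep.<step> is falsy."""
    value = props.get(f"run.buildstep.{step}", "true")
    return value.strip().lower() in TRUTHY

print(step_runs_legacy({"preprocess.keyref.skip": "false"}, "keyref"))  # False (surprising)
print(step_runs_proposed({"run.buildstep.keyref": "false"}, "keyref"))  # False (explicit)
print(step_runs_proposed({}, "keyref"))                                 # True (default on)
```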
8,363
11,518,467,649
IssuesEvent
2020-02-14 10:34:19
prisma/specs
https://api.github.com/repos/prisma/specs
opened
CLI spec: some commands are missing
area/cli kind/spec process/candidate
I found that the CLI spec at https://github.com/prisma/specs/blob/master/cli/README.md is missing the following commands - `validate` - `studio` - `lift save|up|down` Note: these commands can take the --schema argument.
1.0
CLI spec: some commands are missing - I found that the CLI spec at https://github.com/prisma/specs/blob/master/cli/README.md is missing the following commands - `validate` - `studio` - `lift save|up|down` Note: these commands can take the --schema argument.
process
cli spec some commands are missing i found that the cli spec at is missing the following commands validate studio lift save up down note these commands can take the schema argument
1
15,818
20,014,751,954
IssuesEvent
2022-02-01 10:53:00
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Drop support for Ubuntu 14.04 / 16.04 / CentOS 7 on Bazel@HEAD
P1 type: process team-OSS
Considering that Ubuntu 16.04 LTS is EOL now, it might be reasonable to drop support for Ubuntu 14.04 / 16.04 / CentOS 7 on Bazel's main branch. One reason to discontinue support for outdated platforms, is that Bazel@HEAD cannot be built on modern platforms, like Fedora 34 rawhide (x86_64) that switched to `gcc` 11.x toolchain (https://gcc.gnu.org/gcc-11/), see https://github.com/bazelbuild/bazel/issues/12702. The attempt to bump transitive dependencies in Bazel for `gRPC` and `absl` caused the build to fail on outdated CentOS 7 platform due to outdated 4.8.5 C++ compiler toolchain: ``` $ docker run -it gcr.io/bazel-public/centos7-java8:latest /bin/bash $ gcc --version gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44) ``` See CI breakages for this PR: https://github.com/bazelbuild/bazel/pull/13536.
1.0
Drop support for Ubuntu 14.04 / 16.04 / CentOS 7 on Bazel@HEAD - Considering that Ubuntu 16.04 LTS is EOL now, it might be reasonable to drop support for Ubuntu 14.04 / 16.04 / CentOS 7 on Bazel's main branch. One reason to discontinue support for outdated platforms, is that Bazel@HEAD cannot be built on modern platforms, like Fedora 34 rawhide (x86_64) that switched to `gcc` 11.x toolchain (https://gcc.gnu.org/gcc-11/), see https://github.com/bazelbuild/bazel/issues/12702. The attempt to bump transitive dependencies in Bazel for `gRPC` and `absl` caused the build to fail on outdated CentOS 7 platform due to outdated 4.8.5 C++ compiler toolchain: ``` $ docker run -it gcr.io/bazel-public/centos7-java8:latest /bin/bash $ gcc --version gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44) ``` See CI breakages for this PR: https://github.com/bazelbuild/bazel/pull/13536.
process
drop support for ubuntu centos on bazel head considering that ubuntu lts is eol now it might be reasonable to drop support for ubuntu centos on bazel s main branch one reason to discontinue support for outdated platforms is that bazel head cannot be built on modern platforms like fedora rawhide that switched to gcc x toolchain see the attempt to bump transitive dependencies in bazel for grpc and absl caused the build to fail on outdated centos platform due to outdated c compiler toolchain docker run it gcr io bazel public latest bin bash gcc version gcc gcc red hat see ci breakages for this pr
1
1,227
3,758,446,632
IssuesEvent
2016-03-14 08:58:48
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
opened
Keyref resolution fails if input path contains ../
bug preprocess/keyref
2.2.3 and `develop`. If you do e.g. `dita -i ../docsrc/userguide.ditamap -f pdf`, keyref resolution fails: ``` [keyref] file:/tmp/docsrc/index.dita:6:55: [DOTJ047I][INFO] Unable to find key definition for key reference "release" in root scope. The href attribute may be used as fallback if it exists ``` If you do `dita -i docsrc/userguide.ditamap -f pdf` instead, it works fine.
1.0
Keyref resolution fails if input path contains ../ - 2.2.3 and `develop`. If you do e.g. `dita -i ../docsrc/userguide.ditamap -f pdf`, keyref resolution fails: ``` [keyref] file:/tmp/docsrc/index.dita:6:55: [DOTJ047I][INFO] Unable to find key definition for key reference "release" in root scope. The href attribute may be used as fallback if it exists ``` If you do `dita -i docsrc/userguide.ditamap -f pdf` instead, it works fine.
process
keyref resolution fails if input path contains and develop if you do e g dita i docsrc userguide ditamap f pdf keyref resolution fails file tmp docsrc index dita unable to find key definition for key reference release in root scope the href attribute may be used as fallback if it exists if you do dita i docsrc userguide ditamap f pdf instead it works fine
1
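For the keyref record above, a plausible reading is that the unnormalized "../" in the input path ends up in the map URI used for key-scope resolution. Below is a minimal Python sketch of the normalization idea; DITA-OT itself is Java, so this only illustrates the fix direction, not the project's code.

```python
# Illustration only: normalize the user-supplied input path before it is
# used as the root map URI, so "../" segments cannot make the temp-dir
# copy and the key scope disagree about the map location (assumption about
# the root cause, inferred from the bug report).
import os.path

def normalized_input(path: str) -> str:
    """Resolve ".." segments and make the path absolute up front."""
    return os.path.normpath(os.path.abspath(path))

print(normalized_input("../docsrc/userguide.ditamap"))
# -> an absolute path with no ".." segments, matching the behaviour of
#    the working invocation "dita -i docsrc/userguide.ditamap -f pdf"
```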
8,487
11,645,741,734
IssuesEvent
2020-03-01 03:56:52
SE-Garden/tms-webserver
https://api.github.com/repos/SE-Garden/tms-webserver
opened
Revision points in the existing source
kind:共通機能 kind:機能 process:CQ
## Overview Re-review (Rv) the existing source against master to find the points that should be revised. Any parts found to need fixing are fixed and handled via PullRequest. ## Goal Source re-review (Rv) and revision ## Deliverables Source ## Related Issue None
1.0
Revision points in the existing source - ## Overview Re-review (Rv) the existing source against master to find the points that should be revised. Any parts found to need fixing are fixed and handled via PullRequest. ## Goal Source re-review (Rv) and revision ## Deliverables Source ## Related Issue None
process
revision points in the existing source overview re review rv the existing source against master to find the points that should be revised any parts found to need fixing are fixed and handled via pullrequest goal source re review rv and revision deliverables source related issue none
1
15,657
3,331,318,506
IssuesEvent
2015-11-11 15:23:37
Marginal/EDMarketConnector
https://api.github.com/repos/Marginal/EDMarketConnector
closed
The special modules aren't displayed in the data sent to EDDN
working as designed
Example: there are no prismatic shields in the data that arrives at EDDN.
1.0
The special modules aren't displayed in the data sent to EDDN - Example: there are no prismatic shields in the data that arrives at EDDN.
non_process
the special modules aren t displayed in the data sent to eddn example there are no prismatic shields in the data that arrives at eddn
0
33,252
15,834,426,857
IssuesEvent
2021-04-06 16:46:19
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Create index pattern page is slow if cluster is busy
Feature:Kibana Management Team:KibanaApp performance
**Kibana version:** 6.4 **Elasticsearch version:** 6.4 When creating a new index pattern and the Elasticsearch cluster is busy and/or overloaded with too many shards, the query that is used to populate the list of indices can take a long time. Currently the query looks like: ``` GET */_search {"size":0,"aggs":{"indices":{"terms":{"field":"_index","size":200}}}} ``` On a cluster with several thousands of shards this can become expensive and slow. I was wondering why we aren't using something cheaper like `_cat/indices`, but @pickypg already enlightened me that this would require `cluster monitor` privileges, which a normal Kibana user might not have. Running the agg also ensures that only indices are listed that the user has read access to, which might not be the case on the `_cat` API call. From my discussion with @pickypg , we should at least think about setting [shard_size](https://www.elastic.co/guide/en/elasticsearch/reference/current/search-aggregations-bucket-terms-aggregation.html#_shard_size_3) on this request. But I'd like to start a general discussion if there are better ways to get that list of indices the user has read access to.
True
Create index pattern page is slow if cluster is busy - **Kibana version:** 6.4 **Elasticsearch version:** 6.4 When creating a new index pattern and the Elasticsearch cluster is busy and/or overloaded with too many shards, the query that is used to populate the list of indices can take a long time. Currently the query looks like: ``` GET */_search {"size":0,"aggs":{"indices":{"terms":{"field":"_index","size":200}}}} ``` On a cluster with several thousands of shards this can become expensive and slow. I was wondering why we aren't using something cheaper like `_cat/indices`, but @pickypg already enlightened me that this would require `cluster monitor` privileges, which a normal Kibana user might not have. Running the agg also ensures that only indices are listed that the user has read access to, which might not be the case on the `_cat` API call. From my discussion with @pickypg , we should at least think about setting [shard_size](https://www.elastic.co/guide/en/elasticsearch/reference/current/search-aggregations-bucket-terms-aggregation.html#_shard_size_3) on this request. But I'd like to start a general discussion if there are better ways to get that list of indices the user has read access to.
non_process
create index pattern page is slow if cluster is busy kibana version elasticsearch version when creating a new index pattern and the elasticsearch cluster is busy and or overloaded with too many shards the query that is used to populate the list of indices can take a long time currently the query looks like get search size aggs indices terms field index size on a cluster with several thousands of shards this can become expensive and slow i was wondering why we aren t using something cheaper like cat indices but pickypg already enlightened me that this would require cluster monitor privileges which a normal kibana user might not have running the agg also ensures that only indices are listed that the user has read access to which might not be the case on the cat api call from my discussion with pickypg we should at least think about setting on this request but i d like to start a general discussion if there are better ways to get that list of indices the user has read access to
0
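The Kibana record above suggests setting shard_size on the terms aggregation. A hedged sketch of that mitigation follows; the endpoint and the exact shard_size value are placeholders, since the discussion left the tuning open.

```python
# Hedged sketch of the discussed mitigation: the same _index terms
# aggregation, but with an explicit shard_size instead of the default
# (size * 1.5 + 10), so each shard returns fewer candidate buckets.
import requests

query = {
    "size": 0,
    "aggs": {
        "indices": {
            "terms": {"field": "_index", "size": 200, "shard_size": 200}
        }
    },
}

# Placeholder endpoint; in Kibana this request is proxied to the cluster.
resp = requests.post("http://localhost:9200/*/_search", json=query)
for bucket in resp.json()["aggregations"]["indices"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```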
402,046
27,349,223,250
IssuesEvent
2023-02-27 08:17:59
kubermatic/docs
https://api.github.com/repos/kubermatic/docs
closed
Instruction for assessing and monitoring the health and proper function
kind/documentation
Create step-by-step instructions for how to assess and monitor the health and proper function of KKP. https://docs.kubermatic.com/kubermatic/v2.21/tutorials-howtos/monitoring-logging-alerting/ - [ ] Pictures about the dashboard and what you should monitor in a user cluster or in a master cluster. Examples and defaults described. - [x] This picture should be added (architecture diagram): https://docs.google.com/presentation/d/1TQZ-6F-yCfrEtN0XCTAU5-VerMNNNOfzCRXbqije4jY/edit#slide=id.g136a6827039_0_387 (the MLA diagram) HLCH-001
1.0
Instruction for assessing and monitoring the health and proper function - Create step-by-step instructions for how to assess and monitor the health and proper function of KKP. https://docs.kubermatic.com/kubermatic/v2.21/tutorials-howtos/monitoring-logging-alerting/ - [ ] Pictures about the dashboard and what you should monitor in a user cluster or in a master cluster. Examples and defaults described. - [x] This picture should be added (architecture diagram): https://docs.google.com/presentation/d/1TQZ-6F-yCfrEtN0XCTAU5-VerMNNNOfzCRXbqije4jY/edit#slide=id.g136a6827039_0_387 (the MLA diagram) HLCH-001
non_process
instruction for assessing and monitoring the health and proper function create step by step instructions for how to assess and monitor the health and proper function of kkp pictures about the dashboard and what you should monitor in a user cluster or in a master cluster examples and defaults described this picture should be added architecture diagram the mla diagram hlch
0
14,496
17,604,292,616
IssuesEvent
2021-08-17 15:13:32
qgis/QGIS-Documentation
https://api.github.com/repos/qgis/QGIS-Documentation
closed
Native DXF export algorithm (Request in QGIS)
Processing Alg 3.18
### Request for documentation From pull request QGIS/qgis#39575 Author: @alexbruy QGIS version: 3.18 **Native DXF export algorithm** ### PR Description: ## Description Implements native DXF export algorithm using `QgsDxfExport` functionality (also used by Project→Import/Export→Export Project to DXF). Allows to export individual layer as well as multiple layers into single DXF file. For each input layer user can select which attribute to use for splitting layer into multiple output layers. ![export](https://user-images.githubusercontent.com/776954/97008125-1b51ac80-154b-11eb-9346-64bcb0691a72.gif) Fixes #25392. ### Commits tagged with [need-docs] or [FEATURE]
1.0
Native DXF export algorithm (Request in QGIS) - ### Request for documentation From pull request QGIS/qgis#39575 Author: @alexbruy QGIS version: 3.18 **Native DXF export algorithm** ### PR Description: ## Description Implements native DXF export algorithm using `QgsDxfExport` functionality (also used by Project→Import/Export→Export Project to DXF). Allows to export individual layer as well as multiple layers into single DXF file. For each input layer user can select which attribute to use for splitting layer into multiple output layers. ![export](https://user-images.githubusercontent.com/776954/97008125-1b51ac80-154b-11eb-9346-64bcb0691a72.gif) Fixes #25392. ### Commits tagged with [need-docs] or [FEATURE]
process
native dxf export algorithm request in qgis request for documentation from pull request qgis qgis author alexbruy qgis version native dxf export algorithm pr description description implements native dxf export algorithm using qgsdxfexport functionality also used by project→import export→export project to dxf allows to export individual layer as well as multiple layers into single dxf file for each input layer user can select which attribute to use for splitting layer into multiple output layers fixes commits tagged with or
1
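For the QGIS record above, the new algorithm should be callable from Processing like any other. A hedged PyQGIS sketch follows; the algorithm id `native:dxfexport` and the parameter names are assumptions inferred from the PR description, not verified against the 3.18 API.

```python
# Hedged sketch: the algorithm id and parameter names below are
# ASSUMPTIONS, not verified against the QGIS 3.18 Processing registry.
import processing  # available inside the QGIS Python environment

params = {
    "LAYERS": ["my_layer"],      # hypothetical layer reference
    "ENCODING": "UTF-8",
    "OUTPUT": "/tmp/export.dxf",
}
result = processing.run("native:dxfexport", params)  # id assumed
print(result["OUTPUT"])
```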
146,423
5,621,235,841
IssuesEvent
2017-04-04 09:25:11
vmware/harbor
https://api.github.com/repos/vmware/harbor
closed
Deleting a non-empty project doesn't show a meaningful error.
area/clarity-ui priority/p0 UX waiting-for-verify
1) When I remove a project that has a repository, the API returns 412; the UI should not show this raw 412 on the message bar. Instead, it should show a meaningful message (see 0.5.0 for reference). 2) After I dismiss the message and try to delete again, the deletion fails but no error message is shown. For 2), the "inline error" component seems to have an issue.
1.0
Deleting a non-empty project doesn't show a meaningful error. - 1) When I remove a project that has a repository, the API returns 412; the UI should not show this raw 412 on the message bar. Instead, it should show a meaningful message (see 0.5.0 for reference). 2) After I dismiss the message and try to delete again, the deletion fails but no error message is shown. For 2), the "inline error" component seems to have an issue.
non_process
deleting a non empty project doesn t show a meaningful error when i remove a project that has a repository the api returns ui should not show this raw on the message bar instead it should show a meaningful message see for reference after i dismiss the message and try to delete again the deletion fails but no error message is shown for the inline error component seems to have an issue
0
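For the Harbor record above, the fix direction is to map the raw status code to copy the user can act on. A language-agnostic sketch in Python (Harbor's UI is Angular/Clarity, so this is illustration only; the message text is a placeholder):

```python
# Illustration only (Harbor's UI is Angular/Clarity, not Python): map the
# raw HTTP status to an actionable message instead of echoing "412".
FRIENDLY_ERRORS = {
    412: "Project cannot be deleted because it still contains repositories.",
}

def message_for(status: int) -> str:
    return FRIENDLY_ERRORS.get(status, f"Unexpected error (HTTP {status}).")

print(message_for(412))  # actionable text, not a bare status code
```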
12,046
4,350,055,341
IssuesEvent
2016-07-31 00:35:57
Idrinth/IDotD
https://api.github.com/repos/Idrinth/IDotD
opened
remove css from js files
CleanCode UI&UX
There are currently a lot of instances of the style attribute being abused with stuff that would fit better into a separate css file.
1.0
remove css from js files - There are currently a lot of instances of the style attribute being abused with stuff that would fit better into a separate css file.
non_process
remove css from js files there are currently a lot of instances of the style attribute being abused with stuff that would fit better into a separate css file
0
423,701
12,300,754,102
IssuesEvent
2020-05-11 14:26:40
TheTofuShop/Menu
https://api.github.com/repos/TheTofuShop/Menu
closed
Preload collisions before teleporting to another area (so we don't fall under the map)
priority: p3 low status: acknowledged type: feature request
⭐ **What feature do you want to be added?** Preload collisions before teleporting to another area like from Akina to Gunsai. 🔎 **Why do you want this feature to be added?** ![image](https://user-images.githubusercontent.com/11861253/75103731-938f5900-55dd-11ea-9131-817809349df5.png) ❓ **Is there something else that we need to know?** We should try to use the spawnmanager exports.
1.0
Preload collisions before teleporting to another area (so we don't fall under the map) - ⭐ **What feature do you want to be added?** Preload collisions before teleporting to another area like from Akina to Gunsai. 🔎 **Why do you want this feature to be added?** ![image](https://user-images.githubusercontent.com/11861253/75103731-938f5900-55dd-11ea-9131-817809349df5.png) ❓ **Is there something else that we need to know?** We should try to use the spawnmanager exports.
non_process
preload collisions before teleporting to another area so we don t fall under the map ⭐ what feature do you want to be added preload collisions before teleporting to another area like from akina to gunsai 🔎 why do you want this feature to be added ❓ is there something else that we need to know we should try to use the spawnmanager exports
0
119,585
25,541,259,181
IssuesEvent
2022-11-29 15:27:47
aws-controllers-k8s/community
https://api.github.com/repos/aws-controllers-k8s/community
closed
Change in code-generator v0.20.0 breaks operation overrides
bug code generator
The [top part of kinesis-controller's `generator.yaml`](https://github.com/jaypipes/ack-kinesis-controller/blob/2dbf9a725282f612b5ca330c567927da362b480f/generator.yaml#L1-L9) file looks like this: ```yaml operations: # NOTE)jaypipes): According to # https://docs.aws.amazon.com/kinesis/latest/APIReference/API_DescribeStream.html, # the DescribeStream API should not be used. Instead, the # DescribeStreamSummary API and ListShards API should be used. DescribeStreamSummary: resource_name: Stream operation_type: READ_ONE output_wrapper_field_path: StreamDescriptionSummary ``` If I execute `make build-controller SERVICE=kinesis` with v0.20.0 of code-generator, I get a failure: ``` [jaypipes@thelio code-generator]$ make build-controller SERVICE=kinesis building ack-generate ... ok. installing controller-gen v0.7.0 ... ok. ==== building kinesis-controller ==== Copying common custom resource definitions into kinesis Building Kubernetes API objects for kinesis Generating deepcopy code for kinesis Generating custom resource definitions for kinesis Building service controller for kinesis Error: template: /home/jaypipes/go/src/github.com/aws-controllers-k8s/code-generator/templates/pkg/resource/sdk.go.tpl:73:3: executing "sdk_find_read_one" at <GoCodeRequiredFieldsMissingFromReadOneInput .CRD "r.ko" 1>: error calling GoCodeRequiredFieldsMissingFromReadOneInput: GENERATION FAILURE! there's a required field StreamName in Shape DescribeStreamInput that isn't in either the CR's Spec or Status structs! make: *** [Makefile:41: build-controller] Error 1 ``` However, if I check out v0.19.3 of the code-generator, I get no errors: ``` [jaypipes@thelio code-generator]$ git checkout v0.19.3 Previous HEAD position was 585f06b Upgrade ACK runtime to v0.20.0 (#360) HEAD is now at 87477ae Merge pull request #356 from vijtrip2/ack-rt-v0.19.3 [jaypipes@thelio code-generator]$ make build-controller SERVICE=kinesis building ack-generate ... ok. ==== building kinesis-controller ==== Copying common custom resource definitions into kinesis Building Kubernetes API objects for kinesis Generating deepcopy code for kinesis Generating custom resource definitions for kinesis Building service controller for kinesis Generating RBAC manifests for kinesis Running gofmt against generated code for kinesis Updating additional GitHub repository maintenance files ==== building kinesis-controller release artifacts ==== Building release artifacts for kinesis-v0.0.0-non-release-version Generating common custom resource definitions Generating custom resource definitions for kinesis Generating RBAC manifests for kinesis ``` So, it seems that something was added to code-gen that breaks the custom operations configuration somehow.
1.0
Change in code-generator v0.20.0 breaks operation overrides - The [top part of kinesis-controller's `generator.yaml`](https://github.com/jaypipes/ack-kinesis-controller/blob/2dbf9a725282f612b5ca330c567927da362b480f/generator.yaml#L1-L9) file looks like this: ```yaml operations: # NOTE)jaypipes): According to # https://docs.aws.amazon.com/kinesis/latest/APIReference/API_DescribeStream.html, # the DescribeStream API should not be used. Instead, the # DescribeStreamSummary API and ListShards API should be used. DescribeStreamSummary: resource_name: Stream operation_type: READ_ONE output_wrapper_field_path: StreamDescriptionSummary ``` If I execute `make build-controller SERVICE=kinesis` with v0.20.0 of code-generator, I get a failure: ``` [jaypipes@thelio code-generator]$ make build-controller SERVICE=kinesis building ack-generate ... ok. installing controller-gen v0.7.0 ... ok. ==== building kinesis-controller ==== Copying common custom resource definitions into kinesis Building Kubernetes API objects for kinesis Generating deepcopy code for kinesis Generating custom resource definitions for kinesis Building service controller for kinesis Error: template: /home/jaypipes/go/src/github.com/aws-controllers-k8s/code-generator/templates/pkg/resource/sdk.go.tpl:73:3: executing "sdk_find_read_one" at <GoCodeRequiredFieldsMissingFromReadOneInput .CRD "r.ko" 1>: error calling GoCodeRequiredFieldsMissingFromReadOneInput: GENERATION FAILURE! there's a required field StreamName in Shape DescribeStreamInput that isn't in either the CR's Spec or Status structs! make: *** [Makefile:41: build-controller] Error 1 ``` However, if I check out v0.19.3 of the code-generator, I get no errors: ``` [jaypipes@thelio code-generator]$ git checkout v0.19.3 Previous HEAD position was 585f06b Upgrade ACK runtime to v0.20.0 (#360) HEAD is now at 87477ae Merge pull request #356 from vijtrip2/ack-rt-v0.19.3 [jaypipes@thelio code-generator]$ make build-controller SERVICE=kinesis building ack-generate ... ok. ==== building kinesis-controller ==== Copying common custom resource definitions into kinesis Building Kubernetes API objects for kinesis Generating deepcopy code for kinesis Generating custom resource definitions for kinesis Building service controller for kinesis Generating RBAC manifests for kinesis Running gofmt against generated code for kinesis Updating additional GitHub repository maintenance files ==== building kinesis-controller release artifacts ==== Building release artifacts for kinesis-v0.0.0-non-release-version Generating common custom resource definitions Generating custom resource definitions for kinesis Generating RBAC manifests for kinesis ``` So, it seems that something was added to code-gen that breaks the custom operations configuration somehow.
non_process
change in code generator breaks operation overrides the file looks like this yaml operations note jaypipes according to the describestream api should not be used instead the describestreamsummary api and listshards api should be used describestreamsummary resource name stream operation type read one output wrapper field path streamdescriptionsummary if i execute make build controller service kinesis with of code generator i get a failure make build controller service kinesis building ack generate ok installing controller gen ok building kinesis controller copying common custom resource definitions into kinesis building kubernetes api objects for kinesis generating deepcopy code for kinesis generating custom resource definitions for kinesis building service controller for kinesis error template home jaypipes go src github com aws controllers code generator templates pkg resource sdk go tpl executing sdk find read one at error calling gocoderequiredfieldsmissingfromreadoneinput generation failure there s a required field streamname in shape describestreaminput that isn t in either the cr s spec or status structs make error however if i check out of the code generator i get no errors git checkout previous head position was upgrade ack runtime to head is now at merge pull request from ack rt make build controller service kinesis building ack generate ok building kinesis controller copying common custom resource definitions into kinesis building kubernetes api objects for kinesis generating deepcopy code for kinesis generating custom resource definitions for kinesis building service controller for kinesis generating rbac manifests for kinesis running gofmt against generated code for kinesis updating additional github repository maintenance files building kinesis controller release artifacts building release artifacts for kinesis non release version generating common custom resource definitions generating custom resource definitions for kinesis generating rbac manifests for kinesis so it seems that something was added to code gen that breaks the custom operations configuration somehow
0
20,106
26,641,162,772
IssuesEvent
2023-01-25 05:03:59
clearvoyance/Clearvoyance
https://api.github.com/repos/clearvoyance/Clearvoyance
opened
Emergency Processing
emergency processing
Given JSON data from an emergency GET route, write a python function to convert it to a python dictionary and parse it to check if a relevant emergency alert exists. Return a filtered version of the data that only contains relevant emergencies, or return an empty dictionary if no relevant emergencies exist.
1.0
Emergency Processing - Given JSON data from an emergency GET route, write a python function to convert it to a python dictionary and parse it to check if a relevant emergency alert exists. Return a filtered version of the data that only contains relevant emergencies, or return an empty dictionary if no relevant emergencies exist.
process
emergency processing given json data from an emergency get route write a python function to convert it to a python dictionary and parse it to check if a relevant emergency alert exists return a filtered version of the data that only contains relevant emergencies or return an empty dictionary if no relevant emergencies exist
1
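The Emergency Processing record above asks for exactly this kind of helper. Below is a minimal sketch under stated assumptions: the GET route returns an object like {"alerts": [...]} and each alert carries a boolean "relevant" flag; the real schema is not given in the issue.

```python
# Minimal sketch of the requested helper. ASSUMPTIONS: the route returns
# {"alerts": [...]} and each alert has a boolean "relevant" flag; the
# issue does not specify the real schema.
import json

def filter_relevant_emergencies(raw: str) -> dict:
    """Parse emergency JSON; return only relevant alerts, or {} if none."""
    data = json.loads(raw)  # JSON text -> Python dictionary
    relevant = [a for a in data.get("alerts", []) if a.get("relevant")]
    return {"alerts": relevant} if relevant else {}

sample = '{"alerts": [{"id": 1, "relevant": true}, {"id": 2, "relevant": false}]}'
print(filter_relevant_emergencies(sample))
# -> {'alerts': [{'id': 1, 'relevant': True}]}
```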
202,508
15,833,303,216
IssuesEvent
2021-04-06 15:30:02
department-of-veterans-affairs/va.gov-team
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
closed
Platform Website — Create Confluence migration plan for VFS-facing docs
Epic VSP-Initiative content-ia-team documentation
## Problem Statement The C+IA team spent 2020 Q4 moving backend documentation into Confluence. If the C+IA team continues the "one practice area, one quarter" rate, we won't be able to unveil Confluence entirely to VFS teams until 2022. This would lead to VFS team confusion since there would be no single documentation "source of truth." To remove the C+IA team as a bottleneck, we are determining: * HMW scale documentation migration to Confluence? * HMW provide a skeleton, workable experience to VFS readers who consume all content types? * This is a separate initiative documented in [Documentation Site — Migrate Homepage MVP documentation to Confluence #17060](https://github.com/department-of-veterans-affairs/va.gov-team/issues/17060) ## Hypothesis or Bet If we leverage VSP teams to move _their own_ VFS-facing docs into Confluence, then the C+IA team will be able to focus on the holistic Confluence documentation experience. This ticket complements #17060, which describes the work involved in providing a skeleton, workable documentation experience. This ticket describes creating a plan for fleshing out the skeleton, ie migrating and performing QA on lower-priority documentation. ## We will know we're done when... ("Definition of Done") * [ ] We have created a plan that has the approval of each VSP team that owns VFS-facing documentation. * [ ] We have guidance around the placement. * [ ] We have guidance around moving the docs seamlessly. * [ ] Finalize release plan for Confluence documentation site. ## Known Blockers/Dependencies * Dependencies: Will have to communicate with VSP teams in order to create a migration plan ## Projected Launch Date * EOQ - March 2021 ## Launch Checklist ### Guidance (delete before posting) _This checklist is intended to be used to help answer, "is my VSP initiative ready for launch?". All of the items in this checklist should be completed, with artifacts linked---or have a brief explanation of why they've been skipped---before launching a given VSP initiative. All links or explanations can be provided in **Required Artifacts** sections. The items that can be skipped are marked as such._ _Keep in mind the distinction between **Product** and **Initiative** --- each Product needs specific supporting documentation, but Initiatives to improve existing Products should reuse existing documentation for that Product. [VSP Product Terminology](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/product-terminology.md) for details._ ### Is this service / tool / feature... ### ... tested? - [ ] Usability test (_TODO: link_) has been performed, to validate that new changes enable users to do what was intended and that these changes don't worsen quality elsewhere. If usability test isn't relevant for this change, document the reason for skipping it. - [ ] ... and issues discovered in usability testing have been addressed. * _Note on skipping: metrics that show the impact of before/after can be a substitute for usability testing._ - [ ] End-to-end [manual QA](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/quality-assurance/README.md) or [UAT](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/research/planning/what-is-uat.md) is complete, to validate there are no high-severity issues before launching - [ ] _(if applicable)_ New functionality has thorough, automated tests running in CI/CD ### ... documented? 
- [ ] New documentation is written pursuant to our [documentation style guide](https://github.com/department-of-veterans-affairs/va.gov-team/tree/master/platform/documentation/style-guide) - [ ] Product is included in the [List of VSP Products](https://docs.google.com/spreadsheets/d/1Fn2lD419WE3sTZJtN2Ensrjqaz0jH3WvLaBtn812Wjo/edit#gid=0) * _List the existing product that this initiative fits within, or add a new product to this list._ - [ ] Internal-facing: there's a [Product Outline](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/product-outline-template.md) checked into [`products/platform/PRODUCT_NAME/`](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/products/platform/) * _Note: the Product Directory Name should match 1:1 with the List of VSP Products_ - [ ] External-facing: a [VFS-facing README](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/product-readme-template.md) exists for this product/feature tool - [ ] ... and should be located at `platform/PRODUCT_NAME/README.md` - [ ] External-facing: a [User Guide](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/writing-user-guides.md) exists for this product/feature/tool, and is updated for changes from this initiative - [ ] ... and should be linked from the VFS-facing README for your product - [ ] ... and should be located within `platform/PRODUCT_NAME/`, unless you already have another location for it - [ ] _(if applicable)_... and post to [#vsp-content-ia](https://dsva.slack.com/channels/vsp-content-ia) about whether this should be added to the [Documentation homepage](https://department-of-veterans-affairs.github.io/va.gov-team/) - [ ] _(if applicable)_ Post to [#vsp-service-design](https://dsva.slack.com/channels/vsp-service-design) for external communication about this change (e.g. VSP Newsletter, customer-facing meetings) ### ... measurable - [ ] _(if applicable)_ This change has clearly-defined success metrics, with instrumentation of those analytics where possible, or a reason documented for skipping it. * For help, see: [Analytics team](https://github.com/department-of-veterans-affairs/va.gov-team/tree/master/platform/analytics) - [ ] This change has an accompanying [VSP Initiative Release Plan](https://github.com/department-of-veterans-affairs/va.gov-team/issues/new/choose). ## Required Artifacts ### Documentation * **`PRODUCT_NAME`**: [/products/platform/platform-documentaiton](https://github.com/department-of-veterans-affairs/va.gov-team/tree/master/products/platform/platform-documentation) * **Product Outline**: [Product outline](https://github.com/department-of-veterans-affairs/va.gov-team/issues/17377) * **README**: _link to VFS-facing README for your product_ * **User Guide**: _link to User Guide_ ### Testing * **Usability test**: _link to GitHub issue, or provide reason for skipping_ * **Manual QA**: _link to GitHub issue or documented results_ * **Automated tests**: _link to tests, or "N/A"_ ### Measurement * **Success metrics**: * VSP teams understand the roles they will play in migrating the content that they own * **Release plan**: [link to Release Plan ticket](https://github.com/department-of-veterans-affairs/va.gov-team/issues/17377) ## TODOs - [x] Convert this issue to an epic - [x] Add your team's label to this epic
1.0
Platform Website — Create Confluence migration plan for VFS-facing docs - ## Problem Statement The C+IA team spent 2020 Q4 moving backend documentation into Confluence. If the C+IA team continues the "one practice area, one quarter" rate, we won't be able to unveil Confluence entirely to VFS teams until 2022. This would lead to VFS team confusion since there would be no single documentation "source of truth." To remove the C+IA team as a bottleneck, we are determining: * HMW scale documentation migration to Confluence? * HMW provide a skeleton, workable experience to VFS readers who consume all content types? * This is a separate initiative documented in [Documentation Site — Migrate Homepage MVP documentation to Confluence #17060](https://github.com/department-of-veterans-affairs/va.gov-team/issues/17060) ## Hypothesis or Bet If we leverage VSP teams to move _their own_ VFS-facing docs into Confluence, then the C+IA team will be able to focus on the holistic Confluence documentation experience. This ticket complements #17060, which describes the work involved in providing a skeleton, workable documentation experience. This ticket describes creating a plan for fleshing out the skeleton, ie migrating and performing QA on lower-priority documentation. ## We will know we're done when... ("Definition of Done") * [ ] We have created a plan that has the approval of each VSP team that owns VFS-facing documentation. * [ ] We have guidance around the placement. * [ ] We have guidance around moving the docs seamlessly. * [ ] Finalize release plan for Confluence documentation site. ## Known Blockers/Dependencies * Dependencies: Will have to communicate with VSP teams in order to create a migration plan ## Projected Launch Date * EOQ - March 2021 ## Launch Checklist ### Guidance (delete before posting) _This checklist is intended to be used to help answer, "is my VSP initiative ready for launch?". All of the items in this checklist should be completed, with artifacts linked---or have a brief explanation of why they've been skipped---before launching a given VSP initiative. All links or explanations can be provided in **Required Artifacts** sections. The items that can be skipped are marked as such._ _Keep in mind the distinction between **Product** and **Initiative** --- each Product needs specific supporting documentation, but Initiatives to improve existing Products should reuse existing documentation for that Product. [VSP Product Terminology](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/product-terminology.md) for details._ ### Is this service / tool / feature... ### ... tested? - [ ] Usability test (_TODO: link_) has been performed, to validate that new changes enable users to do what was intended and that these changes don't worsen quality elsewhere. If usability test isn't relevant for this change, document the reason for skipping it. - [ ] ... and issues discovered in usability testing have been addressed. 
* _Note on skipping: metrics that show the impact of before/after can be a substitute for usability testing._ - [ ] End-to-end [manual QA](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/quality-assurance/README.md) or [UAT](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/research/planning/what-is-uat.md) is complete, to validate there are no high-severity issues before launching - [ ] _(if applicable)_ New functionality has thorough, automated tests running in CI/CD ### ... documented? - [ ] New documentation is written pursuant to our [documentation style guide](https://github.com/department-of-veterans-affairs/va.gov-team/tree/master/platform/documentation/style-guide) - [ ] Product is included in the [List of VSP Products](https://docs.google.com/spreadsheets/d/1Fn2lD419WE3sTZJtN2Ensrjqaz0jH3WvLaBtn812Wjo/edit#gid=0) * _List the existing product that this initiative fits within, or add a new product to this list._ - [ ] Internal-facing: there's a [Product Outline](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/product-outline-template.md) checked into [`products/platform/PRODUCT_NAME/`](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/products/platform/) * _Note: the Product Directory Name should match 1:1 with the List of VSP Products_ - [ ] External-facing: a [VFS-facing README](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/product-readme-template.md) exists for this product/feature tool - [ ] ... and should be located at `platform/PRODUCT_NAME/README.md` - [ ] External-facing: a [User Guide](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/teams/vsp/product-management/writing-user-guides.md) exists for this product/feature/tool, and is updated for changes from this initiative - [ ] ... and should be linked from the VFS-facing README for your product - [ ] ... and should be located within `platform/PRODUCT_NAME/`, unless you already have another location for it - [ ] _(if applicable)_... and post to [#vsp-content-ia](https://dsva.slack.com/channels/vsp-content-ia) about whether this should be added to the [Documentation homepage](https://department-of-veterans-affairs.github.io/va.gov-team/) - [ ] _(if applicable)_ Post to [#vsp-service-design](https://dsva.slack.com/channels/vsp-service-design) for external communication about this change (e.g. VSP Newsletter, customer-facing meetings) ### ... measurable - [ ] _(if applicable)_ This change has clearly-defined success metrics, with instrumentation of those analytics where possible, or a reason documented for skipping it. * For help, see: [Analytics team](https://github.com/department-of-veterans-affairs/va.gov-team/tree/master/platform/analytics) - [ ] This change has an accompanying [VSP Initiative Release Plan](https://github.com/department-of-veterans-affairs/va.gov-team/issues/new/choose). 
## Required Artifacts ### Documentation * **`PRODUCT_NAME`**: [/products/platform/platform-documentaiton](https://github.com/department-of-veterans-affairs/va.gov-team/tree/master/products/platform/platform-documentation) * **Product Outline**: [Product outline](https://github.com/department-of-veterans-affairs/va.gov-team/issues/17377) * **README**: _link to VFS-facing README for your product_ * **User Guide**: _link to User Guide_ ### Testing * **Usability test**: _link to GitHub issue, or provide reason for skipping_ * **Manual QA**: _link to GitHub issue or documented results_ * **Automated tests**: _link to tests, or "N/A"_ ### Measurement * **Success metrics**: * VSP teams understand the roles they will play in migrating the content that they own * **Release plan**: [link to Release Plan ticket](https://github.com/department-of-veterans-affairs/va.gov-team/issues/17377) ## TODOs - [x] Convert this issue to an epic - [x] Add your team's label to this epic
non_process
platform website — create confluence migration plan for vfs facing docs problem statement the c ia team spent moving backend documentation into confluence if the c ia team continues the one practice area one quarter rate we won t be able to unveil confluence entirely to vfs teams until this would lead to vfs team confusion since there would be no single documentation source of truth to remove the c ia team as a bottleneck we are determining hmw scale documentation migration to confluence hmw provide a skeleton workable experience to vfs readers who consume all content types this is a separate initiative documented in hypothesis or bet if we leverage vsp teams to move their own vfs facing docs into confluence then the c ia team will be able to focus on the holistic confluence documentation experience this ticket complements which describes the work involved in providing a skeleton workable documentation experience this ticket describes creating a plan for fleshing out the skeleton ie migrating and performing qa on lower priority documentation we will know we re done when definition of done we have created a plan that has the approval of each vsp team that owns vfs facing documentation we have guidance around the placement we have guidance around moving the docs seamlessly finalize release plan for confluence documentation site known blockers dependencies dependencies will have to communicate with vsp teams in order to create a migration plan projected launch date eoq march launch checklist guidance delete before posting this checklist is intended to be used to help answer is my vsp initiative ready for launch all of the items in this checklist should be completed with artifacts linked or have a brief explanation of why they ve been skipped before launching a given vsp initiative all links or explanations can be provided in required artifacts sections the items that can be skipped are marked as such keep in mind the distinction between product and initiative each product needs specific supporting documentation but initiatives to improve existing products should reuse existing documentation for that product for details is this service tool feature tested usability test todo link has been performed to validate that new changes enable users to do what was intended and that these changes don t worsen quality elsewhere if usability test isn t relevant for this change document the reason for skipping it and issues discovered in usability testing have been addressed note on skipping metrics that show the impact of before after can be a substitute for usability testing end to end or is complete to validate there are no high severity issues before launching if applicable new functionality has thorough automated tests running in ci cd documented new documentation is written pursuant to our product is included in the list the existing product that this initiative fits within or add a new product to this list internal facing there s a checked into note the product directory name should match with the list of vsp products external facing a exists for this product feature tool and should be located at platform product name readme md external facing a exists for this product feature tool and is updated for changes from this initiative and should be linked from the vfs facing readme for your product and should be located within platform product name unless you already have another location for it if applicable and post to about whether this should be added to the if applicable post to for external 
communication about this change e g vsp newsletter customer facing meetings measurable if applicable this change has clearly defined success metrics with instrumentation of those analytics where possible or a reason documented for skipping it for help see this change has an accompanying required artifacts documentation product name product outline readme link to vfs facing readme for your product user guide link to user guide testing usability test link to github issue or provide reason for skipping manual qa link to github issue or documented results automated tests link to tests or n a measurement success metrics vsp teams understand the roles they will play in migrating the content that they own release plan todos convert this issue to an epic add your team s label to this epic
0
28,463
12,854,262,933
IssuesEvent
2020-07-09 01:22:47
trilinos/Trilinos
https://api.github.com/repos/trilinos/Trilinos
closed
SEACAS: localtime_s and gmtime_s not defined
PA: Data Services pkg: seacas type: bug
@trilinos/seacas ### Description I get the error ``` In file included from /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/Ioss_Utils.C(21): /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/fmt/chrono.h(62): error: identifier "localtime_s" is undefined return fallback(localtime_s(&tm_, &time_)); ^ In file included from /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/Ioss_Utils.C(21): /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/fmt/chrono.h(106): error: identifier "gmtime_s" is undefined return fallback(gmtime_s(&tm_, &time_)); ``` when compiling with intel compiler and SEMS modules on skybridge. I need to build Albany LandIce using compiler/modules provided by E3SM ### Steps to Reproduce 1. SHA1: 11a3113b7d397c770efebf3c309457b4783bba67 modules loaded: ``` 1) sems-env 3) sems-git/2.10.1 5) sems-cmake/3.12.2 7) intel/17.0 9) sems-intel/17.0.0 11) sems-openmpi/1.10.5 13) acme-netcdf/4.7.4/acme 2) acme-env 4) sems-python/2.7.9 6) gnu/4.9.2 8) mkl/17.0 10) openmpi-intel/1.10 12) acme-hdf5/1.12.0/acme ``` 1. Configure script: ``` TRILINSTALLDIR="/projects/ccsm/AlbanyTrilinos_20200702/trilinos-build/install" BOOST_DIR="/usr/include/boost" rm -fr CMake* cmake -D CMAKE_INSTALL_PREFIX:PATH=$TRILINSTALLDIR \ -D CMAKE_BUILD_TYPE:STRING=RELEASE \ -D Trilinos_ENABLE_ALL_PACKAGES:BOOL=OFF \ -D Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=OFF \ -D Trilinos_ENABLE_Epetra:BOOL=ON \ -D Trilinos_ENABLE_EpetraExt:BOOL=ON \ -D Trilinos_ENABLE_Tpetra:BOOL=ON \ -D Trilinos_ENABLE_Ifpack:BOOL=ON \ -D Trilinos_ENABLE_Ifpack2:BOOL=ON \ -D Amesos2_ENABLE_KLU2:BOOL=ON \ -D Trilinos_ENABLE_AztecOO:BOOL=ON \ -D Trilinos_ENABLE_Amesos:BOOL=ON \ -D Trilinos_ENABLE_Amesos2:BOOL=ON \ -D Trilinos_ENABLE_Belos:BOOL=ON \ -D Trilinos_ENABLE_Phalanx:BOOL=ON \ -D Trilinos_ENABLE_ML:BOOL=ON \ -D Trilinos_ENABLE_MueLu:BOOL=ON \ -D Trilinos_ENABLE_Stratimikos:BOOL=ON \ -D Kokkos_ENABLE_Serial:BOOL=ON \ -D Kokkos_ENABLE_OpenMP:BOOL=OFF \ -D Trilinos_ENABLE_EXPLICIT_INSTANTIATION:BOOL=ON \ -D Tpetra_INST_INT_LONG_LONG:BOOL=ON \ -D Tpetra_INST_INT_INT:BOOL=OFF \ \ -D TPL_ENABLE_Netcdf:BOOL=ON \ -D TPL_Netcdf_INCLUDE_DIRS:PATH="${SEMS_NETCDF_INCLUDE_PATH}" \ -D TPL_Netcdf_LIBRARY_DIRS:PATH="${SEMS_NETCDF_LIBRARY_PATH}" \ -D Trilinos_ENABLE_Intrepid2:BOOL=ON \ -D Trilinos_ENABLE_NOX:BOOL=ON \ -D Trilinos_ENABLE_Piro:BOOL=ON \ -D Boost_INCLUDE_DIRS:FILEPATH="${BOOST_DIR}" \ -D TPL_ENABLE_Boost:BOOL=ON \ -D TPL_ENABLE_MPI:BOOL=ON \ -D Trilinos_ENABLE_STKIO:BOOL=ON \ -D Trilinos_ENABLE_STKMesh:BOOL=ON \ -D Xpetra_ENABLE_Epetra=OFF \ -D MueLu_ENABLE_Epetra=OFF \ -D Belos_ENABLE_Epetra=OFF \ \ -D CMAKE_EXE_LINKER_FLAGS:STRING='-static' \ -D CMAKE_C_FLAGS:STRING="-mkl -O3 -DREDUCE_SCATTER_BUG" \ -D CMAKE_CXX_FLAGS:STRING="-mkl -O3 -std=c++11 -DREDUCE_SCATTER_BUG -DBOOST_NO_HASH" \ -D CMAKE_Fortran_FLAGS:STRING="-mkl" \ -D CMAKE_EXE_LINKER_FLAGS="-mkl -ldl" \ \ ../trilinos-src ``` Thanks! ```
1.0
SEACAS: localtime_s and gmtime_s not defined - @trilinos/seacas ### Description I get the error ``` In file included from /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/Ioss_Utils.C(21): /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/fmt/chrono.h(62): error: identifier "localtime_s" is undefined return fallback(localtime_s(&tm_, &time_)); ^ In file included from /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/Ioss_Utils.C(21): /projects/ccsm/AlbanyTrilinos_20200702/trilinos/trilinos-src/packages/seacas/libraries/ioss/src/fmt/chrono.h(106): error: identifier "gmtime_s" is undefined return fallback(gmtime_s(&tm_, &time_)); ``` when compiling with intel compiler and SEMS modules on skybridge. I need to build Albany LandIce using compiler/modules provided by E3SM ### Steps to Reproduce 1. SHA1: 11a3113b7d397c770efebf3c309457b4783bba67 modules loaded: ``` 1) sems-env 3) sems-git/2.10.1 5) sems-cmake/3.12.2 7) intel/17.0 9) sems-intel/17.0.0 11) sems-openmpi/1.10.5 13) acme-netcdf/4.7.4/acme 2) acme-env 4) sems-python/2.7.9 6) gnu/4.9.2 8) mkl/17.0 10) openmpi-intel/1.10 12) acme-hdf5/1.12.0/acme ``` 1. Configure script: ``` TRILINSTALLDIR="/projects/ccsm/AlbanyTrilinos_20200702/trilinos-build/install" BOOST_DIR="/usr/include/boost" rm -fr CMake* cmake -D CMAKE_INSTALL_PREFIX:PATH=$TRILINSTALLDIR \ -D CMAKE_BUILD_TYPE:STRING=RELEASE \ -D Trilinos_ENABLE_ALL_PACKAGES:BOOL=OFF \ -D Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=OFF \ -D Trilinos_ENABLE_Epetra:BOOL=ON \ -D Trilinos_ENABLE_EpetraExt:BOOL=ON \ -D Trilinos_ENABLE_Tpetra:BOOL=ON \ -D Trilinos_ENABLE_Ifpack:BOOL=ON \ -D Trilinos_ENABLE_Ifpack2:BOOL=ON \ -D Amesos2_ENABLE_KLU2:BOOL=ON \ -D Trilinos_ENABLE_AztecOO:BOOL=ON \ -D Trilinos_ENABLE_Amesos:BOOL=ON \ -D Trilinos_ENABLE_Amesos2:BOOL=ON \ -D Trilinos_ENABLE_Belos:BOOL=ON \ -D Trilinos_ENABLE_Phalanx:BOOL=ON \ -D Trilinos_ENABLE_ML:BOOL=ON \ -D Trilinos_ENABLE_MueLu:BOOL=ON \ -D Trilinos_ENABLE_Stratimikos:BOOL=ON \ -D Kokkos_ENABLE_Serial:BOOL=ON \ -D Kokkos_ENABLE_OpenMP:BOOL=OFF \ -D Trilinos_ENABLE_EXPLICIT_INSTANTIATION:BOOL=ON \ -D Tpetra_INST_INT_LONG_LONG:BOOL=ON \ -D Tpetra_INST_INT_INT:BOOL=OFF \ \ -D TPL_ENABLE_Netcdf:BOOL=ON \ -D TPL_Netcdf_INCLUDE_DIRS:PATH="${SEMS_NETCDF_INCLUDE_PATH}" \ -D TPL_Netcdf_LIBRARY_DIRS:PATH="${SEMS_NETCDF_LIBRARY_PATH}" \ -D Trilinos_ENABLE_Intrepid2:BOOL=ON \ -D Trilinos_ENABLE_NOX:BOOL=ON \ -D Trilinos_ENABLE_Piro:BOOL=ON \ -D Boost_INCLUDE_DIRS:FILEPATH="${BOOST_DIR}" \ -D TPL_ENABLE_Boost:BOOL=ON \ -D TPL_ENABLE_MPI:BOOL=ON \ -D Trilinos_ENABLE_STKIO:BOOL=ON \ -D Trilinos_ENABLE_STKMesh:BOOL=ON \ -D Xpetra_ENABLE_Epetra=OFF \ -D MueLu_ENABLE_Epetra=OFF \ -D Belos_ENABLE_Epetra=OFF \ \ -D CMAKE_EXE_LINKER_FLAGS:STRING='-static' \ -D CMAKE_C_FLAGS:STRING="-mkl -O3 -DREDUCE_SCATTER_BUG" \ -D CMAKE_CXX_FLAGS:STRING="-mkl -O3 -std=c++11 -DREDUCE_SCATTER_BUG -DBOOST_NO_HASH" \ -D CMAKE_Fortran_FLAGS:STRING="-mkl" \ -D CMAKE_EXE_LINKER_FLAGS="-mkl -ldl" \ \ ../trilinos-src ``` Thanks! ```
non_process
seacas localtime s and gmtime s not defined trilinos seacas description i get the error in file included from projects ccsm albanytrilinos trilinos trilinos src packages seacas libraries ioss src ioss utils c projects ccsm albanytrilinos trilinos trilinos src packages seacas libraries ioss src fmt chrono h error identifier localtime s is undefined return fallback localtime s tm time in file included from projects ccsm albanytrilinos trilinos trilinos src packages seacas libraries ioss src ioss utils c projects ccsm albanytrilinos trilinos trilinos src packages seacas libraries ioss src fmt chrono h error identifier gmtime s is undefined return fallback gmtime s tm time when compiling with intel compiler and sems modules on skybridge i need to build albany landice using compiler modules provided by steps to reproduce modules loaded sems env sems git sems cmake intel sems intel sems openmpi acme netcdf acme acme env sems python gnu mkl openmpi intel acme acme configure script trilinstalldir projects ccsm albanytrilinos trilinos build install boost dir usr include boost rm fr cmake cmake d cmake install prefix path trilinstalldir d cmake build type string release d trilinos enable all packages bool off d trilinos enable all optional packages bool off d trilinos enable epetra bool on d trilinos enable epetraext bool on d trilinos enable tpetra bool on d trilinos enable ifpack bool on d trilinos enable bool on d enable bool on d trilinos enable aztecoo bool on d trilinos enable amesos bool on d trilinos enable bool on d trilinos enable belos bool on d trilinos enable phalanx bool on d trilinos enable ml bool on d trilinos enable muelu bool on d trilinos enable stratimikos bool on d kokkos enable serial bool on d kokkos enable openmp bool off d trilinos enable explicit instantiation bool on d tpetra inst int long long bool on d tpetra inst int int bool off d tpl enable netcdf bool on d tpl netcdf include dirs path sems netcdf include path d tpl netcdf library dirs path sems netcdf library path d trilinos enable bool on d trilinos enable nox bool on d trilinos enable piro bool on d boost include dirs filepath boost dir d tpl enable boost bool on d tpl enable mpi bool on d trilinos enable stkio bool on d trilinos enable stkmesh bool on d xpetra enable epetra off d muelu enable epetra off d belos enable epetra off d cmake exe linker flags string static d cmake c flags string mkl dreduce scatter bug d cmake cxx flags string mkl std c dreduce scatter bug dboost no hash d cmake fortran flags string mkl d cmake exe linker flags mkl ldl trilinos src thanks
0
74,413
9,039,990,642
IssuesEvent
2019-02-10 12:33:23
nextcloud/polls
https://api.github.com/repos/nextcloud/polls
closed
Polls overview - Don't show red icon "you did not comment"
1. to develop design
At the moment a red comment icon is shown in the polls home / overview if the user has not commented on the poll. Usually a red sign indicates something is wrong or input / action is needed, which seems misleading in the context of comments. Proposal: Only show the green (could then also be neutral black/white or grey) comment icon if a comment was made. Do not show the red icon at all.
1.0
Polls overview - Don't show red icon "you did not comment" - At the moment a red comment icon is shown in the polls home / overview if the user has not commented on the poll. Usually a red sign indicates something is wrong or input / action is needed, which seems misleading in the context of comments. Proposal: Only show the green (could then also be neutral black/white or grey) comment icon if a comment was made. Do not show the red icon at all.
non_process
polls overview don t show red icon you did not comment at the moment a red comment icon is shown in the polls home overview if the user has not commented on the poll usually a red sign indicates something is wrong or input action is needed which seems misleading in the context of comments proposal only show the green could then also be neutral black white or grey comment icon if a comment was made do not show the red icon at all
0
251,631
27,191,223,394
IssuesEvent
2023-02-19 20:29:16
WFS-Mend/vtrade-api
https://api.github.com/repos/WFS-Mend/vtrade-api
opened
spring-boot-starter-test-2.6.1.jar: 2 vulnerabilities (highest severity is: 5.3)
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-test-2.6.1.jar</b></p></summary> <p></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-core/5.3.13/d2a6c3372dd337e08144f9f49f386b8ec7a8080d/spring-core-5.3.13.jar</p> <p> <p>Found in HEAD commit: <a href="https://github.com/WFS-Mend/vtrade-api/commit/4e3c1ff743df760c9f96386a86706d36870f5fa5">4e3c1ff743df760c9f96386a86706d36870f5fa5</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (spring-boot-starter-test version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-22970](https://www.mend.io/vulnerability-database/CVE-2022-22970) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | spring-core-5.3.13.jar | Transitive | 2.6.8 | &#9989; | | [CVE-2021-22060](https://www.mend.io/vulnerability-database/CVE-2021-22060) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.3 | spring-core-5.3.13.jar | Transitive | 2.6.2 | &#9989; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-22970</summary> ### Vulnerable Library - <b>spring-core-5.3.13.jar</b></p> <p>Spring Core</p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-core/5.3.13/d2a6c3372dd337e08144f9f49f386b8ec7a8080d/spring-core-5.3.13.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-test-2.6.1.jar (Root Library) - :x: **spring-core-5.3.13.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/WFS-Mend/vtrade-api/commit/4e3c1ff743df760c9f96386a86706d36870f5fa5">4e3c1ff743df760c9f96386a86706d36870f5fa5</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> In spring framework versions prior to 5.3.20+ , 5.2.22+ and old unsupported versions, applications that handle file uploads are vulnerable to DoS attack if they rely on data binding to set a MultipartFile or javax.servlet.Part to a field in a model object. <p>Publish Date: 2022-05-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-22970>CVE-2022-22970</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>5.3</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22970">https://tanzu.vmware.com/security/cve-2022-22970</a></p> <p>Release Date: 2022-05-12</p> <p>Fix Resolution (org.springframework:spring-core): 5.3.20</p> <p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-test): 2.6.8</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details><details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-22060</summary> ### Vulnerable Library - <b>spring-core-5.3.13.jar</b></p> <p>Spring Core</p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-core/5.3.13/d2a6c3372dd337e08144f9f49f386b8ec7a8080d/spring-core-5.3.13.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-test-2.6.1.jar (Root Library) - :x: **spring-core-5.3.13.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/WFS-Mend/vtrade-api/commit/4e3c1ff743df760c9f96386a86706d36870f5fa5">4e3c1ff743df760c9f96386a86706d36870f5fa5</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> In Spring Framework versions 5.3.0 - 5.3.13, 5.2.0 - 5.2.18, and older unsupported versions, it is possible for a user to provide malicious input to cause the insertion of additional log entries. This is a follow-up to CVE-2021-22096 that protects against additional types of input and in more places of the Spring Framework codebase. <p>Publish Date: 2022-01-10 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-22060>CVE-2021-22060</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>4.3</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-6gf2-pvqw-37ph">https://github.com/advisories/GHSA-6gf2-pvqw-37ph</a></p> <p>Release Date: 2022-01-10</p> <p>Fix Resolution (org.springframework:spring-core): 5.3.14</p> <p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-test): 2.6.2</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details> *** <p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
True
spring-boot-starter-test-2.6.1.jar: 2 vulnerabilities (highest severity is: 5.3) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-test-2.6.1.jar</b></p></summary> <p></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-core/5.3.13/d2a6c3372dd337e08144f9f49f386b8ec7a8080d/spring-core-5.3.13.jar</p> <p> <p>Found in HEAD commit: <a href="https://github.com/WFS-Mend/vtrade-api/commit/4e3c1ff743df760c9f96386a86706d36870f5fa5">4e3c1ff743df760c9f96386a86706d36870f5fa5</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (spring-boot-starter-test version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-22970](https://www.mend.io/vulnerability-database/CVE-2022-22970) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | spring-core-5.3.13.jar | Transitive | 2.6.8 | &#9989; | | [CVE-2021-22060](https://www.mend.io/vulnerability-database/CVE-2021-22060) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.3 | spring-core-5.3.13.jar | Transitive | 2.6.2 | &#9989; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-22970</summary> ### Vulnerable Library - <b>spring-core-5.3.13.jar</b></p> <p>Spring Core</p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-core/5.3.13/d2a6c3372dd337e08144f9f49f386b8ec7a8080d/spring-core-5.3.13.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-test-2.6.1.jar (Root Library) - :x: **spring-core-5.3.13.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/WFS-Mend/vtrade-api/commit/4e3c1ff743df760c9f96386a86706d36870f5fa5">4e3c1ff743df760c9f96386a86706d36870f5fa5</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> In spring framework versions prior to 5.3.20+ , 5.2.22+ and old unsupported versions, applications that handle file uploads are vulnerable to DoS attack if they rely on data binding to set a MultipartFile or javax.servlet.Part to a field in a model object. <p>Publish Date: 2022-05-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-22970>CVE-2022-22970</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>5.3</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22970">https://tanzu.vmware.com/security/cve-2022-22970</a></p> <p>Release Date: 2022-05-12</p> <p>Fix Resolution (org.springframework:spring-core): 5.3.20</p> <p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-test): 2.6.8</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details><details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-22060</summary> ### Vulnerable Library - <b>spring-core-5.3.13.jar</b></p> <p>Spring Core</p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-core/5.3.13/d2a6c3372dd337e08144f9f49f386b8ec7a8080d/spring-core-5.3.13.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-test-2.6.1.jar (Root Library) - :x: **spring-core-5.3.13.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/WFS-Mend/vtrade-api/commit/4e3c1ff743df760c9f96386a86706d36870f5fa5">4e3c1ff743df760c9f96386a86706d36870f5fa5</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> In Spring Framework versions 5.3.0 - 5.3.13, 5.2.0 - 5.2.18, and older unsupported versions, it is possible for a user to provide malicious input to cause the insertion of additional log entries. This is a follow-up to CVE-2021-22096 that protects against additional types of input and in more places of the Spring Framework codebase. <p>Publish Date: 2022-01-10 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-22060>CVE-2021-22060</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>4.3</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-6gf2-pvqw-37ph">https://github.com/advisories/GHSA-6gf2-pvqw-37ph</a></p> <p>Release Date: 2022-01-10</p> <p>Fix Resolution (org.springframework:spring-core): 5.3.14</p> <p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-test): 2.6.2</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details> *** <p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
non_process
spring boot starter test jar vulnerabilities highest severity is vulnerable library spring boot starter test jar path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files org springframework spring core spring core jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in spring boot starter test version remediation available medium spring core jar transitive medium spring core jar transitive details cve vulnerable library spring core jar spring core path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files org springframework spring core spring core jar dependency hierarchy spring boot starter test jar root library x spring core jar vulnerable library found in head commit a href found in base branch main vulnerability details in spring framework versions prior to and old unsupported versions applications that handle file uploads are vulnerable to dos attack if they rely on data binding to set a multipartfile or javax servlet part to a field in a model object publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring core direct dependency fix resolution org springframework boot spring boot starter test rescue worker helmet automatic remediation is available for this issue cve vulnerable library spring core jar spring core path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files org springframework spring core spring core jar dependency hierarchy spring boot starter test jar root library x spring core jar vulnerable library found in head commit a href found in base branch main vulnerability details in spring framework versions and older unsupported versions it is possible for a user to provide malicious input to cause the insertion of additional log entries this is a follow up to cve that protects against additional types of input and in more places of the spring framework codebase publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring core direct dependency fix resolution org springframework boot spring boot starter test rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
0
10,189
13,044,162,868
IssuesEvent
2020-07-29 03:47:37
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `GreatestString` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `GreatestString` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @andylokandy ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
2.0
UCP: Migrate scalar function `GreatestString` from TiDB - ## Description Port the scalar function `GreatestString` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @andylokandy ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
process
ucp migrate scalar function greateststring from tidb description port the scalar function greateststring from tidb to coprocessor score mentor s andylokandy recommended skills rust programming learning materials already implemented expressions ported from tidb
1
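Editor's note on the record above: the actual port belongs in TiKV's Rust RPN expression framework (the `rpn_expr` directory linked in the issue), but the expected semantics can be pinned down independently. Below is a minimal Python model of MySQL/TiDB `GREATEST()` over strings, which is what `GreatestString` implements: NULL if any argument is NULL, otherwise the maximum argument. The function name and the use of plain binary comparison (ignoring collations) are simplifying assumptions of this sketch, not TiKV code.

```python
from typing import Optional, Sequence

def greatest_string(args: Sequence[Optional[str]]) -> Optional[str]:
    """Model of MySQL/TiDB GREATEST() over string arguments:
    NULL-in -> NULL-out, otherwise the maximum argument.
    Uses binary comparison; a real port must honor collations."""
    if any(a is None for a in args):
        return None
    return max(args)

# Expected behavior, per MySQL's GREATEST():
assert greatest_string(["a", "c", "b"]) == "c"
assert greatest_string(["a", None, "b"]) is None
```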
27,667
13,348,805,639
IssuesEvent
2020-08-29 20:33:22
OpenRA/OpenRA
https://api.github.com/repos/OpenRA/OpenRA
closed
AI can't recognize islands and check for pathability so it lags (solution)
AI Performance
The AI currently doesn't recognize islands or water boundaries: if you add a normal rush or turtle AI, the game lags under the vast number of ground units that keep seeking unreachable targets. A solution could live on the player starting position: an option to restrict production to naval and air units only. An example of the idea, as properties/settings on the player starting position (in the map editor): 1) is island: true/false (the AI stops or engages ground-unit production, and uses any ground units only for defense) 2) is island bridge: true/false (the AI builds both ground and naval units, BUT the ground units will not try to engage while the bridge is destroyed; see below) 3) island bridge link: bridge01,bridge02,bridge04 (tells the AI which bridges ground units can cross) 4) player path link: playerposition1,playerposition2 etc. (the AI will engage an enemy player with ground units only if that player is on the same island, or on a group of islands linked by a bridge) So the AI checks whether its starting position is an island, then whether any of the bridges are intact, and then whether the enemy can be reached with ground units. Names like bridge01 and playerposition1 are placeholders; I don't remember the actual names. Final words: this is just one way of doing it, by toggling things on and off. I don't know whether it is the best way, but this is the general idea. I am not a programmer, so if someone can filter this idea down and make it work, it will work.
True
AI can't recognize islands and check for pathability so it lags (solution) - The AI currently doesn't recognize islands or water boundaries: if you add a normal rush or turtle AI, the game lags under the vast number of ground units that keep seeking unreachable targets. A solution could live on the player starting position: an option to restrict production to naval and air units only. An example of the idea, as properties/settings on the player starting position (in the map editor): 1) is island: true/false (the AI stops or engages ground-unit production, and uses any ground units only for defense) 2) is island bridge: true/false (the AI builds both ground and naval units, BUT the ground units will not try to engage while the bridge is destroyed; see below) 3) island bridge link: bridge01,bridge02,bridge04 (tells the AI which bridges ground units can cross) 4) player path link: playerposition1,playerposition2 etc. (the AI will engage an enemy player with ground units only if that player is on the same island, or on a group of islands linked by a bridge) So the AI checks whether its starting position is an island, then whether any of the bridges are intact, and then whether the enemy can be reached with ground units. Names like bridge01 and playerposition1 are placeholders; I don't remember the actual names. Final words: this is just one way of doing it, by toggling things on and off. I don't know whether it is the best way, but this is the general idea. I am not a programmer, so if someone can filter this idea down and make it work, it will work.
non_process
ai cant recognize islands and check for pathability so it lags solution the ai right now don t recognize islands or water limits if you put a normal rush or turtle ai the game will lag from the vast numbers of the ground units that seek unreachable targets a solution could be on the player starting position an option can be made to toggle land unit production to only naval and air units an example of my idea on player starting position properties settings in map editor is island true false the ai will stop or engage ground unit production and use the units if any only for defense is island bridge true false the ai will build both ground and naval units but the ground units will not try to engage with the bridge destroyed see below island bridge link this will tell the ai what bridges can be passed from ground units player path link ect with this the ai will engage if enemy the player that is on the same island or group of island linked with a bridge using the ground units so the ai will check if the the starting position is an island then it will check if any the bridges are working and then will check if the enemy can be reached with ground units the names like ect are imaginary i don t remember the actual names final words this is just a way toggling off and on things i dont know if this is the best way to do it but this is the general idea i am not a programmer so if some one can filter out this idea and make it work then it will work
0
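Editor's note on the record above: the reachability check the reporter asks for can also be computed rather than hand-annotated per map. A minimal sketch, assuming a 2D boolean grid of ground passability (all names are hypothetical, not OpenRA API): flood-fill connected land regions once per map, and again whenever a bridge is built or destroyed, then let the AI send ground units only at targets in its own region.

```python
from collections import deque

def label_land_regions(passable):
    """Flood-fill connected components over a 2D grid of booleans
    (True = ground-passable).  Returns a grid of region ids, -1 for
    impassable cells.  Two cells are mutually reachable by ground
    units iff they share a region id."""
    h, w = len(passable), len(passable[0])
    region = [[-1] * w for _ in range(h)]
    next_id = 0
    for y in range(h):
        for x in range(w):
            if passable[y][x] and region[y][x] == -1:
                q = deque([(y, x)])
                region[y][x] = next_id
                while q:  # breadth-first fill of one land mass
                    cy, cx = q.popleft()
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and passable[ny][nx] and region[ny][nx] == -1:
                            region[ny][nx] = next_id
                            q.append((ny, nx))
                next_id += 1
    return region

def can_reach_by_ground(region, a, b):
    """a, b are (x, y) positions; True iff a ground path can exist."""
    return region[a[1]][a[0]] != -1 and region[a[1]][a[0]] == region[b[1]][b[0]]
```

This trades the proposal's per-map `is island` / `bridge link` annotations for an automatic check: a bridge is simply a run of passable cells that joins two regions, so destroying it and re-running the fill splits them again.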
179,095
13,823,069,579
IssuesEvent
2020-10-13 06:25:50
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
roachtest: sqlsmith/setup=empty/setting=no-mutations failed
C-test-failure O-roachtest O-robot branch-release-20.2 release-blocker
[(roachtest).sqlsmith/setup=empty/setting=no-mutations failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2359532&tab=buildLog) on [release-20.2@8c603a116644b518ee79a333c2e67f6d0f10743f](https://github.com/cockroachdb/cockroach/commits/8c603a116644b518ee79a333c2e67f6d0f10743f): ``` The test failed on branch=release-20.2, cloud=gce: test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/sqlsmith/setup=empty/setting=no-mutations/run_1 sqlsmith.go:110,sqlsmith.go:207,test_runner.go:755: pq: unrecognized configuration parameter "testing_vectorize_inject_panics" ``` <details><summary>More</summary><p> Artifacts: [/sqlsmith/setup=empty/setting=no-mutations](https://teamcity.cockroachdb.com/viewLog.html?buildId=2359532&tab=artifacts#/sqlsmith/setup=empty/setting=no-mutations) [See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Asqlsmith%2Fsetup%3Dempty%2Fsetting%3Dno-mutations.%2A&sort=title&restgroup=false&display=lastcommented+project) <sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
2.0
roachtest: sqlsmith/setup=empty/setting=no-mutations failed - [(roachtest).sqlsmith/setup=empty/setting=no-mutations failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2359532&tab=buildLog) on [release-20.2@8c603a116644b518ee79a333c2e67f6d0f10743f](https://github.com/cockroachdb/cockroach/commits/8c603a116644b518ee79a333c2e67f6d0f10743f): ``` The test failed on branch=release-20.2, cloud=gce: test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/sqlsmith/setup=empty/setting=no-mutations/run_1 sqlsmith.go:110,sqlsmith.go:207,test_runner.go:755: pq: unrecognized configuration parameter "testing_vectorize_inject_panics" ``` <details><summary>More</summary><p> Artifacts: [/sqlsmith/setup=empty/setting=no-mutations](https://teamcity.cockroachdb.com/viewLog.html?buildId=2359532&tab=artifacts#/sqlsmith/setup=empty/setting=no-mutations) [See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Asqlsmith%2Fsetup%3Dempty%2Fsetting%3Dno-mutations.%2A&sort=title&restgroup=false&display=lastcommented+project) <sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
non_process
roachtest sqlsmith setup empty setting no mutations failed on the test failed on branch release cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts sqlsmith setup empty setting no mutations run sqlsmith go sqlsmith go test runner go pq unrecognized configuration parameter testing vectorize inject panics more artifacts powered by
0
78,023
22,089,701,043
IssuesEvent
2022-06-01 04:16:35
spack/spack
https://api.github.com/repos/spack/spack
closed
kokkos@3.4.00 +sycl +tests: Kokkos_Setup_SYCL.hpp: 'CL/sycl.hpp' file not found
build-error e4s oneapi
### Steps to reproduce the issue `kokkos@3.4.00 +sycl +tests +debug std=17` is failing to build using: * `spack@develop` (372fc78e9869e36c54ac86bfd20e2447df0a8405 from `Tue Nov 2 19:37:23 2021 +0800`) * Ubuntu 20.04 * Intel OneAPI 2021.4.0 Concrete spec: [kokkos.spec.yaml.txt](https://github.com/spack/spack/files/7462312/kokkos.spec.yaml.txt) Concretization: ``` root@3073e3329c5a:/# spack spec -I kokkos@3.4.00 +sycl +tests +debug %oneapi ^cmake%gcc ^openssl%gcc Input spec -------------------------------- - kokkos@3.4.00%oneapi+debug+sycl+tests - ^cmake%gcc - ^openssl%gcc Concretized -------------------------------- - kokkos@3.4.00%oneapi@2021.4.0~aggressive_vectorization~compiler_warnings~cuda~cuda_constexpr~cuda_lambda~cuda_ldg_intrinsic~cuda_relocatable_device_code~cuda_uvm+debug~debug_bounds_check~debug_dualview_modify_check~deprecated_code~examples~explicit_instantiation~hpx~hpx_async_dispatch~hwloc~ipo~memkind~numactl~openmp~pic+profiling~profiling_load_print~pthread~qthread~rocm+serial+shared+sycl+tests~tuning~wrapper amdgpu_target=none build_type=RelWithDebInfo cuda_arch=none std=14 arch=linux-ubuntu20.04-cascadelake [+] ^cmake@3.21.4%gcc@11.1.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-ubuntu20.04-cascadelake [+] ^ncurses@6.2%gcc@11.1.0~symlinks+termlib abi=none arch=linux-ubuntu20.04-cascadelake [+] ^pkgconf@1.8.0%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^openssl@1.1.1l%gcc@11.1.0~docs certs=system arch=linux-ubuntu20.04-cascadelake [+] ^perl@5.34.0%gcc@11.1.0+cpanm+shared+threads arch=linux-ubuntu20.04-cascadelake [+] ^berkeley-db@18.1.40%gcc@11.1.0+cxx~docs+stl patches=b231fcc4d5cff05e5c3a4814f6a5af0e9a966428dc2176540d2c05aff41de522 arch=linux-ubuntu20.04-cascadelake [+] ^bzip2@1.0.8%gcc@11.1.0~debug~pic+shared arch=linux-ubuntu20.04-cascadelake [+] ^diffutils@3.8%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^libiconv@1.16%gcc@11.1.0 libs=shared,static arch=linux-ubuntu20.04-cascadelake [+] ^gdbm@1.19%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^readline@8.1%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^zlib@1.2.11%gcc@11.1.0+optimize+pic+shared arch=linux-ubuntu20.04-cascadelake ``` ``` ==> Installing kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz ==> No binary for kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz found: installing from source ==> Using cached archive: /spack/var/spack/cache/_source-cache/archive/2e/2e4438f9e4767442d8a55e65d000cc9cde92277d415ab4913a96cd3ad901d317.tar.gz ==> No patches needed for kokkos ==> kokkos: Executing phase: 'cmake' ==> kokkos: Executing phase: 'build' ==> Error: ProcessError: Command exited with status 2: 'make' '-j16' 25 errors found in build log: 95 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/src && /spack/lib/spack/env/oneapi/icpx -DKOKKOS_DEPENDE NCE -Dkokkoscore_EXPORTS -I/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/src -I/tmp/root/spack-stage/spac k-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src -I/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-bu ild-3jnklvt -O2 -g -DNDEBUG -fPIC -std=gnu++17 -MD -MT core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.cpp.o -MF CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.c pp.o.d -o CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.cpp.o -c /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/i mpl/Kokkos_Serial.cpp 96 In file 
included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_ExecPolicy.cpp:45: 97 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 98 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 99 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 100 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 101 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 102 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Core.cpp:45: 103 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 104 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 105 In file included from #include <CL/sycl.hpp>/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp 106 : ^~~~~~~~~~~~~ 107 109: 108 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 109 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 110 #include <CL/sycl.hpp> 111 ^~~~~~~~~~~~~ 112 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostBarrier.cpp:45: 113 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 114 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 115 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 116 #include <CL/sycl.hpp> 117 ^~~~~~~~~~~~~ 118 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_MemorySpace.cpp:50: 119 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_MemorySpace.hpp:53: 120 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 121 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 122 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 123 #include <CL/sycl.hpp> 124 ^~~~~~~~~~~~~ 125 In file 
included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostSpace_deepcopy.cpp:45: 126 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 127 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 128 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 129 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 130 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 131 #include <CL/sycl.hpp> 132 ^~~~~~~~~~~~~ 133 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_NumericTraits.cpp:1: 134 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_NumericTraits.hpp:48: 135 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 136 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 137 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 138 #include <CL/sycl.hpp> 139 ^~~~~~~~~~~~~ 140 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostSpace.cpp:45: 141 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 142 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 143 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 144 #include <CL/sycl.hpp> 145 ^~~~~~~~~~~~~ 146 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Profiling.cpp:45: 147 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 148 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 149 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 150 #include <CL/sycl.hpp> 151 ^~~~~~~~~~~~~ 152 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostThreadTeam.cpp:46: 153 In file included from 
/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 154 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 155 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 156 #include <CL/sycl.hpp> 157 ^~~~~~~~~~~~~ 158 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Serial.cpp:45: 159 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 160 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 161 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 162 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 163 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 164 #include <CL/sycl.hpp> 165 ^~~~~~~~~~~~~ 166 1 error generated. >> 167 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:219: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_NumericTraits.cpp.o] Error 1 168 make[2]: *** Waiting for unfinished jobs.... 169 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_MemoryPool.cpp:45: 170 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Error.hpp:50: 171 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 172 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 173 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 174 #include <CL/sycl.hpp> 175 ^~~~~~~~~~~~~ 176 1 error generated. >> 177 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:205: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemorySpace.cpp.o] Error 1 178 1 error generated. 
>> 179 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:191: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemoryPool.cpp.o] Error 1 180 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Error.cpp:53: 181 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Error.hpp:50: 182 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 183 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 184 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 185 #include <CL/sycl.hpp> 186 ^~~~~~~~~~~~~ 187 1 error generated. >> 188 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:107: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Error.cpp.o] Error 1 189 [ 4%] Linking CXX shared library libkokkosprinter-tool.so 190 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/unit_test && /spack/opt/spack/linux-ubuntu20.04-cascadel ake/gcc-11.1.0/cmake-3.21.4-yyah5puj2sv7t237i2r5iugsltd5uvn6/bin/cmake -E cmake_link_script CMakeFiles/kokkosprinter-tool.dir/link.txt --verbose=1 191 /spack/lib/spack/env/oneapi/icpx -fPIC -O2 -g -DNDEBUG -shared -Wl,-soname,libkokkosprinter-tool.so -o libkokkosprinter-tool.so CMakeFiles/kokkosprinter-tool.dir/too ls/printing-tool.cpp.o 192 1 error generated. >> 193 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:135: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostBarrier.cpp.o] Error 1 194 1 error generated. 195 1 error generated. 196 make[2]: Leaving directory '/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt' >> 197 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:233: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Profiling.cpp.o] Error 1 >> 198 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:149: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace.cpp.o] Error 1 199 [ 4%] Built target kokkosprinter-tool 200 1 error generated. >> 201 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:177: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostThreadTeam.cpp.o] Error 1 202 1 error generated. 203 1 error generated. >> 204 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:121: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_ExecPolicy.cpp.o] Error 1 >> 205 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:163: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace_deepcopy.cpp.o] Error 1 206 1 error generated. >> 207 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:247: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.cpp.o] Error 1 208 1 error generated. >> 209 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:93: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Core.cpp.o] Error 1 210 make[2]: Leaving directory '/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt' >> 211 make[1]: *** [CMakeFiles/Makefile2:1295: core/src/CMakeFiles/kokkoscore.dir/all] Error 2 212 make[1]: *** Waiting for unfinished jobs.... 
213 [ 4%] Linking CXX shared library libkokkosalgorithms_gtest.so 214 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/algorithms/unit_tests && /spack/opt/spack/linux-ubuntu20.04-c ascadelake/gcc-11.1.0/cmake-3.21.4-yyah5puj2sv7t237i2r5iugsltd5uvn6/bin/cmake -E cmake_link_script CMakeFiles/kokkosalgorithms_gtest.dir/link.txt --verbose=1 215 /spack/lib/spack/env/oneapi/icpx -fPIC -O2 -g -DNDEBUG -shared -Wl,-soname,libkokkosalgorithms_gtest.so -o libkokkosalgorithms_gtest.so CMakeFiles/kokkosalgorithms_g test.dir/__/__/tpls/gtest/gtest/gtest-all.cc.o 216 [ 5%] Linking CXX shared library libkokkos_gtest.so 217 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/unit_test && /spack/opt/spack/linux-ubuntu20.04-cascadel ake/gcc-11.1.0/cmake-3.21.4-yyah5puj2sv7t237i2r5iugsltd5uvn6/bin/cmake -E cmake_link_script CMakeFiles/kokkos_gtest.dir/link.txt --verbose=1 ``` sycl.hpp is here: ``` root@3073e3329c5a:/# find /opt/intel/ -type f -name sycl.hpp /opt/intel/oneapi/compiler/2021.4.0/linux/include/sycl/CL/sycl.hpp ``` ### Information on your system * **Spack:** 0.16.3-5185-372fc78e98 * **Python:** 3.8.10 * **Platform:** linux-ubuntu20.04-cascadelake * **Concretizer:** clingo ### Additional information [kokkos-3.4.00-build-out.txt](https://github.com/spack/spack/files/7462287/kokkos-3.4.00-build-out.txt) [kokkos-3.4.00-build-env.txt](https://github.com/spack/spack/files/7462297/kokkos-3.4.00-build-env.txt) @DavidPoliakoff @jciesko ### General information - [X] I have run `spack debug report` and reported the version of Spack/Python/Platform - [X] I have run `spack maintainers <name-of-the-package>` and **@mentioned** any maintainers - [X] I have uploaded the build log and environment files - [X] I have searched the issues of this repo and believe this is not a duplicate
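Editor's note on the build log above: the compile lines invoke `icpx` without `-fsycl`, and the reporter's `find` shows `sycl.hpp` under oneAPI's `include/sycl/CL/` directory, which icpx only puts on the include path in SYCL mode. A minimal sketch of one possible workaround in the Spack kokkos recipe, assuming the missing flag is indeed the cause (`flag_handler` and the `(flags, None, None)` injection convention are real Spack package APIs; tying the flag to the `+sycl` variant is this sketch's assumption, not the fix Spack eventually shipped):

```python
# Hypothetical fragment for the kokkos package.py: inject -fsycl so that
# icpx exposes the SYCL headers (CL/sycl.hpp) when +sycl is requested.
def flag_handler(self, name, flags):
    if "+sycl" in self.spec and name in ("cxxflags", "ldflags"):
        flags.append("-fsycl")  # enables icpx's SYCL include/link paths
    return (flags, None, None)  # inject via the compiler wrappers
```

An equivalent ad-hoc check outside Spack is compiling a one-line `#include <CL/sycl.hpp>` translation unit with `icpx -fsycl`; if that succeeds where the logged commands fail, the missing flag is confirmed as the root cause.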
1.0
kokkos@3.4.00 +sycl +tests: Kokkos_Setup_SYCL.hpp: 'CL/sycl.hpp' file e not found - ### Steps to reproduce the issue `kokkos@3.4.00 +sycl +tests +debug std=17` is failing to build using: * `spack@develop` (372fc78e9869e36c54ac86bfd20e2447df0a8405 from `Tue Nov 2 19:37:23 2021 +0800`) * Ubuntu 20.04 * Intel OneAPI 2021.4.0 Concrete spec: [kokkos.spec.yaml.txt](https://github.com/spack/spack/files/7462312/kokkos.spec.yaml.txt) Concretization: ``` root@3073e3329c5a:/# spack spec -I kokkos@3.4.00 +sycl +tests +debug %oneapi ^cmake%gcc ^openssl%gcc Input spec -------------------------------- - kokkos@3.4.00%oneapi+debug+sycl+tests - ^cmake%gcc - ^openssl%gcc Concretized -------------------------------- - kokkos@3.4.00%oneapi@2021.4.0~aggressive_vectorization~compiler_warnings~cuda~cuda_constexpr~cuda_lambda~cuda_ldg_intrinsic~cuda_relocatable_device_code~cuda_uvm+debug~debug_bounds_check~debug_dualview_modify_check~deprecated_code~examples~explicit_instantiation~hpx~hpx_async_dispatch~hwloc~ipo~memkind~numactl~openmp~pic+profiling~profiling_load_print~pthread~qthread~rocm+serial+shared+sycl+tests~tuning~wrapper amdgpu_target=none build_type=RelWithDebInfo cuda_arch=none std=14 arch=linux-ubuntu20.04-cascadelake [+] ^cmake@3.21.4%gcc@11.1.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-ubuntu20.04-cascadelake [+] ^ncurses@6.2%gcc@11.1.0~symlinks+termlib abi=none arch=linux-ubuntu20.04-cascadelake [+] ^pkgconf@1.8.0%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^openssl@1.1.1l%gcc@11.1.0~docs certs=system arch=linux-ubuntu20.04-cascadelake [+] ^perl@5.34.0%gcc@11.1.0+cpanm+shared+threads arch=linux-ubuntu20.04-cascadelake [+] ^berkeley-db@18.1.40%gcc@11.1.0+cxx~docs+stl patches=b231fcc4d5cff05e5c3a4814f6a5af0e9a966428dc2176540d2c05aff41de522 arch=linux-ubuntu20.04-cascadelake [+] ^bzip2@1.0.8%gcc@11.1.0~debug~pic+shared arch=linux-ubuntu20.04-cascadelake [+] ^diffutils@3.8%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^libiconv@1.16%gcc@11.1.0 libs=shared,static arch=linux-ubuntu20.04-cascadelake [+] ^gdbm@1.19%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^readline@8.1%gcc@11.1.0 arch=linux-ubuntu20.04-cascadelake [+] ^zlib@1.2.11%gcc@11.1.0+optimize+pic+shared arch=linux-ubuntu20.04-cascadelake ``` ``` ==> Installing kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz ==> No binary for kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz found: installing from source ==> Using cached archive: /spack/var/spack/cache/_source-cache/archive/2e/2e4438f9e4767442d8a55e65d000cc9cde92277d415ab4913a96cd3ad901d317.tar.gz ==> No patches needed for kokkos ==> kokkos: Executing phase: 'cmake' ==> kokkos: Executing phase: 'build' ==> Error: ProcessError: Command exited with status 2: 'make' '-j16' 25 errors found in build log: 95 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/src && /spack/lib/spack/env/oneapi/icpx -DKOKKOS_DEPENDE NCE -Dkokkoscore_EXPORTS -I/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/src -I/tmp/root/spack-stage/spac k-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src -I/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-bu ild-3jnklvt -O2 -g -DNDEBUG -fPIC -std=gnu++17 -MD -MT core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.cpp.o -MF CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.c pp.o.d -o CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.cpp.o -c 
/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/i mpl/Kokkos_Serial.cpp 96 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_ExecPolicy.cpp:45: 97 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 98 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 99 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 100 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 101 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 102 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Core.cpp:45: 103 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 104 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 105 In file included from #include <CL/sycl.hpp>/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp 106 : ^~~~~~~~~~~~~ 107 109: 108 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 109 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 110 #include <CL/sycl.hpp> 111 ^~~~~~~~~~~~~ 112 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostBarrier.cpp:45: 113 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 114 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 115 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 116 #include <CL/sycl.hpp> 117 ^~~~~~~~~~~~~ 118 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_MemorySpace.cpp:50: 119 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_MemorySpace.hpp:53: 120 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 121 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 122 
/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 123 #include <CL/sycl.hpp> 124 ^~~~~~~~~~~~~ 125 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostSpace_deepcopy.cpp:45: 126 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 127 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 128 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 129 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 130 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 131 #include <CL/sycl.hpp> 132 ^~~~~~~~~~~~~ 133 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_NumericTraits.cpp:1: 134 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_NumericTraits.hpp:48: 135 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 136 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 137 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 138 #include <CL/sycl.hpp> 139 ^~~~~~~~~~~~~ 140 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostSpace.cpp:45: 141 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 142 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 143 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 144 #include <CL/sycl.hpp> 145 ^~~~~~~~~~~~~ 146 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Profiling.cpp:45: 147 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 148 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 149 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' fil e not found 150 #include <CL/sycl.hpp> 151 ^~~~~~~~~~~~~ 152 In file included from 
/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_HostThreadTeam.cpp:46: 153 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 154 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 155 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' file not found 156 #include <CL/sycl.hpp> 157 ^~~~~~~~~~~~~ 158 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Serial.cpp:45: 159 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core.hpp:51: 160 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Core_fwd.hpp:52: 161 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 162 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 163 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' file not found 164 #include <CL/sycl.hpp> 165 ^~~~~~~~~~~~~ 166 1 error generated. >> 167 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:219: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_NumericTraits.cpp.o] Error 1 168 make[2]: *** Waiting for unfinished jobs.... 169 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_MemoryPool.cpp:45: 170 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Error.hpp:50: 171 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 172 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 173 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' file not found 174 #include <CL/sycl.hpp> 175 ^~~~~~~~~~~~~ 176 1 error generated. >> 177 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:205: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemorySpace.cpp.o] Error 1 178 1 error generated. 
>> 179 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:191: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_MemoryPool.cpp.o] Error 1 180 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Error.cpp:53: 181 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/impl/Kokkos_Error.hpp:50: 182 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/Kokkos_Macros.hpp:109: 183 In file included from /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/KokkosCore_Config_SetupBackend.hpp:47: >> 184 /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-src/core/src/setup/Kokkos_Setup_SYCL.hpp:48:10: fatal error: 'CL/sycl.hpp' file not found 185 #include <CL/sycl.hpp> 186 ^~~~~~~~~~~~~ 187 1 error generated. >> 188 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:107: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Error.cpp.o] Error 1 189 [ 4%] Linking CXX shared library libkokkosprinter-tool.so 190 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/unit_test && /spack/opt/spack/linux-ubuntu20.04-cascadelake/gcc-11.1.0/cmake-3.21.4-yyah5puj2sv7t237i2r5iugsltd5uvn6/bin/cmake -E cmake_link_script CMakeFiles/kokkosprinter-tool.dir/link.txt --verbose=1 191 /spack/lib/spack/env/oneapi/icpx -fPIC -O2 -g -DNDEBUG -shared -Wl,-soname,libkokkosprinter-tool.so -o libkokkosprinter-tool.so CMakeFiles/kokkosprinter-tool.dir/tools/printing-tool.cpp.o 192 1 error generated. >> 193 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:135: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostBarrier.cpp.o] Error 1 194 1 error generated. 195 1 error generated. 196 make[2]: Leaving directory '/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt' >> 197 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:233: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Profiling.cpp.o] Error 1 >> 198 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:149: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace.cpp.o] Error 1 199 [ 4%] Built target kokkosprinter-tool 200 1 error generated. >> 201 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:177: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostThreadTeam.cpp.o] Error 1 202 1 error generated. 203 1 error generated. >> 204 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:121: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_ExecPolicy.cpp.o] Error 1 >> 205 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:163: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_HostSpace_deepcopy.cpp.o] Error 1 206 1 error generated. >> 207 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:247: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Serial.cpp.o] Error 1 208 1 error generated. >> 209 make[2]: *** [core/src/CMakeFiles/kokkoscore.dir/build.make:93: core/src/CMakeFiles/kokkoscore.dir/impl/Kokkos_Core.cpp.o] Error 1 210 make[2]: Leaving directory '/tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt' >> 211 make[1]: *** [CMakeFiles/Makefile2:1295: core/src/CMakeFiles/kokkoscore.dir/all] Error 2 212 make[1]: *** Waiting for unfinished jobs.... 
213 [ 4%] Linking CXX shared library libkokkosalgorithms_gtest.so 214 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/algorithms/unit_tests && /spack/opt/spack/linux-ubuntu20.04-cascadelake/gcc-11.1.0/cmake-3.21.4-yyah5puj2sv7t237i2r5iugsltd5uvn6/bin/cmake -E cmake_link_script CMakeFiles/kokkosalgorithms_gtest.dir/link.txt --verbose=1 215 /spack/lib/spack/env/oneapi/icpx -fPIC -O2 -g -DNDEBUG -shared -Wl,-soname,libkokkosalgorithms_gtest.so -o libkokkosalgorithms_gtest.so CMakeFiles/kokkosalgorithms_gtest.dir/__/__/tpls/gtest/gtest/gtest-all.cc.o 216 [ 5%] Linking CXX shared library libkokkos_gtest.so 217 cd /tmp/root/spack-stage/spack-stage-kokkos-3.4.00-3jnklvtv4o22nw5af7dw4opgmg7daqhz/spack-build-3jnklvt/core/unit_test && /spack/opt/spack/linux-ubuntu20.04-cascadelake/gcc-11.1.0/cmake-3.21.4-yyah5puj2sv7t237i2r5iugsltd5uvn6/bin/cmake -E cmake_link_script CMakeFiles/kokkos_gtest.dir/link.txt --verbose=1 ``` sycl.hpp is here: ``` root@3073e3329c5a:/# find /opt/intel/ -type f -name sycl.hpp /opt/intel/oneapi/compiler/2021.4.0/linux/include/sycl/CL/sycl.hpp ``` ### Information on your system * **Spack:** 0.16.3-5185-372fc78e98 * **Python:** 3.8.10 * **Platform:** linux-ubuntu20.04-cascadelake * **Concretizer:** clingo ### Additional information [kokkos-3.4.00-build-out.txt](https://github.com/spack/spack/files/7462287/kokkos-3.4.00-build-out.txt) [kokkos-3.4.00-build-env.txt](https://github.com/spack/spack/files/7462297/kokkos-3.4.00-build-env.txt) @DavidPoliakoff @jciesko ### General information - [X] I have run `spack debug report` and reported the version of Spack/Python/Platform - [X] I have run `spack maintainers <name-of-the-package>` and **@mentioned** any maintainers - [X] I have uploaded the build log and environment files - [X] I have searched the issues of this repo and believe this is not a duplicate
non_process
kokkos sycl tests kokkos setup sycl hpp cl sycl hpp file not found steps to reproduce the issue kokkos sycl tests debug std is failing to build using spack develop from tue nov ubuntu intel oneapi concrete spec concretization root spack spec i kokkos sycl tests debug oneapi cmake gcc openssl gcc input spec kokkos oneapi debug sycl tests cmake gcc openssl gcc concretized kokkos oneapi aggressive vectorization compiler warnings cuda cuda constexpr cuda lambda cuda ldg intrinsic cuda relocatable device code cuda uvm debug debug bounds check debug dualview modify check deprecated code examples explicit instantiation hpx hpx async dispatch hwloc ipo memkind numactl openmp pic profiling profiling load print pthread qthread rocm serial shared sycl tests tuning wrapper amdgpu target none build type relwithdebinfo cuda arch none std arch linux cascadelake cmake gcc doc ncurses openssl ownlibs qt build type release arch linux cascadelake ncurses gcc symlinks termlib abi none arch linux cascadelake pkgconf gcc arch linux cascadelake openssl gcc docs certs system arch linux cascadelake perl gcc cpanm shared threads arch linux cascadelake berkeley db gcc cxx docs stl patches arch linux cascadelake gcc debug pic shared arch linux cascadelake diffutils gcc arch linux cascadelake libiconv gcc libs shared static arch linux cascadelake gdbm gcc arch linux cascadelake readline gcc arch linux cascadelake zlib gcc optimize pic shared arch linux cascadelake installing kokkos no binary for kokkos found installing from source using cached archive spack var spack cache source cache archive tar gz no patches needed for kokkos kokkos executing phase cmake kokkos executing phase build error processerror command exited with status make errors found in build log cd tmp root spack stage spack stage kokkos spack build core src spack lib spack env oneapi icpx dkokkos dependence dkokkoscore exports i tmp root spack stage spack stage kokkos spack build core src i tmp root spack stage spack stage kokkos spack src core src i tmp root spack stage spack stage kokkos spack build g dndebug fpic std gnu md mt core src cmakefiles kokkoscore dir impl kokkos serial cpp o mf cmakefiles kokkoscore dir impl kokkos serial cpp o d o cmakefiles kokkoscore dir impl kokkos serial cpp o c tmp root spack stage spack stage kokkos spack src core src impl kokkos serial cpp in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos execpolicy cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core fwd hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos core cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core fwd hpp in file included from include tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage 
kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos hostbarrier cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos memoryspace cpp in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos memoryspace hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos hostspace deepcopy cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core fwd hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos numerictraits cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos numerictraits hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos hostspace cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos profiling cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos hostthreadteam cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp 
tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos serial cpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos core fwd hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include error generated make error make waiting for unfinished jobs in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos memorypool cpp in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos error hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include error generated make error error generated make error in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos error cpp in file included from tmp root spack stage spack stage kokkos spack src core src impl kokkos error hpp in file included from tmp root spack stage spack stage kokkos spack src core src kokkos macros hpp in file included from tmp root spack stage spack stage kokkos spack build kokkoscore config setupbackend hpp tmp root spack stage spack stage kokkos spack src core src setup kokkos setup sycl hpp fatal error cl sycl hpp file not found include error generated make error linking cxx shared library libkokkosprinter tool so cd tmp root spack stage spack stage kokkos spack build core unit test spack opt spack linux cascadelake gcc cmake bin cmake e cmake link script cmakefiles kokkosprinter tool dir link txt verbose spack lib spack env oneapi icpx fpic g dndebug shared wl soname libkokkosprinter tool so o libkokkosprinter tool so cmakefiles kokkosprinter tool dir tools printing tool cpp o error generated make error error generated error generated make leaving directory tmp root spack stage spack stage kokkos spack build make error make error built target kokkosprinter tool error generated make error error generated error generated make error make error error generated make error error generated make error make leaving directory tmp root spack stage spack stage kokkos spack build make error make waiting for unfinished jobs linking cxx shared library libkokkosalgorithms gtest so cd tmp root spack stage spack stage kokkos spack build algorithms unit tests spack opt spack linux cascadelake gcc cmake bin cmake e cmake link script cmakefiles kokkosalgorithms gtest dir link txt verbose spack lib spack env oneapi icpx fpic g dndebug shared wl soname libkokkosalgorithms gtest so o libkokkosalgorithms gtest so cmakefiles kokkosalgorithms gtest dir tpls gtest gtest gtest all cc o linking cxx shared library libkokkos gtest so cd tmp root spack stage spack stage kokkos spack build core unit test spack opt spack linux cascadelake gcc cmake bin cmake e cmake link script cmakefiles 
kokkos gtest dir link txt verbose sycl hpp is here root find opt intel type f name sycl hpp opt intel oneapi compiler linux include sycl cl sycl hpp information on your system spack python platform linux cascadelake concretizer clingo additional information davidpoliakoff jciesko general information i have run spack debug report and reported the version of spack python platform i have run spack maintainers and mentioned any maintainers i have uploaded the build log and environment files i have searched the issues of this repo and believe this is not a duplicate
0
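Every record in this dump carries the same derived fields after the raw issue body: a combined field that is simply `title - body`, a lowercased text field with punctuation and digits stripped, and a binary label that is 1 when the label column reads `process` and 0 when it reads `non_process`. The sketch below reproduces that derivation in Python; the exact cleaning rules are inferred from the visible records rather than confirmed, so treat the regular expressions as assumptions.

```python
import re


def combine(title: str, body: str) -> str:
    # The combined field appears to be "<title> - <body>".
    return f"{title} - {body}"


def clean(text: str) -> str:
    # Inferred cleaning: lowercase, replace anything that is not a letter
    # with a space (digits and punctuation disappear), then collapse
    # runs of whitespace into single spaces.
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)
    return re.sub(r"\s+", " ", text).strip()


def binary_label(label: str) -> int:
    # The label column holds "process" or "non_process".
    return 1 if label == "process" else 0


print(clean(combine("Introspection engine --version argument", "It is available in other engines.")))
print(binary_label("non_process"))  # -> 0, as in the kokkos record above
```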
7,852
11,027,541,240
IssuesEvent
2019-12-06 09:41:07
prisma/prisma-engine
https://api.github.com/repos/prisma/prisma-engine
closed
Introspection engine --version argument
kind/improvement process/next-milestone
It is available in other engines and is very useful for pinpointing the binary version when debugging. ``` divyendusingh [test-introspection]$ ./migration-engine --version b8d90fea39d266b128b4d748db5aca8505bb1026 divyendusingh [test-introspection]$ ./introspection-engine --version ```
1.0
Introspection engine --version argument - It is available in other engines and is very useful for pinpointing the binary version when debugging. ``` divyendusingh [test-introspection]$ ./migration-engine --version b8d90fea39d266b128b4d748db5aca8505bb1026 divyendusingh [test-introspection]$ ./introspection-engine --version ```
process
introspection engine version argument it is available in other engines and is very useful for pinpointing the binary version when debugging divyendusingh migration engine version divyendusingh introspection engine version
1
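The request in the prisma record is the conventional `--version` entry point that the migration engine already exposes. As a generic illustration of the pattern (Python's argparse here, not the engines' actual Rust CLI; the version string is the commit hash quoted in the record):

```python
import argparse

# Commit hash taken from the migration-engine output quoted in the record.
ENGINE_VERSION = "b8d90fea39d266b128b4d748db5aca8505bb1026"

parser = argparse.ArgumentParser(prog="introspection-engine")
# action="version" prints the string and exits, matching the behavior
# the reporter gets from `./migration-engine --version`.
parser.add_argument("--version", action="version", version=ENGINE_VERSION)
parser.parse_args(["--version"])
```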
740,697
25,763,517,276
IssuesEvent
2022-12-08 22:54:58
CDCgov/prime-reportstream
https://api.github.com/repos/CDCgov/prime-reportstream
closed
Frontend Dependency scan for 04/19/2022
frontend experience dependencies Low Priority
## Overview This ticket is auto-generated by a Github Action that scans our frontend for any unused dependencies. Currently we are using the NPM package depcheck to generate the below output. <p> __Note__: To learn more about depcheck, please click [here](https://www.npmjs.com/package/depcheck) ___ ## Output ``` Unused dependencies * env-cmd * node-sass * rimraf Unused devDependencies * @typescript-eslint/eslint-plugin * eslint * npm-run-all * prettier * react-scripts * sass * typescript Missing dependencies * eslint-plugin-import: ./package.json * styles: ./src/global.scss * @jest/types: ./src/setupTests.ts ```
1.0
Frontend Dependency scan for 04/19/2022 - ## Overview This ticket is auto-generated by a Github Action that scans our frontend for any unused dependencies. Currently we are using the NPM package depcheck to generate the below output. <p> __Note__: To learn more about depcheck, please click [here](https://www.npmjs.com/package/depcheck) ___ ## Output ``` Unused dependencies * env-cmd * node-sass * rimraf Unused devDependencies * @typescript-eslint/eslint-plugin * eslint * npm-run-all * prettier * react-scripts * sass * typescript Missing dependencies * eslint-plugin-import: ./package.json * styles: ./src/global.scss * @jest/types: ./src/setupTests.ts ```
non_process
frontend dependency scan for overview this ticket is auto generated by a github action that scans our frontend for any unused dependencies currently we are using the npm package depcheck to generate the below output note to learn more about depcheck please click output unused dependencies env cmd node sass rimraf unused devdependencies typescript eslint eslint plugin eslint npm run all prettier react scripts sass typescript missing dependencies eslint plugin import package json styles src global scss jest types src setuptests ts
0
141,868
12,990,567,620
IssuesEvent
2020-07-23 00:27:20
udistrital/financiera_documentacion
https://api.github.com/repos/udistrital/financiera_documentacion
opened
Drafting of the Tax Calendar User Story
Documentation
The corresponding mockup for drafting the user stories (HU) is provided: [https://drive.google.com/file/d/1am3TjQENDWnVyOYWFXFJkYpyGoCOHiA-/view?usp=sharing](url)
1.0
Drafting of the Tax Calendar User Story - The corresponding mockup for drafting the user stories (HU) is provided: [https://drive.google.com/file/d/1am3TjQENDWnVyOYWFXFJkYpyGoCOHiA-/view?usp=sharing](url)
non_process
drafting of the tax calendar user story the corresponding mockup for drafting the user stories hu is provided url
0
59,948
14,676,996,287
IssuesEvent
2020-12-30 21:49:10
ziglang/zig
https://api.github.com/repos/ziglang/zig
closed
proposal: make default install directory same as dirname(build.zig)
breaking proposal zig build system
Right now, the default `zig-cache` directory is cwd(). Consider this: ``` $ zig build $ cd build; vim test.zig $ ../zig-cache/bin/zig build-exe test.zig $ vim ../src/main.zig $ zig build $ ../zig-cache/bin/zig build-exe test.zig ``` Now I am using the outdated version of the executable instead of the one in `./zig-cache/bin/zig`. This has happened to me more times than I can count. @FireFox317 and pjz on Discord also had this problem: ``` g_w1 : does anyone feel like the default behavior of zig-cache not being in the project root directory, but being in cwd() very confusing. I always have to allocate some brainpower to remember where I ran zig build from in order to run the right binary g_w1 : I think it should be in the same dir as build.zig pjz : I agree! pjz : if zig build can find the nearest build.zig, then it can put the zig-cache in the same directory pjz : I did a zig build from inside a subdir once and got a bit confused as there was now one zig-cache next to build.zig and one in the subdir FireFox317 : Yep same struggle here ``` I am proposing this for the whole build system, not just the compiler build system. Is there any reason for it to be in cwd() that I am not seeing?
1.0
proposal: make default install directory same as dirname(build.zig) - Right now, the default `zig-cache` directory is cwd(). Consider this: ``` $ zig build $ cd build; vim test.zig $ ../zig-cache/bin/zig build-exe test.zig $ vim ../src/main.zig $ zig build $ ../zig-cache/bin/zig build-exe test.zig ``` Now I am using the outdated version of the executable instead of the one in `./zig-cache/bin/zig`. This has happened to me more times than I can count. @FireFox317 and pjz on Discord also had this problem: ``` g_w1 : does anyone feel like the default behavior of zig-cache not being in the project root directory, but being in cwd() very confusing. I always have to allocate some brainpower to remember where I ran zig build from in order to run the right binary g_w1 : I think it should be in the same dir as build.zig pjz : I agree! pjz : if zig build can find the nearest build.zig, then it can put the zig-cache in the same directory pjz : I did a zig build from inside a subdir once and got a bit confused as there was now one zig-cache next to build.zig and one in the subdir FireFox317 : Yep same struggle here ``` I am proposing this for the whole build system, not just the compiler build system. Is there any reason for it to be in cwd() that I am not seeing?
non_process
proposal make default install directory same as dirname build zig right now the default zig cache directory is cwd consider this zig build cd build vim test zig zig cache bin zig build exe test zig vim src main zig zig build zig cache bin zig build exe test zig now i am using the outdated version of the executable instead of the one in zig cache bin zig this has happened to me more times than i can count and pjz on discord also had this problem g does anyone feel like the default behavior of zig cache not being in the project root directory but being in cwd very confusing i always have to allocate some brainpower to remember where i ran zig build from in order to run the right binary g i think it should be in the same dir as build zig pjz i agree pjz if zig build can find the nearest build zig then it can put the zig cache in the same directory pjz i did a zig build from inside a subdir once and got a bit confused as there was now one zig cache next to build zig and one in the subdir yep same struggle here i am proposing this for the whole build system not just the compiler build system is there any reason for it to be in cwd that i am not seeing
0
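The mechanism pjz sketches in the quoted chat, walking upward from the invoking directory until a build.zig is found and rooting the cache there, is straightforward to picture. The Python below is only an illustration of that lookup under the proposal, not how zig implements anything; the function name is invented for the example.

```python
from pathlib import Path


def find_build_root(start: Path) -> Path | None:
    """Walk from `start` up to the filesystem root looking for build.zig."""
    for candidate in (start, *start.parents):
        if (candidate / "build.zig").is_file():
            return candidate
    return None


root = find_build_root(Path.cwd())
if root is not None:
    # Under the proposal, artifacts would land next to build.zig
    # instead of under whatever directory the user happened to be in.
    print(f"cache would live at: {root / 'zig-cache'}")
```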
22,518
31,567,541,286
IssuesEvent
2023-09-04 00:52:28
tdwg/hc
https://api.github.com/repos/tdwg/hc
opened
New Term - totalAreaSampledInSquareKilometers
Term - add normative Process - under public review Class - Event
## New term * Submitter: Humboldt Extension Task Group * Efficacy Justification (why is this term necessary?): Part of a package of terms in support of biological inventory data. * Demand Justification (name at least two organizations that independently need this term): The Humboldt Extension Task Group proposing this term consists of numerous organizations. * Stability Justification (what concerns are there that this might affect existing implementations?): None * Implications for dwciri: namespace (does this change affect a dwciri term version)?: None Proposed attributes of the new term: * Term name (in lowerCamelCase for properties, UpperCamelCase for classes): totalAreaSampledInSquareKilometers * Term label (English, not normative): Total Area Sampled In Square Kilometers * Organized in Class (e.g., Occurrence, Event, Location, Taxon): Event * Definition of the term (normative): Total area surveyed during the dwc:Event in square kilometers. * Usage comments (recommendations regarding content, etc., not normative): This area is always less than or equal to the geospatialScopeAreaInSquareKilometers. * Examples (not normative): 0.8 * Refines (identifier of the broader term this term refines; normative): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term; normative): None * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG; not normative): not in ABCD
1.0
New Term - totalAreaSampledInSquareKilometers - ## New term * Submitter: Humboldt Extension Task Group * Efficacy Justification (why is this term necessary?): Part of a package of terms in support of biological inventory data. * Demand Justification (name at least two organizations that independently need this term): The Humboldt Extension Task Group proposing this term consists of numerous organizations. * Stability Justification (what concerns are there that this might affect existing implementations?): None * Implications for dwciri: namespace (does this change affect a dwciri term version)?: None Proposed attributes of the new term: * Term name (in lowerCamelCase for properties, UpperCamelCase for classes): totalAreaSampledInSquareKilometers * Term label (English, not normative): Total Area Sampled In Square Kilometers * Organized in Class (e.g., Occurrence, Event, Location, Taxon): Event * Definition of the term (normative): Total area surveyed during the dwc:Event in square kilometers. * Usage comments (recommendations regarding content, etc., not normative): This area is always less than or equal to the geospatialScopeAreaInSquareKilometers. * Examples (not normative): 0.8 * Refines (identifier of the broader term this term refines; normative): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term; normative): None * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG; not normative): not in ABCD
process
new term totalareasampledinsquarekilometers new term submitter humboldt extension task group efficacy justification why is this term necessary part of a package of terms in support of biological inventory data demand justification name at least two organizations that independently need this term the humboldt extension task group proposing this term consists of numerous organizations stability justification what concerns are there that this might affect existing implementations none implications for dwciri namespace does this change affect a dwciri term version none proposed attributes of the new term term name in lowercamelcase for properties uppercamelcase for classes totalareasampledinsquarekilometers term label english not normative total area sampled in square kilometers organized in class e g occurrence event location taxon event definition of the term normative total area surveyed during the dwc event in square kilometers usage comments recommendations regarding content etc not normative this area is always less than or equal to the geospatialscopeareainsquarekilometers examples not normative refines identifier of the broader term this term refines normative none replaces identifier of the existing term that would be deprecated and replaced by this term normative none abcd xpath of the equivalent term in abcd or efg not normative not in abcd
1
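The usage comment in the term proposal encodes an invariant: the area sampled during an event can never exceed the event's geospatial scope area. A minimal validation sketch, assuming records arrive as plain dictionaries keyed by the proposed term names:

```python
def validate_event(event: dict) -> list[str]:
    """Check the area invariant stated in the term's usage comments."""
    errors = []
    sampled = event.get("totalAreaSampledInSquareKilometers")
    scope = event.get("geospatialScopeAreaInSquareKilometers")
    if sampled is not None and sampled < 0:
        errors.append("sampled area must be non-negative")
    if sampled is not None and scope is not None and sampled > scope:
        errors.append("sampled area exceeds the geospatial scope area")
    return errors


# The 0.8 example from the proposal passes against a larger scope area.
print(validate_event({"totalAreaSampledInSquareKilometers": 0.8,
                      "geospatialScopeAreaInSquareKilometers": 1.2}))  # -> []
```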
250,989
18,921,416,926
IssuesEvent
2021-11-17 02:26:44
wakystuf/ESG-Mod
https://api.github.com/repos/wakystuf/ESG-Mod
opened
Lux Tweaks
balancing in progress needs documentation
1) Add the T1 effect to each of the FIDSI T2s 2) Increase Hydromiel and Proto-Orchid by 1, from +3 to +4 (but leave Void Stone and Giga Lattice at 2 and Ionic Crystal at 3) 3) Nerf Endless Foundries and Proto-Spores from 10% per SD to 7.5% per SD 4) Slightly reduce the amount of T2 required for each level of SD 5) Replace Vodyani food luxes with essence (but at lower numbers) and the percentage one with hero XP
1.0
Lux Tweaks - 1) Add the T1 effect to each of the FIDSI T2s 2) Increase Hydromiel and Proto-Orchid by 1, from +3 to +4 (but leave Void Stone and Giga Lattice at 2 and Ionic Crystal at 3) 3) Nerf Endless Foundries and Proto-Spores from 10% per SD to 7.5% per SD 4) Slightly reduce the amount of T2 required for each level of SD 5) Replace Vodyani food luxes with essence (but at lower numbers) and the percentage one with hero XP
non_process
lux tweaks add the effect to each of the fidsi increase hydromiel and proto orchid by from to but leave void stone and giga lattice at and ionic crystal at nerf endless foundries and proto spores from per sd to per sd slightly reduce the amount of required for each level of sd replace vodyani food luxes with essence but at lower numbers and the percentage one with hero xp
0
11,701
14,545,045,368
IssuesEvent
2020-12-15 19:03:53
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
ref/heads/master should be renamed to ref/heads/main
Pri3 devops-cicd-process/tech devops/prod doc-enhancement ready-to-doc
variable is called isMain but branch name is still master --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3f151218-9a11-0078-e038-f96198a76143 * Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9 * Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?WT.mc_id=ITOpsTalk-blog-nepeters&view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/conditions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
ref/heads/master should be renamed to ref/heads/main - variable is called isMain but branch name is still master --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3f151218-9a11-0078-e038-f96198a76143 * Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9 * Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?WT.mc_id=ITOpsTalk-blog-nepeters&view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/conditions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
ref heads master should be renamed to ref heads main variable is called ismain but branch name is still master document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
15,762
19,912,561,504
IssuesEvent
2022-01-25 18:42:46
MunchBit/MunchLove
https://api.github.com/repos/MunchBit/MunchLove
opened
Send Delivery Driver their collated salary after predefined/configurable number of days
feature Payment Process
**Title** Send Delivery Driver their collated salary after predefined/configurable number of days **Description** Send Delivery Driver their collated salary after predefined/configurable number of days. E.g. the money from their sales accrued over 3 days for restaurant X is transferred into their account
1.0
Send Delivery Driver their collated salary after predefined/configurable number of days - **Title** Send Delivery Driver their collated salary after predefined/configurable number of days **Description** Send Delivery Driver their collated salary after predefined/configurable number of days. E.g. the money from their sales accrued over 3 days for restaurant X is transferred into their account
process
send delivery driver their collated salary after predefined configurable number of days title send delivery driver their collated salary after predefined configurable number of days description send delivery driver their collated salary after predefined configurable number of days e g the money from their sales accrued over days for restaurant x is transferred into their account
1
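The MunchLove ticket describes a simple accrual rule: collect a driver's sales for a configurable number of days, then transfer the collated sum. A sketch of that rule follows; the data model and the payout trigger are invented for illustration, since the ticket specifies only the behavior.

```python
from datetime import date, timedelta

PAYOUT_INTERVAL_DAYS = 3  # configurable, per the ticket


def collate_payout(sales: list[tuple[date, float]], last_payout: date,
                   today: date) -> float | None:
    """Return the collated amount once the interval has elapsed, else None."""
    if today - last_payout < timedelta(days=PAYOUT_INTERVAL_DAYS):
        return None
    return sum(amount for day, amount in sales if last_payout < day <= today)


sales = [(date(2022, 1, 1), 40.0), (date(2022, 1, 2), 55.5), (date(2022, 1, 3), 30.0)]
print(collate_payout(sales, last_payout=date(2021, 12, 31), today=date(2022, 1, 3)))
# -> 125.5, the accrued amount transferred into the driver's account
```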
94,731
16,021,690,103
IssuesEvent
2021-04-21 01:02:51
TIBCOSoftware/lmi-jdbc
https://api.github.com/repos/TIBCOSoftware/lmi-jdbc
opened
CVE-2020-13956 (Medium) detected in httpclient-4.5.2.jar
security vulnerability
## CVE-2020-13956 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>httpclient-4.5.2.jar</b></p></summary> <p>Apache HttpComponents Client</p> <p>Path to dependency file: /lmi-jdbc/pom.xml</p> <p>Path to vulnerable library: 2/repository/org/apache/httpcomponents/httpclient/4.5.2/httpclient-4.5.2.jar</p> <p> Dependency Hierarchy: - :x: **httpclient-4.5.2.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache HttpClient versions prior to version 4.5.13 and 5.0.3 can misinterpret malformed authority component in request URIs passed to the library as java.net.URI object and pick the wrong target host for request execution. <p>Publish Date: 2020-12-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13956>CVE-2020-13956</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: org.apache.httpcomponents:httpclient:4.5.13;org.apache.httpcomponents:httpclient-osgi:4.5.13;org.apache.httpcomponents.client5:httpclient5:5.0.3;org.apache.httpcomponents.client5:httpclient5-osgi:5.0.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.httpcomponents","packageName":"httpclient","packageVersion":"4.5.2","packageFilePaths":["/lmi-jdbc/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.httpcomponents:httpclient:4.5.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.httpcomponents:httpclient:4.5.13;org.apache.httpcomponents:httpclient-osgi:4.5.13;org.apache.httpcomponents.client5:httpclient5:5.0.3;org.apache.httpcomponents.client5:httpclient5-osgi:5.0.3"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-13956","vulnerabilityDetails":"Apache HttpClient versions prior to version 4.5.13 and 5.0.3 can misinterpret malformed authority component in request URIs passed to the library as java.net.URI object and pick the wrong target host for request execution.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13956","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-13956 (Medium) detected in httpclient-4.5.2.jar - ## CVE-2020-13956 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>httpclient-4.5.2.jar</b></p></summary> <p>Apache HttpComponents Client</p> <p>Path to dependency file: /lmi-jdbc/pom.xml</p> <p>Path to vulnerable library: 2/repository/org/apache/httpcomponents/httpclient/4.5.2/httpclient-4.5.2.jar</p> <p> Dependency Hierarchy: - :x: **httpclient-4.5.2.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache HttpClient versions prior to version 4.5.13 and 5.0.3 can misinterpret malformed authority component in request URIs passed to the library as java.net.URI object and pick the wrong target host for request execution. <p>Publish Date: 2020-12-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13956>CVE-2020-13956</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: org.apache.httpcomponents:httpclient:4.5.13;org.apache.httpcomponents:httpclient-osgi:4.5.13;org.apache.httpcomponents.client5:httpclient5:5.0.3;org.apache.httpcomponents.client5:httpclient5-osgi:5.0.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.httpcomponents","packageName":"httpclient","packageVersion":"4.5.2","packageFilePaths":["/lmi-jdbc/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.httpcomponents:httpclient:4.5.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.httpcomponents:httpclient:4.5.13;org.apache.httpcomponents:httpclient-osgi:4.5.13;org.apache.httpcomponents.client5:httpclient5:5.0.3;org.apache.httpcomponents.client5:httpclient5-osgi:5.0.3"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-13956","vulnerabilityDetails":"Apache HttpClient versions prior to version 4.5.13 and 5.0.3 can misinterpret malformed authority component in request URIs passed to the library as java.net.URI object and pick the wrong target host for request execution.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13956","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in httpclient jar cve medium severity vulnerability vulnerable library httpclient jar apache httpcomponents client path to dependency file lmi jdbc pom xml path to vulnerable library repository org apache httpcomponents httpclient httpclient jar dependency hierarchy x httpclient jar vulnerable library vulnerability details apache httpclient versions prior to version and can misinterpret malformed authority component in request uris passed to the library as java net uri object and pick the wrong target host for request execution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache httpcomponents httpclient org apache httpcomponents httpclient osgi org apache httpcomponents org apache httpcomponents osgi isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree org apache httpcomponents httpclient isminimumfixversionavailable true minimumfixversion org apache httpcomponents httpclient org apache httpcomponents httpclient osgi org apache httpcomponents org apache httpcomponents osgi basebranches vulnerabilityidentifier cve vulnerabilitydetails apache httpclient versions prior to version and can misinterpret malformed authority component in request uris passed to the library as java net uri object and pick the wrong target host for request execution vulnerabilityurl
0
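The 5.3 base score in the whitesource record is reproducible from the CVSS 3 metrics it lists. The sketch below applies the published CVSS v3 base-score formula for an unchanged scope; the weight tables are abbreviated to just the values this particular vector (AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:L/A:N) needs.

```python
import math

# CVSS v3 weights, abbreviated to the values used by this record.
AV = {"Network": 0.85}
AC = {"Low": 0.77}
PR = {"None": 0.85}  # weight for privileges-required with scope unchanged
UI = {"None": 0.85}
CIA = {"None": 0.0, "Low": 0.22, "High": 0.56}


def base_score(av, ac, pr, ui, c, i, a):
    isc_base = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * isc_base  # scope-unchanged impact sub-score
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    # CVSS rounds the capped sum up to one decimal place.
    return math.ceil(min(impact + exploitability, 10) * 10) / 10


print(base_score("Network", "Low", "None", "None", "None", "Low", "None"))  # 5.3
```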
98,455
20,734,626,937
IssuesEvent
2022-03-14 12:40:02
jvegax/Acme-Toolkits
https://api.github.com/repos/jvegax/Acme-Toolkits
opened
Task-056 : Produce a test suite for your project
code 🧑‍💻
Each member must write the tests associated with the functionality they implemented.
1.0
Task-056 : Produce a test suite for your project - Each member must write the tests associated with the functionality they implemented.
non_process
task produce a test suite for your project each member must write the tests associated with the functionality they implemented
0
19,374
25,501,261,199
IssuesEvent
2022-11-28 04:31:14
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
[Mirror] Archives for rules_go 0.36.0
P2 type: process team-OSS mirror request
### Please list the URLs of the archives you'd like to mirror: https://github.com/golang/sys/archive/refs/tags/v0.2.0.zip https://github.com/googleapis/go-genproto/archive/16455021b5e60501e2adf67e15f857d2f5d95388.zip https://github.com/bazelbuild/rules_go/releases/download/v0.36.0/rules_go-v0.36.0.zip
1.0
[Mirror] Archives for rules_go 0.36.0 - ### Please list the URLs of the archives you'd like to mirror: https://github.com/golang/sys/archive/refs/tags/v0.2.0.zip https://github.com/googleapis/go-genproto/archive/16455021b5e60501e2adf67e15f857d2f5d95388.zip https://github.com/bazelbuild/rules_go/releases/download/v0.36.0/rules_go-v0.36.0.zip
process
archives for rules go please list the urls of the archives you d like to mirror
1
6,550
9,638,063,162
IssuesEvent
2019-05-16 10:12:32
allinurl/goaccess
https://api.github.com/repos/allinurl/goaccess
closed
Config file: No config file used
command-line options log-processing question
Hello, I'm sorry, I'm a complete newbie at coding and server administration, but I really need to get GoAccess on my website. I've successfully installed GoAccess on my server, but when I try to set up the log format with the command `# goaccess access.log -c` I get this answer: ``` GoAccess - version 1.3 - Nov 22 2018 23:57:27 Config file: No config file used Fatal error has occurred Error occurred at: src/parser.c - read_log - 2728 Unable to open the specified log file. No such file or directory ``` I've tried to find any files associated with "access.log" and "goaccess" via ``` # locate -i access.log # locate -i goaccess ``` which gave me no files with such names... What did I do wrong?
1.0
Config file: No config file used - Hello, I'm sorry, I'm a complete newbie at coding and server administration, but I really need to get GoAccess on my website. I've successfully installed GoAccess on my server, but when I try to set up the log format with the command `# goaccess access.log -c` I get this answer: ``` GoAccess - version 1.3 - Nov 22 2018 23:57:27 Config file: No config file used Fatal error has occurred Error occurred at: src/parser.c - read_log - 2728 Unable to open the specified log file. No such file or directory ``` I've tried to find any files associated with "access.log" and "goaccess" via ``` # locate -i access.log # locate -i goaccess ``` which gave me no files with such names... What did I do wrong?
process
config file no config file used hello i m sorry i m a complete newbie at coding and server administration but i really need to get goaccess on my website i ve successfully installed goaccess on my server but when i try to set up the log format with the command goaccess access log c i get this answer goaccess version nov config file no config file used fatal error has occurred error occurred at src parser c read log unable to open the specified log file no such file or directory i ve tried to find any files associated with access log and goaccess via locate i access log locate i goaccess which gave me no files with such names what did i do wrong
1
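The fatal error in the goaccess record is a path problem rather than a configuration one: `access.log` is resolved relative to the directory the command is run from, and no such file exists there. A small pre-flight check of the kind that avoids the error is sketched below; the candidate locations are common server defaults and are assumed, not taken from the issue.

```python
from pathlib import Path

CANDIDATES = [
    Path("access.log"),                   # what the reporter tried, cwd-relative
    Path("/var/log/nginx/access.log"),    # common nginx default
    Path("/var/log/apache2/access.log"),  # common Apache default on Debian/Ubuntu
]

log = next((p for p in CANDIDATES if p.is_file()), None)
if log is None:
    raise SystemExit("no access.log found; pass the web server's log path explicitly")
print(f"run: goaccess {log} -c")
```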
17,349
23,173,027,267
IssuesEvent
2022-07-31 01:57:07
lynnandtonic/nestflix.fun
https://api.github.com/repos/lynnandtonic/nestflix.fun
closed
Add Saving the Last Soldier from “Hello, Me!” (Screenshots added)
suggested title in process
Please add as much of the following info as you can: Title: Saving the Last Soldier Type (film/tv show): film - war drama (foreign: Korean) Film or show in which it appears: Hello, Me! Is the parent film/show streaming anywhere? Yes - Netflix About when in the parent film/show does it appear? about 20 minutes into the first episode Actual footage of the film/show can be seen (yes/no)? Yes Cast: Anthony
1.0
Add Saving the Last Soldier from “Hello, Me!” (Screenshots added) - Please add as much of the following info as you can: Title: Saving the Last Soldier Type (film/tv show): film - war drama (foreign: Korean) Film or show in which it appears: Hello, Me! Is the parent film/show streaming anywhere? Yes - Netflix About when in the parent film/show does it appear? about 20 minutes into the first episode Actual footage of the film/show can be seen (yes/no)? Yes Cast: Anthony
process
add saving the last soldier from “hello me ” screenshots added please add as much of the following info as you can title saving the last soldier type film tv show film war drama foreign korean film or show in which it appears hello me is the parent film show streaming anywhere yes netflix about when in the parent film show does it appear about minutes into the first episode actual footage of the film show can be seen yes no yes cast anthony
1