| Column | Dtype / kind | Summary |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 – 19 |
| repo | stringlengths | 7 – 112 |
| repo_url | stringlengths | 36 – 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 – 744 |
| labels | stringlengths | 4 – 574 |
| body | stringlengths | 9 – 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 – 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 – 188k |
| binary_label | int64 | 0 – 1 |
Unnamed: 0: 13,922
id: 16,679,257,670
type: IssuesEvent
created_at: 2021-06-07 20:35:16
repo: Leviatan-Analytics/LA-data-processing
repo_url: https://api.github.com/repos/Leviatan-Analytics/LA-data-processing
action: closed
title: Test open rofl file with League of Legends [2]
labels: Data Processing Sprint 2 Week 2
body:
Research alternatives for opening league of legends replay files. Output: Document including the research done on how to open this type of file.
index: 1.0
text_combine:
Test open rofl file with League of Legends [2] - Research alternatives for opening league of legends replay files. Output: Document including the research done on how to open this type of file.
label: process
text:
test open rofl file with league of legends research alternatives for opening league of legends replay files output document including the research done on how to open this type of file
binary_label: 1
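Comparing `text_combine` with `text` in the row above, the `text` field appears to be a lowercased copy with URLs, digits, and punctuation stripped and whitespace collapsed. A plausible reconstruction of that cleaning step; the exact pipeline is an assumption, and handling of non-ASCII symbols (e.g. emoji kept elsewhere in the dataset) may differ:

```python
import re

def clean_text(s: str) -> str:
    """Approximate the dataset's text normalization (assumed pipeline)."""
    s = s.lower()
    s = re.sub(r"https?://\S+", " ", s)    # drop URLs
    s = re.sub(r"[^a-z\s]", " ", s)        # drop digits and punctuation
    return re.sub(r"\s+", " ", s).strip()  # collapse whitespace runs

print(clean_text("Test open rofl file with League of Legends [2] - Research alternatives"))
# → "test open rofl file with league of legends research alternatives"
```

Applied to this row's title, the output matches the opening of its `text` field, which is what suggests this transformation.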
Unnamed: 0: 36,144
id: 2,795,869,777
type: IssuesEvent
created_at: 2015-05-12 01:24:58
repo: TypeStrong/atom-typescript
repo_url: https://api.github.com/repos/TypeStrong/atom-typescript
action: reopened
title: Renaming with "compile on save" affects focus
labels: priority:high up-for-grabs
body:
GIVEN the editor has unsaved changes, AND the "compile on save" option is on, AND the rename dialog appears THEN the focus is moved to the dialog briefly until it moves back to the editor (which is in the background). ![Example](http://i.imgur.com/iP0cEYg.gif) Atom version: 0.190 atom-typescript version: 2.15.1
index: 1.0
text_combine:
Renaming with "compile on save" affects focus - GIVEN the editor has unsaved changes, AND the "compile on save" option is on, AND the rename dialog appears THEN the focus is moved to the dialog briefly until it moves back to the editor (which is in the background). ![Example](http://i.imgur.com/iP0cEYg.gif) Atom version: 0.190 atom-typescript version: 2.15.1
label: non_process
text:
renaming with compile on save affects focus given the editor has unsaved changes and the compile on save option is on and the rename dialog appears then the focus is moved to the dialog briefly until it moves back to the editor which is in the background atom version atom typescript version
binary_label: 0
Unnamed: 0: 10,273
id: 13,127,903,186
type: IssuesEvent
created_at: 2020-08-06 11:16:30
repo: keep-network/keep-core
repo_url: https://api.github.com/repos/keep-network/keep-core
action: closed
title: Remove redundant Address property in [ethereum.account] config section
labels: process & client team 📟 client
body:
Beacon client expects `Address` property to be present in `[ethereum.account]` config section. This information is redundant because we can read address from the key file as we do in `keep-ecdsa`. We should remove `Address` property altogether and instead of having: ``` [ethereum.account] Address = "0x65ea55c1f10491038425725dc00dffeab2a1e28a" KeyFile = "/Users/piotr/ethereum/data/keystore/UTC--2018-11-01T06-24-04.418691008Z--65ea55c1f10491038425725dc00dffeab2a1e28a" ``` Have only: ``` [ethereum.account] KeyFile = "/Users/piotr/ethereum/data/keystore/UTC--2018-11-01T06-24-04.418691008Z--65ea55c1f10491038425725dc00dffeab2a1e28a" ```
index: 1.0
text_combine:
Remove redundant Address property in [ethereum.account] config section - Beacon client expects `Address` property to be present in `[ethereum.account]` config section. This information is redundant because we can read address from the key file as we do in `keep-ecdsa`. We should remove `Address` property altogether and instead of having: ``` [ethereum.account] Address = "0x65ea55c1f10491038425725dc00dffeab2a1e28a" KeyFile = "/Users/piotr/ethereum/data/keystore/UTC--2018-11-01T06-24-04.418691008Z--65ea55c1f10491038425725dc00dffeab2a1e28a" ``` Have only: ``` [ethereum.account] KeyFile = "/Users/piotr/ethereum/data/keystore/UTC--2018-11-01T06-24-04.418691008Z--65ea55c1f10491038425725dc00dffeab2a1e28a" ```
label: process
text:
remove redundant address property in config section beacon client expects address property to be present in config section this information is redundant because we can read address from the key file as we do in keep ecdsa we should remove address property altogether and instead of having address keyfile users piotr ethereum data keystore utc have only keyfile users piotr ethereum data keystore utc
binary_label: 1
Unnamed: 0: 21,057
id: 28,005,674,679
type: IssuesEvent
created_at: 2023-03-27 15:07:16
repo: oxidecomputer/hubris
repo_url: https://api.github.com/repos/oxidecomputer/hubris
action: closed
title: update over management network should interlock against image for incorrect board
labels: service processor robustness gimlet
body:
We're expecting a mixed population of system board revisions in test environments shortly, and there is currently (to my knowledge) no protection against using the update server over the management network to install software for board revision A into a board of revision B. The results of such an operation are presumably, at minimum, a trip to the bench for external reprogramming; and at worst, some physically destructive event that would require board rework to revive the system. @jgallagher suggests we could put an interlock into the update server task (or possibly the control plane agent) which checks the caboose of the incoming image and rejects it if it is not right for this board. See also the internal lab ticket about this: oxidecomputer/meta#140.
index: 1.0
text_combine:
update over management network should interlock against image for incorrect board - We're expecting a mixed population of system board revisions in test environments shortly, and there is currently (to my knowledge) no protection against using the update server over the management network to install software for board revision A into a board of revision B. The results of such an operation are presumably, at minimum, a trip to the bench for external reprogramming; and at worst, some physically destructive event that would require board rework to revive the system. @jgallagher suggests we could put an interlock into the update server task (or possibly the control plane agent) which checks the caboose of the incoming image and rejects it if it is not right for this board. See also the internal lab ticket about this: oxidecomputer/meta#140.
label: process
text:
update over management network should interlock against image for incorrect board we re expecting a mixed population of system board revisions in test environments shortly and there is currently to my knowledge no protection against using the update server over the management network to install software for board revision a into a board of revision b the results of such an operation are presumably at minimum a trip to the bench for external reprogramming and at worst some physically destructive event that would require board rework to revive the system jgallagher suggests we could put an interlock into the update server task or possibly the control plane agent which checks the caboose of the incoming image and rejects it if it is not right for this board see also the internal lab ticket about this oxidecomputer meta
binary_label: 1
Unnamed: 0: 67,517
id: 3,274,746,219
type: IssuesEvent
created_at: 2015-10-26 12:39:51
repo: YetiForceCompany/YetiForceCRM
repo_url: https://api.github.com/repos/YetiForceCompany/YetiForceCRM
action: closed
title: [Question] Looking for possibility to link doc with service contract
labels: Label::MoreInfoRequired Priority::#2 Normal Type::VerificationRequired
body:
Shortly speaking I would like to have possibility to link document upload to the system with record of service contract. Mean that when create service contract and sign physical doc with my customer I would like to have link between this record and scanned and upload signed contract... Is it possible? Thanks in advance
index: 1.0
text_combine:
[Question] Looking for possibility to link doc with service contract - Shortly speaking I would like to have possibility to link document upload to the system with record of service contract. Mean that when create service contract and sign physical doc with my customer I would like to have link between this record and scanned and upload signed contract... Is it possible? Thanks in advance
label: non_process
text:
looking for possibility to link doc with service contract shortly speaking i would like to have possibility to link document upload to the system with record of service contract mean that when create service contract and sign physical doc with my customer i would like to have link between this record and scanned and upload signed contract is it possible thanks in advance
binary_label: 0
Unnamed: 0: 20,421
id: 27,081,807,252
type: IssuesEvent
created_at: 2023-02-14 14:30:30
repo: barrycumbie/studious-engine-feb2023-githubPd
repo_url: https://api.github.com/repos/barrycumbie/studious-engine-feb2023-githubPd
action: closed
title: start here on day two
labels: process
body:
let's do - a recap... with a gist! (I ❤️ gist) - better yet, gist to our REPO (wiki)
index: 1.0
text_combine:
start here on day two - let's do - a recap... with a gist! (I ❤️ gist) - better yet, gist to our REPO (wiki)
label: process
text:
start here on day two let s do a recap with a gist i ❤️ gist better yet gist to our repo wiki
binary_label: 1
Unnamed: 0: 99,713
id: 30,539,596,208
type: IssuesEvent
created_at: 2023-07-19 20:11:30
repo: DeiseCAPBarbosa/gha-build-homolog
repo_url: https://api.github.com/repos/DeiseCAPBarbosa/gha-build-homolog
action: closed
title: Build Homolog feat help + about
labels: build-homolog
body:
## Description Realiza deploy automatizado da aplicação. ## Environments environment_1 ## Branches feat/help feat/about
index: 1.0
text_combine:
Build Homolog feat help + about - ## Description Realiza deploy automatizado da aplicação. ## Environments environment_1 ## Branches feat/help feat/about
label: non_process
text:
build homolog feat help about description realiza deploy automatizado da aplicação environments environment branches feat help feat about
binary_label: 0
Unnamed: 0: 4,813
id: 7,700,773,055
type: IssuesEvent
created_at: 2018-05-20 06:48:00
repo: nodejs/node
repo_url: https://api.github.com/repos/nodejs/node
action: closed
title: parallel/test-child-process-exec-timeout not cleaning up child process on win7
labels: child_process test windows
body:
<!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: Seen on 6.9.4 * **Platform**: Windows (seen on win7) * **Subsystem**: Test <!-- Enter your issue details below this comment. --> The test parallel/test-child-process-exec-timeout is not always cleaning up its' child processes. When I run the test 10 times (tools/test.py parallel/test-process-child-exec-timeout), I get around 4-5 left over child processes. Occasionally when the processes are left lying around, the parallel test suite 'hangs' on the 1157th test in the parallel test suite. The test does not timeout with the test runner in this instance - it will hang permanently and the test suite will never complete. /cc @cjihrig @gibfahn <img width="1213" alt="screen shot 2017-01-28 at 13 37 17" src="https://cloud.githubusercontent.com/assets/8297234/22397009/f288da28-e55e-11e6-9a46-8cb16fac300f.png">
index: 1.0
text_combine:
parallel/test-child-process-exec-timeout not cleaning up child process on win7 - <!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: Seen on 6.9.4 * **Platform**: Windows (seen on win7) * **Subsystem**: Test <!-- Enter your issue details below this comment. --> The test parallel/test-child-process-exec-timeout is not always cleaning up its' child processes. When I run the test 10 times (tools/test.py parallel/test-process-child-exec-timeout), I get around 4-5 left over child processes. Occasionally when the processes are left lying around, the parallel test suite 'hangs' on the 1157th test in the parallel test suite. The test does not timeout with the test runner in this instance - it will hang permanently and the test suite will never complete. /cc @cjihrig @gibfahn <img width="1213" alt="screen shot 2017-01-28 at 13 37 17" src="https://cloud.githubusercontent.com/assets/8297234/22397009/f288da28-e55e-11e6-9a46-8cb16fac300f.png">
label: process
text:
parallel test child process exec timeout not cleaning up child process on thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version seen on platform windows seen on subsystem test the test parallel test child process exec timeout is not always cleaning up its child processes when i run the test times tools test py parallel test process child exec timeout i get around left over child processes occasionally when the processes are left lying around the parallel test suite hangs on the test in the parallel test suite the test does not timeout with the test runner in this instance it will hang permanently and the test suite will never complete cc cjihrig gibfahn img width alt screen shot at src
binary_label: 1
Unnamed: 0: 7,352
id: 10,483,219,182
type: IssuesEvent
created_at: 2019-09-24 13:37:49
repo: Hurence/logisland
repo_url: https://api.github.com/repos/Hurence/logisland
action: opened
title: add OpenFAAS support
labels: feature processor
body:
- controllerService: faas_service component: com.hurence.logisland.service.faas configuration: faas.server: http://myfaas.org # add field if defect found - processor: defect_tagger component: com.hurence.logisland.processor.FAASExecute configuration: fass.service: faas_service fass.function: fr.cetim.spidetp.iot.functions.HasDefect fass.param.field: record_value fass.response.field: has_defect strategy: ....
index: 1.0
text_combine:
add OpenFAAS support - - controllerService: faas_service component: com.hurence.logisland.service.faas configuration: faas.server: http://myfaas.org # add field if defect found - processor: defect_tagger component: com.hurence.logisland.processor.FAASExecute configuration: fass.service: faas_service fass.function: fr.cetim.spidetp.iot.functions.HasDefect fass.param.field: record_value fass.response.field: has_defect strategy: ....
label: process
text:
add openfaas support controllerservice faas service component com hurence logisland service faas configuration faas server add field if defect found processor defect tagger component com hurence logisland processor faasexecute configuration fass service faas service fass function fr cetim spidetp iot functions hasdefect fass param field record value fass response field has defect strategy
binary_label: 1
Unnamed: 0: 16,904
id: 22,216,225,972
type: IssuesEvent
created_at: 2022-06-08 02:08:38
repo: qgis/QGIS
repo_url: https://api.github.com/repos/qgis/QGIS
action: closed
title: Vector grid tool - result with wrong extent
labels: Raster Processing Bug Vectors
body:
### What is the bug or the crash? Using Vector>Research tools>Create grid results in a grid with an incorrect extent when this extent is given manually. See steps below. ### Steps to reproduce the issue Step 1 -Create a random raster with the following extent 372783.6000,372830.4000,7026751.5000,7029915.6000 [EPSG:32618] ![image](https://user-images.githubusercontent.com/15986881/172176813-a6c907de-3db4-4822-b068-8e685cfa2099.png) Step 2 - Try to create a vector grid with cell size larger than the extent The extent considered is obtained with "Calculate from layer" and selecting the previous raster ![image](https://user-images.githubusercontent.com/15986881/172177087-62f75b31-b7fa-45e9-a722-060c20eaf990.png) It does not work (but I think that's a correct behavior). ![image](https://user-images.githubusercontent.com/15986881/172177231-1850f428-2dcf-4cf5-9ff4-98925cb29130.png) Step 3 - Then change **manually** the extent of the requested grid by at least the width of the cells. ![image](https://user-images.githubusercontent.com/15986881/172178239-4b6d1a8b-419e-4848-af0f-37739d5a93d3.png) Success! The grid is created. Step 4 - Problem However, if you zoom in the upper left border of the grid, you will see that it doesn't fit to the raster border ... ![image](https://user-images.githubusercontent.com/15986881/172179220-b472767e-d1bf-416b-9d25-df4244a4ae17.png) ... and when you check the extent of the grid, you will see that their values correspond to the rounded values of the original extent (the one manually modified. i.e 372783.6000,373080.4000,7026751.5000,7029915.6000 [EPSG:32618]). ![image](https://user-images.githubusercontent.com/15986881/172179640-cd3acf81-45ae-47be-89d9-9b3921518bed.png) Step 5 - Work around Create a copy of the original raster (save raster as ...) 
but increasing the extent of the output file to be larger than the grid cell ![image](https://user-images.githubusercontent.com/15986881/172182395-bd7319ff-57bb-4ca6-94ba-cd79913be235.png) Create the grid on this new raster, obtaining the extent with "Calculate from layer" ![image](https://user-images.githubusercontent.com/15986881/172183186-b35d1882-179d-41c0-a772-cd47875d10bf.png) This time, no problem to create the grid and it fits the border of the raster (see hashed polygon vs the previous grid vs the raster) ![image](https://user-images.githubusercontent.com/15986881/172183783-5f1de8c2-5b6d-49b0-814c-a2891736c61d.png) ### Versions <!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body> QGIS version | 3.24.3-Tisler | QGIS code revision | cf22b74e -- | -- | -- | -- Qt version | 5.15.3 Python version | 3.9.5 GDAL/OGR version | 3.4.3 PROJ version | 9.0.0 EPSG Registry database version | v10.054 (2022-02-13) GEOS version | 3.10.2-CAPI-1.16.0 SQLite version | 3.38.1 PDAL version | 2.3.0 PostgreSQL client version | unknown SpatiaLite version | 5.0.1 QWT version | 6.1.6 QScintilla2 version | 2.13.1 OS version | Windows 10 Version 1909   |   |   |   Active Python plugins db_manager | 0.1.20 grassprovider | 2.12.99 processing | 2.12.99 sagaprovider | 2.12.99 </body></html><!--EndFragment-->QGIS version 3.24.3-Tisler QGIS code revision [cf22b74e](https://github.com/qgis/QGIS/commit/cf22b74e) Qt version 5.15.3 Python version 3.9.5 GDAL/OGR version 3.4.3 PROJ version 9.0.0 EPSG Registry database version v10.054 (2022-02-13) GEOS version 3.10.2-CAPI-1.16.0 SQLite version 3.38.1 PDAL version 2.3.0 PostgreSQL client version unknown SpatiaLite version 5.0.1 QWT version 6.1.6 QScintilla2 version 2.13.1 OS version Windows 10 Version 1909 Active Python plugins 
db_manager 0.1.20 grassprovider 2.12.99 processing 2.12.99 sagaprovider 2.12.99 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [ ] I tried with a new QGIS profile ### Additional context _No response_
index: 1.0
text_combine:
Vector grid tool - result with wrong extent - ### What is the bug or the crash? Using Vector>Research tools>Create grid results in a grid with an incorrect extent when this extent is given manually. See steps below. ### Steps to reproduce the issue Step 1 -Create a random raster with the following extent 372783.6000,372830.4000,7026751.5000,7029915.6000 [EPSG:32618] ![image](https://user-images.githubusercontent.com/15986881/172176813-a6c907de-3db4-4822-b068-8e685cfa2099.png) Step 2 - Try to create a vector grid with cell size larger than the extent The extent considered is obtained with "Calculate from layer" and selecting the previous raster ![image](https://user-images.githubusercontent.com/15986881/172177087-62f75b31-b7fa-45e9-a722-060c20eaf990.png) It does not work (but I think that's a correct behavior). ![image](https://user-images.githubusercontent.com/15986881/172177231-1850f428-2dcf-4cf5-9ff4-98925cb29130.png) Step 3 - Then change **manually** the extent of the requested grid by at least the width of the cells. ![image](https://user-images.githubusercontent.com/15986881/172178239-4b6d1a8b-419e-4848-af0f-37739d5a93d3.png) Success! The grid is created. Step 4 - Problem However, if you zoom in the upper left border of the grid, you will see that it doesn't fit to the raster border ... ![image](https://user-images.githubusercontent.com/15986881/172179220-b472767e-d1bf-416b-9d25-df4244a4ae17.png) ... and when you check the extent of the grid, you will see that their values correspond to the rounded values of the original extent (the one manually modified. i.e 372783.6000,373080.4000,7026751.5000,7029915.6000 [EPSG:32618]). ![image](https://user-images.githubusercontent.com/15986881/172179640-cd3acf81-45ae-47be-89d9-9b3921518bed.png) Step 5 - Work around Create a copy of the original raster (save raster as ...) 
but increasing the extent of the output file to be larger than the grid cell ![image](https://user-images.githubusercontent.com/15986881/172182395-bd7319ff-57bb-4ca6-94ba-cd79913be235.png) Create the grid on this new raster, obtaining the extent with "Calculate from layer" ![image](https://user-images.githubusercontent.com/15986881/172183186-b35d1882-179d-41c0-a772-cd47875d10bf.png) This time, no problem to create the grid and it fits the border of the raster (see hashed polygon vs the previous grid vs the raster) ![image](https://user-images.githubusercontent.com/15986881/172183783-5f1de8c2-5b6d-49b0-814c-a2891736c61d.png) ### Versions <!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body> QGIS version | 3.24.3-Tisler | QGIS code revision | cf22b74e -- | -- | -- | -- Qt version | 5.15.3 Python version | 3.9.5 GDAL/OGR version | 3.4.3 PROJ version | 9.0.0 EPSG Registry database version | v10.054 (2022-02-13) GEOS version | 3.10.2-CAPI-1.16.0 SQLite version | 3.38.1 PDAL version | 2.3.0 PostgreSQL client version | unknown SpatiaLite version | 5.0.1 QWT version | 6.1.6 QScintilla2 version | 2.13.1 OS version | Windows 10 Version 1909   |   |   |   Active Python plugins db_manager | 0.1.20 grassprovider | 2.12.99 processing | 2.12.99 sagaprovider | 2.12.99 </body></html><!--EndFragment-->QGIS version 3.24.3-Tisler QGIS code revision [cf22b74e](https://github.com/qgis/QGIS/commit/cf22b74e) Qt version 5.15.3 Python version 3.9.5 GDAL/OGR version 3.4.3 PROJ version 9.0.0 EPSG Registry database version v10.054 (2022-02-13) GEOS version 3.10.2-CAPI-1.16.0 SQLite version 3.38.1 PDAL version 2.3.0 PostgreSQL client version unknown SpatiaLite version 5.0.1 QWT version 6.1.6 QScintilla2 version 2.13.1 OS version Windows 10 Version 1909 Active Python plugins 
db_manager 0.1.20 grassprovider 2.12.99 processing 2.12.99 sagaprovider 2.12.99 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [ ] I tried with a new QGIS profile ### Additional context _No response_
label: process
text:
vector grid tool result with wrong extent what is the bug or the crash using vector research tools create grid results in a grid with an incorrect extent when this extent is given manually see steps below steps to reproduce the issue step create a random raster with the following extent step try to create a vector grid with cell size larger than the extent the extent considered is obtained with calculate from layer and selecting the previous raster it does not work but i think that s a correct behavior step then change manually the extent of the requested grid by at least the width of the cells success the grid is created step problem however if you zoom in the upper left border of the grid you will see that it doesn t fit to the raster border and when you check the extent of the grid you will see that their values correspond to the rounded values of the original extent the one manually modified i e step work around create a copy of the original raster save raster as but increasing the extent of the output file to be larger than the grid cell create the grid on this new raster obtaining the extent with calculate from layer this time no problem to create the grid and it fits the border of the raster see hashed polygon vs the previous grid vs the raster versions doctype html public dtd html en p li white space pre wrap qgis version tisler qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version unknown spatialite version qwt version version os version windows version         active python plugins db manager grassprovider processing sagaprovider qgis version tisler qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version unknown spatialite version qwt version version os version windows version active python plugins db manager 
grassprovider processing sagaprovider supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
binary_label: 1
Unnamed: 0: 262,658
id: 8,272,300,517
type: IssuesEvent
created_at: 2018-09-16 18:41:22
repo: javaee/glassfish
repo_url: https://api.github.com/repos/javaee/glassfish
action: closed
title: InjectionManager.inject doesn't handle PostConstruct methods properly
labels: Component: naming ERR: Assignee Priority: Minor Type: Bug
body:
The InjectionManagerImpl.inject method takes a boolean that controls whether postConstruct methods are called. Fortunately, most of the time it's called with "false". I think the only time it's called with "true" is when injecting an app client main class. If called with true, the PostConstruct invocation logic ignores method overriding and may call PostConstruct methods that shouldn't be called, or may call the same method more than once. #### Affected Versions [4.0]
index: 1.0
text_combine:
InjectionManager.inject doesn't handle PostConstruct methods properly - The InjectionManagerImpl.inject method takes a boolean that controls whether postConstruct methods are called. Fortunately, most of the time it's called with "false". I think the only time it's called with "true" is when injecting an app client main class. If called with true, the PostConstruct invocation logic ignores method overriding and may call PostConstruct methods that shouldn't be called, or may call the same method more than once. #### Affected Versions [4.0]
label: non_process
text:
injectionmanager inject doesn t handle postconstruct methods properly the injectionmanagerimpl inject method takes a boolean that controls whether postconstruct methods are called fortunately most of the time it s called with false i think the only time it s called with true is when injecting an app client main class if called with true the postconstruct invocation logic ignores method overriding and may call postconstruct methods that shouldn t be called or may call the same method more than once affected versions
binary_label: 0
Unnamed: 0: 13,203
id: 15,648,952,796
type: IssuesEvent
created_at: 2021-03-23 06:45:08
repo: GoogleCloudPlatform/fda-mystudies
repo_url: https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
action: closed
title: [PM] Password should be expire for every 90 days and enforce the user to reset the Password
labels: P1 Participant manager datastore Process: Enhancement Process: Tested dev Unknown backend
body:
Password should be expired for every 90 days and enforce the user to reset the Password, in case the password is not changed within that period by the user.
index: 2.0
text_combine:
[PM] Password should be expire for every 90 days and enforce the user to reset the Password - Password should be expired for every 90 days and enforce the user to reset the Password, in case the password is not changed within that period by the user.
label: process
text:
password should be expire for every days and enforce the user to reset the password password should be expired for every days and enforce the user to reset the password in case the password is not changed within that period by the user
binary_label: 1
Unnamed: 0: 8,638
id: 11,787,295,341
type: IssuesEvent
created_at: 2020-03-17 13:49:06
repo: MicrosoftDocs/vsts-docs
repo_url: https://api.github.com/repos/MicrosoftDocs/vsts-docs
action: closed
title: Pipeline resource variables available at compile time?
labels: Pri1 devops-cicd-process/tech devops/prod doc-bug
body:
Can you clarify if the pipeline resource variables (e.g. `variables.resources.pipeline.Alias.sourceBranch`) are available at compile time or only at run time? These variables act like they are only available at run time but I'd like to know if I'm missing something. Background: I've been trying to use expressions to set a parameter based on the branch of the triggering pipeline resource but haven't been able to do so successfully in the way I was hoping. This doesn't work (compile time var) - appEnvironment is not set: ``` - template: template-repo/deploy.yml@templateRepo parameters: appName: ${{ variables.appName }} ${{ if eq(variables.resources.pipeline.appBuildPipeline.sourceBranch, 'refs/heads/dev') }}: appEnvironment: dev ``` This works (run time var): ``` - job: MyJob condition: eq(variables['resources.pipeline.appBuildPipeline.sourceBranch'], 'refs/heads/dev') steps: - download: artifactToDownload artifact: artifact-dev ``` --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: ee4ec9d0-e0d5-4fb4-7c3e-b84abfa290c2 * Version Independent ID: 3e2b80d9-30e5-0c48-49f0-4fcdfedf5eee * Content: [Resources - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema) * Content Source: [docs/pipelines/process/resources.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/resources.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
index: 1.0
text_combine:
Pipeline resource variables available at compile time? - Can you clarify if the pipeline resource variables (e.g. `variables.resources.pipeline.Alias.sourceBranch`) are available at compile time or only at run time? These variables act like they are only available at run time but I'd like to know if I'm missing something. Background: I've been trying to use expressions to set a parameter based on the branch of the triggering pipeline resource but haven't been able to do so successfully in the way I was hoping. This doesn't work (compile time var) - appEnvironment is not set: ``` - template: template-repo/deploy.yml@templateRepo parameters: appName: ${{ variables.appName }} ${{ if eq(variables.resources.pipeline.appBuildPipeline.sourceBranch, 'refs/heads/dev') }}: appEnvironment: dev ``` This works (run time var): ``` - job: MyJob condition: eq(variables['resources.pipeline.appBuildPipeline.sourceBranch'], 'refs/heads/dev') steps: - download: artifactToDownload artifact: artifact-dev ``` --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: ee4ec9d0-e0d5-4fb4-7c3e-b84abfa290c2 * Version Independent ID: 3e2b80d9-30e5-0c48-49f0-4fcdfedf5eee * Content: [Resources - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema) * Content Source: [docs/pipelines/process/resources.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/resources.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
pipeline resource variables available at compile time can you clarify if the pipeline resource variables e g variables resources pipeline alias sourcebranch are available at compile time or only at run time these variables act like they are only available at run time but i d like to know if i m missing something background i ve been trying to use expressions to set a parameter based on the branch of the triggering pipeline resource but haven t been able to do so successfully in the way i was hoping this doesn t work compile time var appenvironment is not set template template repo deploy yml templaterepo parameters appname variables appname if eq variables resources pipeline appbuildpipeline sourcebranch refs heads dev appenvironment dev this works run time var job myjob condition eq variables refs heads dev steps download artifacttodownload artifact artifact dev document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
9,893
12,891,011,577
IssuesEvent
2020-07-13 16:58:12
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
Many versions wait for approval
Pri1 devops-cicd-process/tech devops/prod product-question
Let's assume YAML multi-release pipeline with stages. Run many releases of pipeline and stage-X with approval is reached by many versions. In old Release pipeline, new version cancels waiting for approval by old version. In YAML pipeline, both of them are active for approval. Is it expected behavior? Is it possible to configure auto-cancel waiting for approval for old versions? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 4266f72c-c774-0046-4593-d01eb775d3c3 * Version Independent ID: f20827aa-a6c5-96a8-5969-e576ffbc2e38 * Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
Many versions wait for approval - Let's assume YAML multi-release pipeline with stages. Run many releases of pipeline and stage-X with approval is reached by many versions. In old Release pipeline, new version cancels waiting for approval by old version. In YAML pipeline, both of them are active for approval. Is it expected behavior? Is it possible to configure auto-cancel waiting for approval for old versions? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 4266f72c-c774-0046-4593-d01eb775d3c3 * Version Independent ID: f20827aa-a6c5-96a8-5969-e576ffbc2e38 * Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
many versions wait for approval let s assume yaml multi release pipeline with stages run many releases of pipeline and stage x with approval is reached by many versions in old release pipeline new version cancels waiting for approval by old version in yaml pipeline both of them are active for approval is it expected behavior is it possible to configure auto cancel waiting for approval for old versions document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
68,608
8,310,096,441
IssuesEvent
2018-09-24 09:28:52
python-trio/trio
https://api.github.com/repos/python-trio/trio
opened
make trio.BlockingTrioPortal and trio.Lock work together
design discussion user happiness
In [this SO question](https://stackoverflow.com/questions/52468911/python-ways-to-synchronize-trio-tasks-and-regular-threads), @Nikratio asks about how to share a lock between Trio and a thread. There are some advantages to using `trio.Lock`, and having the thread access it through a `BlockingTrioPortal` – in particular, it means that the Trio-side code can use cancellation as normal. (It's also maybe cheaper? I haven't measured, but it seems plausible.) However, if you try this, you'll get bitten in a surprising way: `trio.Lock` tracks which task acquired it, and wants that task to release it. This is intended to help avoid mistakes and give us the option of implementing deadlock detection later (#182), but here it creates a problem: if you do `portal.run(lock.acquire)`, and then later `portal.run_sync(lock.release)`, then you'll [get an error](https://github.com/python-trio/trio/blob/46f44057a34c256fd08a45c71214175e1fcd10c6/trio/_sync.py#L583-L584), because the `BlockingTrioPortal` uses a different backing task for each call, so from the `Lock`'s perspective the task that's trying to release it is some random task unrelated to the one that acquired it. This is unfortunate, we should fix it somehow. Some options: * Make `trio.Lock` less picky about who releases it. (Like `threading.Lock`, which is totally happy for different threads to call `acquire` and `release`.) Downsides: this would mean we can never add deadlock detection or otherwise report on lock ownership when debugging; feels kind of weird and error-prone; inconsistent with recursive locks (which inherently have to track ownership). Doesn't solve the problem for other context managers that have the same issue (e.g. 
nurseries – see also https://github.com/python-trio/trio-asyncio/issues/42 which is about context-switching between trio and asyncio, which has essentially isomorphic concerns) * Add an API `BlockingTrioPortal` to wrap a context manager, so you write something like `with portal.async_with(lock): ...`, and `BlockingTrioPortal.async_with` is clever enough to allocate a single backing task and use it for calling both `__aenter__` and `__aexit__`. * Make it so `BlockingTrioPortal` somehow maintains a single backing task across multiple operations automatically. Managing the task lifetime becomes a bit tricky here: what if multiple threads use the same portal? Do we need to add a `BlockingTrioPortal.close` API to shut down the backing task(s)? I don't think there's any actual show-stoppers that would prevent killing the backing task(s) from `__del__`, though of course it's pretty complicated to do. Or we could change the API to make this easier to track, e.g. by having a `with portal_factory.open() as portal: ...` that you have to do in the thread to get a `portal` handle? * Change the trio<->thread API entirely, using something like @agronholm's `async with in_trio: ...` / `async with in_thread: ...` trick. https://github.com/python-trio/trio-asyncio/issues/42 has arguments for why this makes sense for trio/asyncio transitions, #649 is the trio issue for discussing the low-level API we need in the core to make this possible. I guess whether this is a good idea depends a lot on how people are using this... do they really want to switch back and forth between trio and a thread? if so it's great. Do they just want to acquire a lock, and otherwise stay in the thread? If so then writing `async with in_trio: async with lock: async with in_thread: ...` would be pretty annoying (and there's some question about how to get back into the same thread, or if that even matters). 
I guess that could be shortened to `async with in_trio, lock, in_thread: ...`, but I don't know if that helps much :-).
1.0
make trio.BlockingTrioPortal and trio.Lock work together - In [this SO question](https://stackoverflow.com/questions/52468911/python-ways-to-synchronize-trio-tasks-and-regular-threads), @Nikratio asks about how to share a lock between Trio and a thread. There are some advantages to using `trio.Lock`, and having the thread access it through a `BlockingTrioPortal` – in particular, it means that the Trio-side code can use cancellation as normal. (It's also maybe cheaper? I haven't measured, but it seems plausible.) However, if you try this, you'll get bitten in a surprising way: `trio.Lock` tracks which task acquired it, and wants that task to release it. This is intended to help avoid mistakes and give us the option of implementing deadlock detection later (#182), but here it creates a problem: if you do `portal.run(lock.acquire)`, and then later `portal.run_sync(lock.release)`, then you'll [get an error](https://github.com/python-trio/trio/blob/46f44057a34c256fd08a45c71214175e1fcd10c6/trio/_sync.py#L583-L584), because the `BlockingTrioPortal` uses a different backing task for each call, so from the `Lock`'s perspective the task that's trying to release it is some random task unrelated to the one that acquired it. This is unfortunate, we should fix it somehow. Some options: * Make `trio.Lock` less picky about who releases it. (Like `threading.Lock`, which is totally happy for different threads to call `acquire` and `release`.) Downsides: this would mean we can never add deadlock detection or otherwise report on lock ownership when debugging; feels kind of weird and error-prone; inconsistent with recursive locks (which inherently have to track ownership). Doesn't solve the problem for other context managers that have the same issue (e.g. 
nurseries – see also https://github.com/python-trio/trio-asyncio/issues/42 which is about context-switching between trio and asyncio, which has essentially isomorphic concerns) * Add an API `BlockingTrioPortal` to wrap a context manager, so you write something like `with portal.async_with(lock): ...`, and `BlockingTrioPortal.async_with` is clever enough to allocate a single backing task and use it for calling both `__aenter__` and `__aexit__`. * Make it so `BlockingTrioPortal` somehow maintains a single backing task across multiple operations automatically. Managing the task lifetime becomes a bit tricky here: what if multiple threads use the same portal? Do we need to add a `BlockingTrioPortal.close` API to shut down the backing task(s)? I don't think there's any actual show-stoppers that would prevent killing the backing task(s) from `__del__`, though of course it's pretty complicated to do. Or we could change the API to make this easier to track, e.g. by having a `with portal_factory.open() as portal: ...` that you have to do in the thread to get a `portal` handle? * Change the trio<->thread API entirely, using something like @agronholm's `async with in_trio: ...` / `async with in_thread: ...` trick. https://github.com/python-trio/trio-asyncio/issues/42 has arguments for why this makes sense for trio/asyncio transitions, #649 is the trio issue for discussing the low-level API we need in the core to make this possible. I guess whether this is a good idea depends a lot on how people are using this... do they really want to switch back and forth between trio and a thread? if so it's great. Do they just want to acquire a lock, and otherwise stay in the thread? If so then writing `async with in_trio: async with lock: async with in_thread: ...` would be pretty annoying (and there's some question about how to get back into the same thread, or if that even matters). 
I guess that could be shortened to `async with in_trio, lock, in_thread: ...`, but I don't know if that helps much :-).
non_process
make trio blockingtrioportal and trio lock work together in nikratio asks about how to share a lock between trio and a thread there are some advantages to using trio lock and having the thread access it through a blockingtrioportal – in particular it means that the trio side code can use cancellation as normal it s also maybe cheaper i haven t measured but it seems plausible however if you try this you ll get bitten in a surprising way trio lock tracks which task acquired it and wants that task to release it this is intended to help avoid mistakes and give us the option of implementing deadlock detection later but here it creates a problem if you do portal run lock acquire and then later portal run sync lock release then you ll because the blockingtrioportal uses a different backing task for each call so from the lock s perspective the task that s trying to release it is some random task unrelated to the one that acquired it this is unfortunate we should fix it somehow some options make trio lock less picky about who releases it like threading lock which is totally happy for different threads to call acquire and release downsides this would mean we can never add deadlock detection or otherwise report on lock ownership when debugging feels kind of weird and error prone inconsistent with recursive locks which inherently have to track ownership doesn t solve the problem for other context managers that have the same issue e g nurseries – see also which is about context switching between trio and asyncio which has essentially isomorphic concerns add an api blockingtrioportal to wrap a context manager so you write something like with portal async with lock and blockingtrioportal async with is clever enough to allocate a single backing task and use it for calling both aenter and aexit make it so blockingtrioportal somehow maintains a single backing task across multiple operations automatically managing the task lifetime becomes a bit tricky here what if multiple threads 
use the same portal do we need to add a blockingtrioportal close api to shut down the backing task s i don t think there s any actual show stoppers that would prevent killing the backing task s from del though of course it s pretty complicated to do or we could change the api to make this easier to track e g by having a with portal factory open as portal that you have to do in the thread to get a portal handle change the trio thread api entirely using something like agronholm s async with in trio async with in thread trick has arguments for why this makes sense for trio asyncio transitions is the trio issue for discussing the low level api we need in the core to make this possible i guess whether this is a good idea depends a lot on how people are using this do they really want to switch back and forth between trio and a thread if so it s great do they just want to acquire a lock and otherwise stay in the thread if so then writing async with in trio async with lock async with in thread would be pretty annoying and there s some question about how to get back into the same thread or if that even matters i guess that could be shortened to async with in trio lock in thread but i don t know if that helps much
0
21,602
30,004,925,160
IssuesEvent
2023-06-26 11:50:32
firebase/firebase-cpp-sdk
https://api.github.com/repos/firebase/firebase-cpp-sdk
reopened
[C++] Nightly Integration Testing Report for Firestore
type: process nightly-testing
<hidden value="integration-test-status-comment"></hidden> ### ✅&nbsp; [build against repo] Integration test succeeded! Requested by @sunmou99 on commit 9723100f301952fdfdeb31a412dec9b27873683a Last updated: Sun Jun 25 04:35 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/5369162457)** <hidden value="integration-test-status-comment"></hidden> *** ### ✅&nbsp; [build against SDK] Integration test succeeded! Requested by @firebase-workflow-trigger[bot] on commit 9723100f301952fdfdeb31a412dec9b27873683a Last updated: Sun Jun 25 07:12 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/5369831583)** <hidden value="integration-test-status-comment"></hidden> *** ### ✅&nbsp; [build against tip] Integration test succeeded! Requested by @sunmou99 on commit 9723100f301952fdfdeb31a412dec9b27873683a Last updated: Mon Jun 26 04:49 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/5377330622)**
1.0
[C++] Nightly Integration Testing Report for Firestore - <hidden value="integration-test-status-comment"></hidden> ### ✅&nbsp; [build against repo] Integration test succeeded! Requested by @sunmou99 on commit 9723100f301952fdfdeb31a412dec9b27873683a Last updated: Sun Jun 25 04:35 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/5369162457)** <hidden value="integration-test-status-comment"></hidden> *** ### ✅&nbsp; [build against SDK] Integration test succeeded! Requested by @firebase-workflow-trigger[bot] on commit 9723100f301952fdfdeb31a412dec9b27873683a Last updated: Sun Jun 25 07:12 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/5369831583)** <hidden value="integration-test-status-comment"></hidden> *** ### ✅&nbsp; [build against tip] Integration test succeeded! Requested by @sunmou99 on commit 9723100f301952fdfdeb31a412dec9b27873683a Last updated: Mon Jun 26 04:49 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/5377330622)**
process
nightly integration testing report for firestore ✅ nbsp integration test succeeded requested by on commit last updated sun jun pdt ✅ nbsp integration test succeeded requested by firebase workflow trigger on commit last updated sun jun pdt ✅ nbsp integration test succeeded requested by on commit last updated mon jun pdt
1
10,282
13,132,926,585
IssuesEvent
2020-08-06 19:52:52
RockefellerArchiveCenter/request_broker
https://api.github.com/repos/RockefellerArchiveCenter/request_broker
closed
Check each requested item for available delivery formats
process request
## Describe the solution you'd like Each item in a request must be deliverable in the reading room or for duplication. Requests must meet one of the following two conditions: ``` One or more instances are associated with the records One or more digital object links are associated with the records ```
1.0
Check each requested item for available delivery formats - ## Describe the solution you'd like Each item in a request must be deliverable in the reading room or for duplication. Requests must meet one of the following two conditions: ``` One or more instances are associated with the records One or more digital object links are associated with the records ```
process
check each requested item for available delivery formats describe the solution you d like each item in a request must be deliverable in the reading room or for duplication requests must meet one of the following two conditions one or more instances are associated with the records one or more digital object links are associated with the records
1
772,300
27,115,376,592
IssuesEvent
2023-02-15 18:08:52
dtcenter/METplus
https://api.github.com/repos/dtcenter/METplus
closed
Bugfix: StatAnalysis cannot be configured to run once for each valid time
type: bug component: use case wrapper priority: blocker requestor: NOAA/EMC required: FOR DEVELOPMENT RELEASE METplus: Configuration MET: Gridded Analysis Tools
*Replace italics below with details for this issue.* ## Describe the Problem ## *Provide a clear and concise description of the bug here.* ### Expected Behavior ### *Provide a clear and concise description of what you expected to happen here.* ### Environment ### Describe your runtime environment: *1. Machine: (e.g. HPC name, Linux Workstation, Mac Laptop)* *2. OS: (e.g. RedHat Linux, MacOS)* *3. Software version number(s)* ### To Reproduce ### Describe the steps to reproduce the behavior: *1. Go to '...'* *2. Click on '....'* *3. Scroll down to '....'* *4. See error* *Post relevant sample data following these instructions:* *https://dtcenter.org/community-code/model-evaluation-tools-met/met-help-desk#ftp* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ### Funding Source ### *Define the source of funding and account keys here or state NONE.* ## Define the Metadata ## ### Assignee ### - [ ] Select **engineer(s)** or **no engineer** required - [ ] Select **scientist(s)** or **no scientist** required ### Labels ### - [ ] Select **component(s)** - [ ] Select **priority** - [ ] Select **requestor(s)** ### Projects and Milestone ### - [ ] Select **Organization** level **Project** for support of the current coordinated release - [ ] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label - [ ] Select **Milestone** as the next bugfix version ## Define Related Issue(s) ## Consider the impact to the other METplus components. 
- [ ] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdataio](https://github.com/dtcenter/METdataio/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) ## Bugfix Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**. - [ ] Fork this repository or create a branch of **main_\<Version>**. Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>` - [ ] Fix the bug and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Add any new Python packages to the [METplus Components Python Requirements](https://metplus.readthedocs.io/en/develop/Users_Guide/overview.html#metplus-components-python-requirements) table. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **main_\<Version>**. Pull request: `bugfix <Issue Number> main_<Version> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Development** issues Select: **Organization** level software support **Project** for the current coordinated release Select: **Milestone** as the next bugfix version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Complete the steps above to fix the bug on the **develop** branch. 
Branch name: `bugfix_<Issue Number>_develop_<Description>` Pull request: `bugfix <Issue Number> develop <Description>` Select: **Reviewer(s)** and **Development** issues Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Close this issue.
1.0
Bugfix: StatAnalysis cannot be configured to run once for each valid time - *Replace italics below with details for this issue.* ## Describe the Problem ## *Provide a clear and concise description of the bug here.* ### Expected Behavior ### *Provide a clear and concise description of what you expected to happen here.* ### Environment ### Describe your runtime environment: *1. Machine: (e.g. HPC name, Linux Workstation, Mac Laptop)* *2. OS: (e.g. RedHat Linux, MacOS)* *3. Software version number(s)* ### To Reproduce ### Describe the steps to reproduce the behavior: *1. Go to '...'* *2. Click on '....'* *3. Scroll down to '....'* *4. See error* *Post relevant sample data following these instructions:* *https://dtcenter.org/community-code/model-evaluation-tools-met/met-help-desk#ftp* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ### Funding Source ### *Define the source of funding and account keys here or state NONE.* ## Define the Metadata ## ### Assignee ### - [ ] Select **engineer(s)** or **no engineer** required - [ ] Select **scientist(s)** or **no scientist** required ### Labels ### - [ ] Select **component(s)** - [ ] Select **priority** - [ ] Select **requestor(s)** ### Projects and Milestone ### - [ ] Select **Organization** level **Project** for support of the current coordinated release - [ ] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label - [ ] Select **Milestone** as the next bugfix version ## Define Related Issue(s) ## Consider the impact to the other METplus components. 
- [ ] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdataio](https://github.com/dtcenter/METdataio/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) ## Bugfix Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**. - [ ] Fork this repository or create a branch of **main_\<Version>**. Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>` - [ ] Fix the bug and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Add any new Python packages to the [METplus Components Python Requirements](https://metplus.readthedocs.io/en/develop/Users_Guide/overview.html#metplus-components-python-requirements) table. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **main_\<Version>**. Pull request: `bugfix <Issue Number> main_<Version> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Development** issues Select: **Organization** level software support **Project** for the current coordinated release Select: **Milestone** as the next bugfix version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Complete the steps above to fix the bug on the **develop** branch. 
Branch name: `bugfix_<Issue Number>_develop_<Description>` Pull request: `bugfix <Issue Number> develop <Description>` Select: **Reviewer(s)** and **Development** issues Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Close this issue.
non_process
bugfix statanalysis cannot be configured to run once for each valid time replace italics below with details for this issue describe the problem provide a clear and concise description of the bug here expected behavior provide a clear and concise description of what you expected to happen here environment describe your runtime environment machine e g hpc name linux workstation mac laptop os e g redhat linux macos software version number s to reproduce describe the steps to reproduce the behavior go to click on scroll down to see error post relevant sample data following these instructions relevant deadlines list relevant project deadlines here or state none funding source define the source of funding and account keys here or state none define the metadata assignee select engineer s or no engineer required select scientist s or no scientist required labels select component s select priority select requestor s projects and milestone select organization level project for support of the current coordinated release select repository level project for development toward the next official release or add alert need project assignment label select milestone as the next bugfix version define related issue s consider the impact to the other metplus components bugfix checklist see the for details complete the issue definition above including the time estimate and funding source fork this repository or create a branch of main branch name bugfix main fix the bug and test your changes add update log messages for easier debugging add update unit tests add update documentation add any new python packages to the table push local changes to github submit a pull request to merge into main pull request bugfix main define the pull request metadata as permissions allow select reviewer s and development issues select organization level software support project for the current coordinated release select milestone as the next bugfix version iterate until the reviewer s accept and merge your 
changes delete your fork or branch complete the steps above to fix the bug on the develop branch branch name bugfix develop pull request bugfix develop select reviewer s and development issues select repository level development cycle project for the next official release select milestone as the next official version close this issue
0
20,608
27,273,818,793
IssuesEvent
2023-02-23 02:00:07
lizhihao6/get-daily-arxiv-noti
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
opened
New submissions for Thu, 23 Feb 23
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
## Keyword: events ### Connecting Vision and Language with Video Localized Narratives - **Authors:** Paul Voigtlaender, Soravit Changpinyo, Jordi Pont-Tuset, Radu Soricut, Vittorio Ferrari - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2302.11217 - **Pdf link:** https://arxiv.org/pdf/2302.11217 - **Abstract** We propose Video Localized Narratives, a new form of multimodal video annotations connecting vision and language. In the original Localized Narratives, annotators speak and move their mouse simultaneously on an image, thus grounding each word with a mouse trace segment. However, this is challenging on a video. Our new protocol empowers annotators to tell the story of a video with Localized Narratives, capturing even complex events involving multiple actors interacting with each other and with several passive objects. We annotated 20k videos of the OVIS, UVO, and Oops datasets, totalling 1.7M words. Based on this data, we also construct new benchmarks for the video narrative grounding and video question-answering tasks, and provide reference results from strong baseline models. Our annotations are available at https://google.github.io/video-localized-narratives/. ## Keyword: event camera There is no result ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB There is no result ## Keyword: ISP ### Invariant Target Detection in Images through the Normalized 2-D Correlation Technique - **Authors:** Fatin E. M. Al-Obaidi, Anwar H. Al-Saleh, Shaymaa H. Kafi, Ali J.Karam, Ali A. D. 
Al-Zuky - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2302.11196 - **Pdf link:** https://arxiv.org/pdf/2302.11196 - **Abstract** The normalized 2-D correlation technique is a robust method for detecting targets in images due to its ability to remain invariant under rotation, translation, and scaling. This paper examines the impact of translation, and scaling on target identification in images. The results indicate a high level of accuracy in detecting targets, even when they are exhibit variations in location and size. The results indicate that the similarity between the image and the two used targets improves as the resize ratio increases. All statistical estimators demonstrate a strong similarity between the original and extracted targets. The elapsed time for all scenarios falls within the range (44.75-44.85), (37.48-37.73) seconds for bird and children targets respectively, and the correlation coefficient displays stable relationships with values that fall within the range of (0.90-0.98) and (0.87-0.93) for bird and children targets respectively. ## Keyword: image signal processing There is no result ## Keyword: image signal process There is no result ## Keyword: compression There is no result ## Keyword: RAW ### Transformer-Based Sensor Fusion for Autonomous Driving: A Survey - **Authors:** Apoorv Singh - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2302.11481 - **Pdf link:** https://arxiv.org/pdf/2302.11481 - **Abstract** Sensor fusion is an essential topic in many perception systems, such as autonomous driving and robotics. Transformers-based detection head and CNN-based feature encoder to extract features from raw sensor-data has emerged as one of the best performing sensor-fusion 3D-detection-framework, according to the dataset leaderboards. 
In this work we provide an in-depth literature survey of transformer based 3D-object detection task in the recent past, primarily focusing on the sensor fusion. We also briefly go through the Vision transformers (ViT) basics, so that readers can easily follow through the paper. Moreover, we also briefly go through few of the non-transformer based less-dominant methods for sensor fusion for autonomous driving. In conclusion we summarize with sensor-fusion trends to follow and provoke future research. More updated summary can be found at: https://github.com/ApoorvRoboticist/Transformers-Sensor-Fusion ## Keyword: raw image There is no result
2.0
New submissions for Thu, 23 Feb 23 - ## Keyword: events ### Connecting Vision and Language with Video Localized Narratives - **Authors:** Paul Voigtlaender, Soravit Changpinyo, Jordi Pont-Tuset, Radu Soricut, Vittorio Ferrari - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2302.11217 - **Pdf link:** https://arxiv.org/pdf/2302.11217 - **Abstract** We propose Video Localized Narratives, a new form of multimodal video annotations connecting vision and language. In the original Localized Narratives, annotators speak and move their mouse simultaneously on an image, thus grounding each word with a mouse trace segment. However, this is challenging on a video. Our new protocol empowers annotators to tell the story of a video with Localized Narratives, capturing even complex events involving multiple actors interacting with each other and with several passive objects. We annotated 20k videos of the OVIS, UVO, and Oops datasets, totalling 1.7M words. Based on this data, we also construct new benchmarks for the video narrative grounding and video question-answering tasks, and provide reference results from strong baseline models. Our annotations are available at https://google.github.io/video-localized-narratives/. ## Keyword: event camera There is no result ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB There is no result ## Keyword: ISP ### Invariant Target Detection in Images through the Normalized 2-D Correlation Technique - **Authors:** Fatin E. M. Al-Obaidi, Anwar H. Al-Saleh, Shaymaa H. Kafi, Ali J.Karam, Ali A. D. 
Al-Zuky - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2302.11196 - **Pdf link:** https://arxiv.org/pdf/2302.11196 - **Abstract** The normalized 2-D correlation technique is a robust method for detecting targets in images due to its ability to remain invariant under rotation, translation, and scaling. This paper examines the impact of translation and scaling on target identification in images. The results indicate a high level of accuracy in detecting targets, even when they exhibit variations in location and size. The results indicate that the similarity between the image and the two used targets improves as the resize ratio increases. All statistical estimators demonstrate a strong similarity between the original and extracted targets. The elapsed time for all scenarios falls within the range (44.75-44.85), (37.48-37.73) seconds for bird and children targets respectively, and the correlation coefficient displays stable relationships with values that fall within the range of (0.90-0.98) and (0.87-0.93) for bird and children targets respectively. ## Keyword: image signal processing There is no result ## Keyword: image signal process There is no result ## Keyword: compression There is no result ## Keyword: RAW ### Transformer-Based Sensor Fusion for Autonomous Driving: A Survey - **Authors:** Apoorv Singh - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2302.11481 - **Pdf link:** https://arxiv.org/pdf/2302.11481 - **Abstract** Sensor fusion is an essential topic in many perception systems, such as autonomous driving and robotics. Transformers-based detection head and CNN-based feature encoder to extract features from raw sensor-data has emerged as one of the best performing sensor-fusion 3D-detection-framework, according to the dataset leaderboards. 
In this work we provide an in-depth literature survey of transformer based 3D-object detection task in the recent past, primarily focusing on the sensor fusion. We also briefly go through the Vision transformers (ViT) basics, so that readers can easily follow through the paper. Moreover, we also briefly go through few of the non-transformer based less-dominant methods for sensor fusion for autonomous driving. In conclusion we summarize with sensor-fusion trends to follow and provoke future research. More updated summary can be found at: https://github.com/ApoorvRoboticist/Transformers-Sensor-Fusion ## Keyword: raw image There is no result
process
new submissions for thu feb keyword events connecting vision and language with video localized narratives authors paul voigtlaender soravit changpinyo jordi pont tuset radu soricut vittorio ferrari subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract we propose video localized narratives a new form of multimodal video annotations connecting vision and language in the original localized narratives annotators speak and move their mouse simultaneously on an image thus grounding each word with a mouse trace segment however this is challenging on a video our new protocol empowers annotators to tell the story of a video with localized narratives capturing even complex events involving multiple actors interacting with each other and with several passive objects we annotated videos of the ovis uvo and oops datasets totalling words based on this data we also construct new benchmarks for the video narrative grounding and video question answering tasks and provide reference results from strong baseline models our annotations are available at keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb there is no result keyword isp invariant target detection in images through the normalized d correlation technique authors fatin e m al obaidi anwar h al saleh shaymaa h kafi ali j karam ali a d al zuky subjects computer vision and pattern recognition cs cv image and video processing eess iv arxiv link pdf link abstract the normalized d correlation technique is a robust method for detecting targets in images due to its ability to remain invariant under rotation translation and scaling this paper examines the impact of translation and scaling on target identification in images the results indicate a high level of accuracy in detecting targets even when they exhibit variations in location and size the results indicate that the 
similarity between the image and the two used targets improves as the resize ratio increases all statistical estimators demonstrate a strong similarity between the original and extracted targets the elapsed time for all scenarios falls within the range seconds for bird and children targets respectively and the correlation coefficient displays stable relationships with values that fall within the range of and for bird and children targets respectively keyword image signal processing there is no result keyword image signal process there is no result keyword compression there is no result keyword raw transformer based sensor fusion for autonomous driving a survey authors apoorv singh subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract sensor fusion is an essential topic in many perception systems such as autonomous driving and robotics transformers based detection head and cnn based feature encoder to extract features from raw sensor data has emerged as one of the best performing sensor fusion detection framework according to the dataset leaderboards in this work we provide an in depth literature survey of transformer based object detection task in the recent past primarily focusing on the sensor fusion we also briefly go through the vision transformers vit basics so that readers can easily follow through the paper moreover we also briefly go through few of the non transformer based less dominant methods for sensor fusion for autonomous driving in conclusion we summarize with sensor fusion trends to follow and provoke future research more updated summary can be found at keyword raw image there is no result
1
184,527
31,997,598,145
IssuesEvent
2023-09-21 10:09:49
NIAEFEUP/nitsig
https://api.github.com/repos/NIAEFEUP/nitsig
opened
Multiline info cards
bug ui-design
The current info cards implementation does not take multiple lines into account, this should be fixed ![image](https://github.com/NIAEFEUP/nitsig/assets/14280476/9eb684e9-0604-45c8-b78e-5a973d2a55bc)
1.0
Multiline info cards - The current info cards implementation does not take multiple lines into account, this should be fixed ![image](https://github.com/NIAEFEUP/nitsig/assets/14280476/9eb684e9-0604-45c8-b78e-5a973d2a55bc)
non_process
multiline info cards the current info cards implementation does not take multiple lines into account this should be fixed
0
728
3,214,216,124
IssuesEvent
2015-10-06 23:58:04
GsDevKit/GsDevKit
https://api.github.com/repos/GsDevKit/GsDevKit
closed
Upgrade Grease reference to 1.1.10
in process
I noticed GLASS1 is still referencing Grease 1.0.7.x. Seems to me that an upgrade to 1.1.10 would be good to do. We need (and load) it in Seaside.
1.0
Upgrade Grease reference to 1.1.10 - I noticed GLASS1 is still referencing Grease 1.0.7.x. Seems to me that an upgrade to 1.1.10 would be good to do. We need (and load) it in Seaside.
process
upgrade grease reference to i noticed is still referencing grease x seems to me that an upgrade to would be good to do we need and load it in seaside
1
5,951
8,775,454,331
IssuesEvent
2018-12-18 23:11:47
kerubistan/kerub
https://api.github.com/repos/kerubistan/kerub
opened
monitor physical disks
component:data processing enhancement priority: normal
check the drives periodically for predicted errors, create a problem when a drive fails
1.0
monitor physical disks - check the drives periodically for predicted errors, create a problem when a drive fails
process
monitor physical disks check the drives periodically for predicted errors create a problem when a drive fails
1
102,358
31,927,006,480
IssuesEvent
2023-09-19 03:08:39
apache/camel-quarkus
https://api.github.com/repos/apache/camel-quarkus
closed
[CI] - Camel Main Branch Build Failure
build/camel-main
This is a placeholder issue used by the [nightly sync workflow](https://github.com/apache/camel-quarkus/actions/workflows/camel-master-cron.yaml) for the [`camel-main`](https://github.com/apache/camel-quarkus/tree/camel-main) branch.
1.0
[CI] - Camel Main Branch Build Failure - This is a placeholder issue used by the [nightly sync workflow](https://github.com/apache/camel-quarkus/actions/workflows/camel-master-cron.yaml) for the [`camel-main`](https://github.com/apache/camel-quarkus/tree/camel-main) branch.
non_process
camel main branch build failure this is a placeholder issue used by the for the branch
0
351,248
10,514,571,364
IssuesEvent
2019-09-28 01:43:42
AY1920S1-CS2113T-W17-4/main
https://api.github.com/repos/AY1920S1-CS2113T-W17-4/main
opened
As a Computing student, I can have a do-after task
priority.Medium type.Story
so that I know what tasks need to be done after completing a specific task.
1.0
As a Computing student, I can have a do-after task - so that I know what tasks need to be done after completing a specific task.
non_process
as a computing student i can have a do after task so that i know what tasks need to be done after completing a specific task
0
12,446
14,934,581,563
IssuesEvent
2021-01-25 10:43:32
pystatgen/sgkit
https://api.github.com/repos/pystatgen/sgkit
closed
Run build on a schedule
process + tools
We should run the build daily to pick up changes to dependencies that cause our tests to fail (like #421).
1.0
Run build on a schedule - We should run the build daily to pick up changes to dependencies that cause our tests to fail (like #421).
process
run build on a schedule we should run the build daily to pick up changes to dependencies that cause our tests to fail like
1
160,895
12,520,995,353
IssuesEvent
2020-06-03 16:45:59
aliasrobotics/RVD
https://api.github.com/repos/aliasrobotics/RVD
opened
Use of possibly insecure function - consider using safer ast, ./src/ros_comm/rosparam/src/rosparam/__init__.py:113
bandit bug components software robot component: ROS static analysis testing triage version: melodic
```yaml { "id": 1, "title": "Use of possibly insecure function - consider using safer ast, ./src/ros_comm/rosparam/src/rosparam/__init__.py:113", "type": "bug", "description": "HIGH confidence of MEDIUM severity bug. Use of possibly insecure function - consider using safer ast.literal_eval. ./src/ros_comm/rosparam/src/rosparam/__init__.py:113. See links for more info on the bug.", "cwe": "None", "cve": "None", "keywords": [ "bandit", "bug", "static analysis", "testing", "triage", "bug", "version: melodic", "robot component: ROS", "components software" ], "system": "", "vendor": null, "severity": { "rvss-score": 0, "rvss-vector": "", "severity-description": "", "cvss-score": 0, "cvss-vector": "" }, "links": "", "flaw": { "phase": "testing", "specificity": "subject-specific", "architectural-location": "application-specific", "application": "N/A", "subsystem": "N/A", "package": "N/A", "languages": "None", "date-detected": "2020-06-03 (16:45)", "detected-by": "Alias Robotics", "detected-by-method": "testing static", "date-reported": "2020-06-03 (16:45)", "reported-by": "Alias Robotics", "reported-by-relationship": "automatic", "issue": "", "reproducibility": "always", "trace": "./src/ros_comm/rosparam/src/rosparam/__init__.py:113", "reproduction": "See artifacts below (if available)", "reproduction-image": "" }, "exploitation": { "description": "", "exploitation-image": "", "exploitation-vector": "" }, "mitigation": { "description": "", "pull-request": "", "date-mitigation": "" } } ```
1.0
Use of possibly insecure function - consider using safer ast, ./src/ros_comm/rosparam/src/rosparam/__init__.py:113 - ```yaml { "id": 1, "title": "Use of possibly insecure function - consider using safer ast, ./src/ros_comm/rosparam/src/rosparam/__init__.py:113", "type": "bug", "description": "HIGH confidence of MEDIUM severity bug. Use of possibly insecure function - consider using safer ast.literal_eval. ./src/ros_comm/rosparam/src/rosparam/__init__.py:113. See links for more info on the bug.", "cwe": "None", "cve": "None", "keywords": [ "bandit", "bug", "static analysis", "testing", "triage", "bug", "version: melodic", "robot component: ROS", "components software" ], "system": "", "vendor": null, "severity": { "rvss-score": 0, "rvss-vector": "", "severity-description": "", "cvss-score": 0, "cvss-vector": "" }, "links": "", "flaw": { "phase": "testing", "specificity": "subject-specific", "architectural-location": "application-specific", "application": "N/A", "subsystem": "N/A", "package": "N/A", "languages": "None", "date-detected": "2020-06-03 (16:45)", "detected-by": "Alias Robotics", "detected-by-method": "testing static", "date-reported": "2020-06-03 (16:45)", "reported-by": "Alias Robotics", "reported-by-relationship": "automatic", "issue": "", "reproducibility": "always", "trace": "./src/ros_comm/rosparam/src/rosparam/__init__.py:113", "reproduction": "See artifacts below (if available)", "reproduction-image": "" }, "exploitation": { "description": "", "exploitation-image": "", "exploitation-vector": "" }, "mitigation": { "description": "", "pull-request": "", "date-mitigation": "" } } ```
non_process
use of possibly insecure function consider using safer ast src ros comm rosparam src rosparam init py yaml id title use of possibly insecure function consider using safer ast src ros comm rosparam src rosparam init py type bug description high confidence of medium severity bug use of possibly insecure function consider using safer ast literal eval src ros comm rosparam src rosparam init py see links for more info on the bug cwe none cve none keywords bandit bug static analysis testing triage bug version melodic robot component ros components software system vendor null severity rvss score rvss vector severity description cvss score cvss vector links flaw phase testing specificity subject specific architectural location application specific application n a subsystem n a package n a languages none date detected detected by alias robotics detected by method testing static date reported reported by alias robotics reported by relationship automatic issue reproducibility always trace src ros comm rosparam src rosparam init py reproduction see artifacts below if available reproduction image exploitation description exploitation image exploitation vector mitigation description pull request date mitigation
0
10,166
13,044,162,691
IssuesEvent
2020-07-29 03:47:35
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `AesEncrypt` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `AesEncrypt` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @lonng ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `AesEncrypt` from TiDB - ## Description Port the scalar function `AesEncrypt` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @lonng ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function aesencrypt from tidb description port the scalar function aesencrypt from tidb to coprocessor score mentor s lonng recommended skills rust programming learning materials already implemented expressions ported from tidb
1
3,211
6,266,406,261
IssuesEvent
2017-07-17 01:45:32
nodejs/node
https://api.github.com/repos/nodejs/node
closed
spawnSync can be used to perform OOB memory write
child_process security
<!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: 6.4-8 * **Platform**: * **Subsystem**: <!-- Enter your issue details below this comment. --> Apologies for the multiple issues, this is the last of the bunch and maybe most serious: spawnSync can be used to perform an OOB memory write. Specifically, the `buffer` allocated on line 979 of `spawn_sync.cc`: ``` buffer = new char[list_size + data_size]; ``` is overwritten with `StringBytes::Write` on line 986, by passing a JS argument that is larger than the allocated buffer. 
Here is an exploit using the high-level node API: ```javascript const spawn = require('child_process').spawnSync; let counter = 0; // mess with args const args = [ 'ls', '-a' ]; Object.defineProperty(args, 2, { get: () => { if (counter++ === 2) { // set this to 3 for v6.4, 2 for 8 // compute giant string to overlow let str = 'pwn0'; for (let i =0 ;i <100024; i++) { str+='pwnpwn' } return str; } else { return 'pwn0'; } }, enumerable: true }); args.slice = () => { // so normalizing args does nothing return args; }; args.unshift = (file) => { // need this so the unshift in child_process doesn't throw }; spawn('ls', args); ``` You need to mess with args to get the JS code that calls the binding layer to do what you want, but you can also just call the binding function directly: ```javascript let counter = 0; const spawn_sync = process.binding('spawn_sync'); // compute envPairs as done by child_process let envPairs = []; for (var key in process.env) { envPairs.push(key + '=' + process.env[key]); } // mess with args const args = [ '-a' ]; Object.defineProperty(args, 1, { get: () => { if (counter++ === 2) { // compute giant string to overlow let str = 'pwn0'; for (let i =0 ;i <100024; i++) { str+='pwnpwn' } return str; } else { return 'pwn0'; } }, enumerable: true }); const options = { file: 'ls', args: args, envPairs: envPairs, stdio: [ { type: 'pipe', readable: true, writable: false }, { type: 'pipe', readable: false, writable: true }, { type: 'pipe', readable: false, writable: true } ] }; spawn_sync.spawn(options); ``` + @mlfbrown for working with me on this
1.0
spawnSync can be used to perform OOB memory write - <!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: 6.4-8 * **Platform**: * **Subsystem**: <!-- Enter your issue details below this comment. --> Apologies for the multiple issues, this is the last of the bunch and maybe most serious: spawnSync can be used to perform an OOB memory write. Specifically, the `buffer` allocated on line 979 of `spawn_sync.cc`: ``` buffer = new char[list_size + data_size]; ``` is overwritten with `StringBytes::Write` on line 986, by passing a JS argument that is larger than the allocated buffer. 
Here is an exploit using the high-level node API: ```javascript const spawn = require('child_process').spawnSync; let counter = 0; // mess with args const args = [ 'ls', '-a' ]; Object.defineProperty(args, 2, { get: () => { if (counter++ === 2) { // set this to 3 for v6.4, 2 for 8 // compute giant string to overlow let str = 'pwn0'; for (let i =0 ;i <100024; i++) { str+='pwnpwn' } return str; } else { return 'pwn0'; } }, enumerable: true }); args.slice = () => { // so normalizing args does nothing return args; }; args.unshift = (file) => { // need this so the unshift in child_process doesn't throw }; spawn('ls', args); ``` You need to mess with args to get the JS code that calls the binding layer to do what you want, but you can also just call the binding function directly: ```javascript let counter = 0; const spawn_sync = process.binding('spawn_sync'); // compute envPairs as done by child_process let envPairs = []; for (var key in process.env) { envPairs.push(key + '=' + process.env[key]); } // mess with args const args = [ '-a' ]; Object.defineProperty(args, 1, { get: () => { if (counter++ === 2) { // compute giant string to overlow let str = 'pwn0'; for (let i =0 ;i <100024; i++) { str+='pwnpwn' } return str; } else { return 'pwn0'; } }, enumerable: true }); const options = { file: 'ls', args: args, envPairs: envPairs, stdio: [ { type: 'pipe', readable: true, writable: false }, { type: 'pipe', readable: false, writable: true }, { type: 'pipe', readable: false, writable: true } ] }; spawn_sync.spawn(options); ``` + @mlfbrown for working with me on this
process
spawnsync can be used to perform oob memory write thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version platform subsystem apologies for the multiple issues this is the last of the bunch and maybe most serious spawnsync can be used to perform an oob memory write specifically the buffer allocated on line of spawn sync cc buffer new char is overwritten with stringbytes write on line by passing a js argument that is larger than the allocated buffer here is an exploit using the high level node api javascript const spawn require child process spawnsync let counter mess with args const args object defineproperty args get if counter set this to for for compute giant string to overlow let str for let i i i str pwnpwn return str else return enumerable true args slice so normalizing args does nothing return args args unshift file need this so the unshift in child process doesn t throw spawn ls args you need to mess with args to get the js code that calls the binding layer to do what you want but you can also just call the binding function directly javascript let counter const spawn sync process binding spawn sync compute envpairs as done by child process let envpairs for var key in process env envpairs push key process env mess with args const args object defineproperty args get if counter compute giant string to overlow let str for let i i i str pwnpwn return str else return enumerable true const options file ls args args envpairs envpairs stdio type pipe readable true writable false type pipe readable false 
writable true type pipe readable false writable true spawn sync spawn options mlfbrown for working with me on this
1
438,785
12,651,412,594
IssuesEvent
2020-06-17 00:14:11
linkerd/linkerd2
https://api.github.com/repos/linkerd/linkerd2
closed
multicluster: support per-mirror credentials
area/cli priority/P0
#4457 raises an issue I didn't notice -- `setup-remote` currently creates an SA/RBAC to be used by all remote gateways. This makes it impossible to revoke access to any one cluster. It's all-or-nothing. It also creates some unnecessary install-time-concerns. Can we change the way that Service Accounts are allocated to do this at link-time rather than at install-time? Basically, something like: ``` # Create new SA/RBAC for the _west_ service-mirror to discover services in the _east_ cluster. linkerd --context=east multicluster allow west | kubectl apply -f - --context=east # Get the raw SA and prepare for use in the _west_ cluster linkerd --context=east multicluster link east | kubectl apply -f - --context=west ``` This could also support reusing credentials for folks who are unconcerned: ``` linkerd --context=east multicluster allow all-the-mirrors | kubectl apply -f - --context=east linkerd --context=east multicluster link all-the-mirrors | kubectl apply -f - --context=west1 linkerd --context=east multicluster link all-the-mirrors | kubectl apply -f - --context=west2 ``` Note that `link` should support a `--cluster-server` flag to override the value in the kubeconfig (for private networks, primarily).
1.0
multicluster: support per-mirror credentials - #4457 raises an issue I didn't notice -- `setup-remote` currently creates an SA/RBAC to be used by all remote gateways. This makes it impossible to revoke access to any one cluster. It's all-or-nothing. It also creates some unnecessary install-time-concerns. Can we change the way that Service Accounts are allocated to do this at link-time rather than at install-time? Basically, something like: ``` # Create new SA/RBAC for the _west_ service-mirror to discover services in the _east_ cluster. linkerd --context=east multicluster allow west | kubectl apply -f - --context=east # Get the raw SA and prepare for use in the _west_ cluster linkerd --context=east multicluster link east | kubectl apply -f - --context=west ``` This could also support reusing credentials for folks who are unconcerned: ``` linkerd --context=east multicluster allow all-the-mirrors | kubectl apply -f - --context=east linkerd --context=east multicluster link all-the-mirrors | kubectl apply -f - --context=west1 linkerd --context=east multicluster link all-the-mirrors | kubectl apply -f - --context=west2 ``` Note that `link` should support a `--cluster-server` flag to override the value in the kubeconfig (for private networks, primarily).
non_process
multicluster support per mirror credentials raises an issue i didn t notice setup remote currently creates an sa rbac to be used by all remote gateways this makes it impossible to revoke access to any one cluster it s all or nothing it also creates some unnecessary install time concerns can we change the way that service accounts are allocated to do this at link time rather than at install time basically something like create new sa rbac for the west service mirror to discover services in the east cluster linkerd context east multicluster allow west kubectl apply f context east get the raw sa and prepare for use in the west cluster linkerd context east multicluster link east kubectl apply f context west this could also support reusing credentials for folks who are unconcerned linkerd context east multicluster allow all the mirrors kubectl apply f context east linkerd context east multicluster link all the mirrors kubectl apply f context linkerd context east multicluster link all the mirrors kubectl apply f context note that link should support a cluster server flag to override the value in the kubeconfig for private networks primarily
0
149,019
19,562,654,829
IssuesEvent
2022-01-03 18:28:29
ibm-cio-vulnerability-scanning/insomnia
https://api.github.com/repos/ibm-cio-vulnerability-scanning/insomnia
opened
CVE-2021-37712 (High) detected in tar-4.4.13.tgz, tar-6.1.0.tgz
security vulnerability
## CVE-2021-37712 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tar-4.4.13.tgz</b>, <b>tar-6.1.0.tgz</b></p></summary> <p> <details><summary><b>tar-4.4.13.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p> <p> Dependency Hierarchy: - webpack-4.43.0.tgz (Root Library) - watchpack-1.6.1.tgz - chokidar-2.1.8.tgz - fsevents-1.2.12.tgz - node-pre-gyp-0.14.0.tgz - :x: **tar-4.4.13.tgz** (Vulnerable Library) </details> <details><summary><b>tar-6.1.0.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p> <p>Path to dependency file: /packages/insomnia-components/package.json</p> <p>Path to vulnerable library: /packages/insomnia-components/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - react-6.2.9.tgz (Root Library) - core-6.2.9.tgz - core-server-6.2.9.tgz - terser-webpack-plugin-3.1.0.tgz - cacache-15.0.6.tgz - :x: **tar-6.1.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/ibm-cio-vulnerability-scanning/insomnia/commit/c08890295e602a2b6dc87c5303b52c137d02f0c9">c08890295e602a2b6dc87c5303b52c137d02f0c9</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. 
This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p. 
<p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution: tar - 4.4.18, 5.0.10, 6.1.9</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-37712 (High) detected in tar-4.4.13.tgz, tar-6.1.0.tgz - ## CVE-2021-37712 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tar-4.4.13.tgz</b>, <b>tar-6.1.0.tgz</b></p></summary> <p> <details><summary><b>tar-4.4.13.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p> <p> Dependency Hierarchy: - webpack-4.43.0.tgz (Root Library) - watchpack-1.6.1.tgz - chokidar-2.1.8.tgz - fsevents-1.2.12.tgz - node-pre-gyp-0.14.0.tgz - :x: **tar-4.4.13.tgz** (Vulnerable Library) </details> <details><summary><b>tar-6.1.0.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p> <p>Path to dependency file: /packages/insomnia-components/package.json</p> <p>Path to vulnerable library: /packages/insomnia-components/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - react-6.2.9.tgz (Root Library) - core-6.2.9.tgz - core-server-6.2.9.tgz - terser-webpack-plugin-3.1.0.tgz - cacache-15.0.6.tgz - :x: **tar-6.1.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/ibm-cio-vulnerability-scanning/insomnia/commit/c08890295e602a2b6dc87c5303b52c137d02f0c9">c08890295e602a2b6dc87c5303b52c137d02f0c9</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. 
node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p. 
<p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution: tar - 4.4.18, 5.0.10, 6.1.9</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in tar tgz tar tgz cve high severity vulnerability vulnerable libraries tar tgz tar tgz tar tgz tar for node library home page a href dependency hierarchy webpack tgz root library watchpack tgz chokidar tgz fsevents tgz node pre gyp tgz x tar tgz vulnerable library tar tgz tar for node library home page a href path to dependency file packages insomnia components package json path to vulnerable library packages insomnia components node modules tar package json dependency hierarchy react tgz root library core tgz core server tgz terser webpack plugin tgz cacache tgz x tar tgz vulnerable library found in head commit a href found in base branch develop vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value additionally on windows systems long path portions would resolve to the same file system entities as their short path counterparts a specially crafted tar archive could thus include a directory with one form of the path followed by a symbolic link with a different string that resolves to the same file system entity followed by a file using the first form by first creating a directory and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an 
arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar step up your open source security game with whitesource
0
7,840
11,013,477,332
IssuesEvent
2019-12-04 20:33:18
thewca/wca-regulations
https://api.github.com/repos/thewca/wca-regulations
closed
WRC Projects (Mid-2019)
2020 process
Here's a rough list of what I want to tackle with new WRC members in 2019: # [Simplifications](https://github.com/orgs/thewca/projects/4#column-6349209) Changes that reduce what a competitor/judge/Delegate needs to know without significantly impacting the nature of the sport. - "Solved or Not" - No +2 for misalignment - Disallow missing/misplaced pieces in the solved state - Clock Pins - Allow distinguishable clock pins. - Simplify pins in scramble / ignore pins - Reduce valid FMC moves to 18 basic moves. # Scramble Accountability - Accuracy: - (A2d1) Scramble signatures - Scramble checkers? - Secrecy - (4b2a) Only the competition's designated Delegates may handle scrambles - (4b2++) Visually isolated scramble tables # References/Guides/Tutorials - FMC grading ("A printable sheet on how to judge FMC (full list of allowed moves, how to count moves, examples of allowed and disallowed notation).") - Scrambling - Orientation - Notation - Multi-Blind - Judging - Inspection - End of Attempt - Misalignments - BLD - Hardware - Examples of common acceptable and unacceptable puzzles. - Currently known Bluetooth puzzles and how to spot/check them. (Possibly private to Delegates.) - Advancement conditions # Fresh Events Ideas: - Tiers - Ephemeral Events - Encourage Unofficial Events - As an alternative to temporary official events/rounds? # Hardware - Cube Covers - O-Rings (or equivalent) at every competition - Bluetooth Cubes # Processes - Ongoing - Manage the process of incoming emails. - Manage the process of making decisions. - Manage the process of following up decisions with documentation changes. - Manage the process of updating Delegates on important decisions. - Manage feedback from the community. 
- Backlog - Read all 2019 Delegate reports - Add backlog of incidents to Incidents log - [`wca-regulations` issues](https://github.com/thewca/wca-regulations/issues) # Code of Conduct - Look at other examples and help draft # Education/Outreach/Inclusion - Allow/encourage competitions to reserve a certain fraction of spots for new competitors. - How to update the Regulations competitions more welcome to newcomers.
1.0
WRC Projects (Mid-2019) - Here's a rough list of what I want to tackle with new WRC members in 2019: # [Simplifications](https://github.com/orgs/thewca/projects/4#column-6349209) Changes that reduce what a competitor/judge/Delegate needs to know without significantly impacting the nature of the sport. - "Solved or Not" - No +2 for misalignment - Disallow missing/misplaced pieces in the solved state - Clock Pins - Allow distinguishable clock pins. - Simplify pins in scramble / ignore pins - Reduce valid FMC moves to 18 basic moves. # Scramble Accountability - Accuracy: - (A2d1) Scramble signatures - Scramble checkers? - Secrecy - (4b2a) Only the competition's designated Delegates may handle scrambles - (4b2++) Visually isolated scramble tables # References/Guides/Tutorials - FMC grading ("A printable sheet on how to judge FMC (full list of allowed moves, how to count moves, examples of allowed and disallowed notation).") - Scrambling - Orientation - Notation - Multi-Blind - Judging - Inspection - End of Attempt - Misalignments - BLD - Hardware - Examples of common acceptable and unacceptable puzzles. - Currently known Bluetooth puzzles and how to spot/check them. (Possibly private to Delegates.) - Advancement conditions # Fresh Events Ideas: - Tiers - Ephemeral Events - Encourage Unofficial Events - As an alternative to temporary official events/rounds? # Hardware - Cube Covers - O-Rings (or equivalent) at every competition - Bluetooth Cubes # Processes - Ongoing - Manage the process of incoming emails. - Manage the process of making decisions. - Manage the process of following up decisions with documentation changes. - Manage the process of updating Delegates on important decisions. - Manage feedback from the community. 
- Backlog - Read all 2019 Delegate reports - Add backlog of incidents to Incidents log - [`wca-regulations` issues](https://github.com/thewca/wca-regulations/issues) # Code of Conduct - Look at other examples and help draft # Education/Outreach/Inclusion - Allow/encourage competitions to reserve a certain fraction of spots for new competitors. - How to update the Regulations competitions more welcome to newcomers.
process
wrc projects mid here s a rough list of what i want to tackle with new wrc members in changes that reduce what a competitor judge delegate needs to know without significantly impacting the nature of the sport solved or not no for misalignment disallow missing misplaced pieces in the solved state clock pins allow distinguishable clock pins simplify pins in scramble ignore pins reduce valid fmc moves to basic moves scramble accountability accuracy scramble signatures scramble checkers secrecy only the competition s designated delegates may handle scrambles visually isolated scramble tables references guides tutorials fmc grading a printable sheet on how to judge fmc full list of allowed moves how to count moves examples of allowed and disallowed notation scrambling orientation notation multi blind judging inspection end of attempt misalignments bld hardware examples of common acceptable and unacceptable puzzles currently known bluetooth puzzles and how to spot check them possibly private to delegates advancement conditions fresh events ideas tiers ephemeral events encourage unofficial events as an alternative to temporary official events rounds hardware cube covers o rings or equivalent at every competition bluetooth cubes processes ongoing manage the process of incoming emails manage the process of making decisions manage the process of following up decisions with documentation changes manage the process of updating delegates on important decisions manage feedback from the community backlog read all delegate reports add backlog of incidents to incidents log code of conduct look at other examples and help draft education outreach inclusion allow encourage competitions to reserve a certain fraction of spots for new competitors how to update the regulations competitions more welcome to newcomers
1
306,416
23,159,597,567
IssuesEvent
2022-07-29 16:12:53
alibaba/ilogtail
https://api.github.com/repos/alibaba/ilogtail
opened
[DOC]: Add manual page for metric_http
documentation
<!-- Please describe any errors, ambiguities, or other improvement opportunities that you can find in the documentation. -->
1.0
[DOC]: Add manual page for metric_http - <!-- Please describe any errors, ambiguities, or other improvement opportunities that you can find in the documentation. -->
non_process
add manual page for metric http please describe any errors ambiguities or other improvement opportunities that you can find in the documentation
0
2,504
5,277,587,390
IssuesEvent
2017-02-07 03:56:00
rubberduck-vba/Rubberduck
https://api.github.com/repos/rubberduck-vba/Rubberduck
closed
an underscore is not always a continuation line
antlr bug parse-tree-preprocessing
````vb Sub testRD2() Dim i As Long 10 For i = 1 To 10 20 Debug.Print i ' comment_ 30 Next i ' putting an underscore on the end of comment gets: ' Consider renaming line label '30' VBAProject.Module1 End Sub ```` same happens if a variable name ends in an underscore and is last on the line and the next line has a line number .
1.0
an underscore is not always a continuation line - ````vb Sub testRD2() Dim i As Long 10 For i = 1 To 10 20 Debug.Print i ' comment_ 30 Next i ' putting an underscore on the end of comment gets: ' Consider renaming line label '30' VBAProject.Module1 End Sub ```` same happens if a variable name ends in an underscore and is last on the line and the next line has a line number .
process
an underscore is not always a continuation line vb sub dim i as long for i to debug print i comment next i putting an underscore on the end of comment gets consider renaming line label vbaproject end sub same happens if a variable name ends in an underscore and is last on the line and the next line has a line number
1
21,218
28,299,926,119
IssuesEvent
2023-04-10 04:34:24
CATcher-org/WATcher
https://api.github.com/repos/CATcher-org/WATcher
closed
Issues/PRs not reloaded before data gets fetched when changing repository from header
bug aspect-Process aspect-UIX difficulty.Moderate category.Bug
**Describe the bug** When changing repository from the header, the original repository is as below: <img width="1311" alt="Screenshot 2023-03-20 at 12 45 04 PM" src="https://user-images.githubusercontent.com/25452497/226249290-77f22e25-1b43-4c05-9b1f-1e759b9c93c3.png"> But after changing to another repository from the header, <img width="430" alt="image" src="https://user-images.githubusercontent.com/25452497/226251652-d6c61f7d-5959-4185-8928-4c70871cd7c4.png"> <img width="801" alt="image" src="https://user-images.githubusercontent.com/25452497/226251702-d0a38b6c-efa3-497f-8389-a51f964ad433.png"> the issues and PRs do not reload immediately <img width="1391" alt="image" src="https://user-images.githubusercontent.com/25452497/226251737-bf2e1115-6bd8-4e82-8226-7317c2e06658.png"> the issues and PRs displayed will only get updated when the data with the new repository gets fetched from GitHub <img width="1451" alt="image" src="https://user-images.githubusercontent.com/25452497/226251813-472f2f9f-dfaf-4753-ada4-295b4801ba18.png"> **To Reproduce** 1. Go to 'AY2223S1-CS2103T-T09-4/tp' on WATcher 2. Click on 'Change Repository' on header 3. Input 'CATcher-org/WATcher' 4. See error **Expected behavior** The issues and PRs should get reloaded automatically instead of being displayed while fetching data. A loading screen should be displayed during the fetching. **Desktop (please complete the following information):** - OS: MacOS Monterey (12.5) - Browser Chrome 110 - Version 110.0.5481.177
1.0
Issues/PRs not reloaded before data gets fetched when changing repository from header - **Describe the bug** When changing repository from the header, the original repository is as below: <img width="1311" alt="Screenshot 2023-03-20 at 12 45 04 PM" src="https://user-images.githubusercontent.com/25452497/226249290-77f22e25-1b43-4c05-9b1f-1e759b9c93c3.png"> But after changing to another repository from the header, <img width="430" alt="image" src="https://user-images.githubusercontent.com/25452497/226251652-d6c61f7d-5959-4185-8928-4c70871cd7c4.png"> <img width="801" alt="image" src="https://user-images.githubusercontent.com/25452497/226251702-d0a38b6c-efa3-497f-8389-a51f964ad433.png"> the issues and PRs do not reload immediately <img width="1391" alt="image" src="https://user-images.githubusercontent.com/25452497/226251737-bf2e1115-6bd8-4e82-8226-7317c2e06658.png"> the issues and PRs displayed will only get updated when the data with the new repository gets fetched from GitHub <img width="1451" alt="image" src="https://user-images.githubusercontent.com/25452497/226251813-472f2f9f-dfaf-4753-ada4-295b4801ba18.png"> **To Reproduce** 1. Go to 'AY2223S1-CS2103T-T09-4/tp' on WATcher 2. Click on 'Change Repository' on header 3. Input 'CATcher-org/WATcher' 4. See error **Expected behavior** The issues and PRs should get reloaded automatically instead of being displayed while fetching data. A loading screen should be displayed during the fetching. **Desktop (please complete the following information):** - OS: MacOS Monterey (12.5) - Browser Chrome 110 - Version 110.0.5481.177
process
issues prs not reloaded before data gets fetched when changing repository from header describe the bug when changing repository from the header the original repository is as below img width alt screenshot at pm src but after changing to another repository from the header img width alt image src img width alt image src the issues and prs do not reload immediately img width alt image src the issues and prs displayed will only get updated when the data with the new repository gets fetched from github img width alt image src to reproduce go to tp on watcher click on change repository on header input catcher org watcher see error expected behavior the issues and prs should get reloaded automatically instead of being displayed while fetching data a loading screen should be displayed during the fetching desktop please complete the following information os macos monterey browser chrome version
1
16,533
21,560,232,834
IssuesEvent
2022-05-01 03:31:58
ctmccull/CPP-528
https://api.github.com/repos/ctmccull/CPP-528
closed
Utilization of Kanban boards
Team process
[ ] Steps have task lists [ ] Each card assigned to a team member [ ] Update card status weekly
1.0
Utilization of Kanban boards - [ ] Steps have task lists [ ] Each card assigned to a team member [ ] Update card status weekly
process
utilization of kanban boards steps have task lists each card assigned to a team member update card status weekly
1
335,213
30,018,244,840
IssuesEvent
2023-06-26 20:33:25
microsoft/vscode-python-debugger
https://api.github.com/repos/microsoft/vscode-python-debugger
opened
TPI: Debugging Python, Test automatic configuration
testplan-item
Refs: https://github.com/microsoft/vscode-python/issues/19503 - [ ] macOS - [ ] linux - [ ] windows Complexity: 3 Author: @paulacamargo25 --- Prerequisites: - Install python, version >= 3.7 - Install the [`debugpy`](https://marketplace.visualstudio.com/items?itemName=ms-python.debugpy&ssr=false) extension. Automatically detect a Flask application and run it with the correct Debug Configuration. ### Steps Part 1: Debugging python file 1. Create a python file with a simple code. 2. Head over to the Run And Debug tab, and click on Show all automatic debug configurations. <img width="390" alt="Screen Shot 2022-07-25 at 2 09 44 PM" src="https://user-images.githubusercontent.com/17892325/180874578-32d868f9-b93f-4460-8643-e1b18b3147e9.png"> 3. A window will open with a list of options, choose `Debugpy`. 4. You should now see a list of debug options, and there should be the Python File option. ### Verification 1. Make sure that the application has been executed correctly, you can put some breakpoints, to test that the debugging works. 2. . If you repeat the steps and instead of clicking the option, you click the wheel, it should open the launch.json file with the configuration prefilled. Make sure this is correct and can be debugged. 3. Another form to show the automatic configuration is typing 'debug ' (with a space) in Quick open (⌘P) or by triggering the Debug: Select and Start Debugging command. Test that the recognition works here too. Part 2: Try other automatic configuration 1. There are automatic configurations implemented for Django, FastApi and Flask, if you have any of these projects you can try that they also work with them. Because this functionality has already been tested in the Python extension, you don't need to test each one. The idea of this tpi is to make sure that it also works in the `debugpy` extension. Trying one of them is enough.
1.0
TPI: Debugging Python, Test automatic configuration - Refs: https://github.com/microsoft/vscode-python/issues/19503 - [ ] macOS - [ ] linux - [ ] windows Complexity: 3 Author: @paulacamargo25 --- Prerequisites: - Install python, version >= 3.7 - Install the [`debugpy`](https://marketplace.visualstudio.com/items?itemName=ms-python.debugpy&ssr=false) extension. Automatically detect a Flask application and run it with the correct Debug Configuration. ### Steps Part 1: Debugging python file 1. Create a python file with a simple code. 2. Head over to the Run And Debug tab, and click on Show all automatic debug configurations. <img width="390" alt="Screen Shot 2022-07-25 at 2 09 44 PM" src="https://user-images.githubusercontent.com/17892325/180874578-32d868f9-b93f-4460-8643-e1b18b3147e9.png"> 3. A window will open with a list of options, choose `Debugpy`. 4. You should now see a list of debug options, and there should be the Python File option. ### Verification 1. Make sure that the application has been executed correctly, you can put some breakpoints, to test that the debugging works. 2. . If you repeat the steps and instead of clicking the option, you click the wheel, it should open the launch.json file with the configuration prefilled. Make sure this is correct and can be debugged. 3. Another form to show the automatic configuration is typing 'debug ' (with a space) in Quick open (⌘P) or by triggering the Debug: Select and Start Debugging command. Test that the recognition works here too. Part 2: Try other automatic configuration 1. There are automatic configurations implemented for Django, FastApi and Flask, if you have any of these projects you can try that they also work with them. Because this functionality has already been tested in the Python extension, you don't need to test each one. The idea of this tpi is to make sure that it also works in the `debugpy` extension. Trying one of them is enough.
non_process
tpi debugging python test automatic configuration refs macos linux windows complexity author prerequisites install python version install the extension automatically detect a flask application and run it with the correct debug configuration steps part debugging python file create a python file with a simple code head over to the run and debug tab and click on show all automatic debug configurations img width alt screen shot at pm src a window will open with a list of options choose debugpy you should now see a list of debug options and there should be the python file option verification make sure that the application has been executed correctly you can put some breakpoints to test that the debugging works if you repeat the steps and instead of clicking the option you click the wheel it should open the launch json file with the configuration prefilled make sure this is correct and can be debugged another form to show the automatic configuration is typing debug with a space in quick open ⌘p or by triggering the debug select and start debugging command test that the recognition works here too part try other automatic configuration there are automatic configurations implemented for django fastapi and flask if you have any of these projects you can try that they also work with them because this functionality has already been tested in the python extension you don t need to test each one the idea of this tpi is to make sure that it also works in the debugpy extension trying one of them is enough
0
6,313
9,312,961,574
IssuesEvent
2019-03-26 03:36:52
theamrzaki/text_summurization_abstractive_methods
https://api.github.com/repos/theamrzaki/text_summurization_abstractive_methods
closed
asking for instructions !!!
Data Processing
Hi Amrzaki! I read your blogs on Medium, they are very good. I am new to text summarization and was wondering how to run the pointer-generator model with coverage on new data, I mean how to use it to summarize new articles? your help is appreciated.
1.0
asking for instructions !!! - Hi Amrzaki! I read your blogs on Medium, they are very good. I am new to text summarization and was wondering how to run the pointer-generator model with coverage on new data, I mean how to use it to summarize new articles? your help is appreciated.
process
asking for instructions hi amrzaki i read your blogs on medium they are very good i am new to text summarization and was wondering how to run the pointer generator model with coverage on new data i mean how to use it to summarize new articles your help is appreciated
1
627,914
19,956,681,402
IssuesEvent
2022-01-28 00:40:03
microsoft/terminal
https://api.github.com/repos/microsoft/terminal
closed
Settings UI: bad spacing between pivot header and its content
Help Wanted Issue-Bug Product-Terminal In-PR Priority-3 Area-Settings UI
<!-- 🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨 I ACKNOWLEDGE THE FOLLOWING BEFORE PROCEEDING: 1. If I delete this entire template and go my own path, the core team may close my issue without further explanation or engagement. 2. If I list multiple bugs/concerns in this one issue, the core team may close my issue without further explanation or engagement. 3. If I write an issue that has many duplicates, the core team may close my issue without further explanation or engagement (and without necessarily spending time to find the exact duplicate ID number). 4. If I leave the title incomplete when filing the issue, the core team may close my issue without further explanation or engagement. 5. If I file something completely blank in the body, the core team may close my issue without further explanation or engagement. All good? Then proceed! --> # Description of the new feature/enhancement There should be more space between the pivot tab header and its content. I'm am not sure if it is against Fluent design guidelines but IMHO the second version looks significantly better. This minor UI tweak should take less than 2 minutes to apply. **Before:** ![before](https://user-images.githubusercontent.com/26736457/116431630-41cd0900-a848-11eb-9dc5-dffc613da147.png) **After:** ![after](https://user-images.githubusercontent.com/26736457/116431650-44c7f980-a848-11eb-9407-aa610fc49ada.png) # Proposed technical implementation details (optional) `<PivotItem x:Uid="Profile_General" Margin="12,12,12,0">` or something like this. Do it for every `PivotItem` in `ProfilesPivot` File: https://github.com/microsoft/terminal/blob/main/src/cascadia/TerminalSettingsEditor/Profiles.xaml
1.0
Settings UI: bad spacing between pivot header and its content - <!-- 🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨 I ACKNOWLEDGE THE FOLLOWING BEFORE PROCEEDING: 1. If I delete this entire template and go my own path, the core team may close my issue without further explanation or engagement. 2. If I list multiple bugs/concerns in this one issue, the core team may close my issue without further explanation or engagement. 3. If I write an issue that has many duplicates, the core team may close my issue without further explanation or engagement (and without necessarily spending time to find the exact duplicate ID number). 4. If I leave the title incomplete when filing the issue, the core team may close my issue without further explanation or engagement. 5. If I file something completely blank in the body, the core team may close my issue without further explanation or engagement. All good? Then proceed! --> # Description of the new feature/enhancement There should be more space between the pivot tab header and its content. I'm am not sure if it is against Fluent design guidelines but IMHO the second version looks significantly better. This minor UI tweak should take less than 2 minutes to apply. **Before:** ![before](https://user-images.githubusercontent.com/26736457/116431630-41cd0900-a848-11eb-9dc5-dffc613da147.png) **After:** ![after](https://user-images.githubusercontent.com/26736457/116431650-44c7f980-a848-11eb-9407-aa610fc49ada.png) # Proposed technical implementation details (optional) `<PivotItem x:Uid="Profile_General" Margin="12,12,12,0">` or something like this. Do it for every `PivotItem` in `ProfilesPivot` File: https://github.com/microsoft/terminal/blob/main/src/cascadia/TerminalSettingsEditor/Profiles.xaml
non_process
settings ui bad spacing between pivot header and its content 🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨 i acknowledge the following before proceeding if i delete this entire template and go my own path the core team may close my issue without further explanation or engagement if i list multiple bugs concerns in this one issue the core team may close my issue without further explanation or engagement if i write an issue that has many duplicates the core team may close my issue without further explanation or engagement and without necessarily spending time to find the exact duplicate id number if i leave the title incomplete when filing the issue the core team may close my issue without further explanation or engagement if i file something completely blank in the body the core team may close my issue without further explanation or engagement all good then proceed description of the new feature enhancement there should be more space between the pivot tab header and its content i m am not sure if it is against fluent design guidelines but imho the second version looks significantly better this minor ui tweak should take less than minutes to apply before after proposed technical implementation details optional or something like this do it for every pivotitem in profilespivot file
0
8,020
11,206,983,520
IssuesEvent
2020-01-06 01:19:57
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
r.series broken in Processing due to wrong "range=0,0" pre-definition
Bug High Priority Processing Regression
Author Name: **Markus Neteler** (Markus Neteler) Original Redmine Issue: [21452](https://issues.qgis.org/issues/21452) Affected QGIS version: 3.7(master) Redmine category:processing/grass --- In QGIS 3.4, 3.6 r.series is broken in Processing due to wrong "range=0,0" pre-definition. It should be empty but is pre-populated with 0,0. That leads to an *empty result* (basically a NO DATA raster map): [...] g.proj -c proj4="+proj=lcc +lat_1=36.16666666666666 +lat_2=34.33333333333334 +lat_0=33.75 +lon_0=-79 +x_0=609601.22 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs" r.external input="/home/mneteler/lsat7_2000.tif" band=1 output="rast_5c7bdac0189302" --overwrite -o g.region n=228513.0 s=214975.5 e=645012.0 w=629992.5 res=28.5 r.series input=rast_5c7bdac0189302 method="average" range="0,0" output=output28566faed0e94fdaa03a9cfb57fb94a9 --overwrite --> here range="0,0" must not be there unless the user enters values in the Processing form under "Advanced". *How to reproduce:* - test dataset (multilayer Landsat-7): lsat7_2000.tif attached (2MB) - open dataset - open r.series dialog from Processing - select lsat7_2000.tif - click "run" (default is "average" method which is fine to demonstrate the problem) I tried to fix that but probably QgsProcessingParameterRange() is somehow broken or wrongly used in Processing? It is likely that the suboptimal use of QgsProcessingParameterRange() affects also other GRASS GIS modules present in Processing. --- - [lsat7_2000.tif](https://issues.qgis.org/attachments/download/14506/lsat7_2000.tif) (Markus Neteler) --- Related issue(s): #29374 (relates) Redmine related issue(s): [21558](https://issues.qgis.org/issues/21558) ---
1.0
r.series broken in Processing due to wrong "range=0,0" pre-definition - Author Name: **Markus Neteler** (Markus Neteler) Original Redmine Issue: [21452](https://issues.qgis.org/issues/21452) Affected QGIS version: 3.7(master) Redmine category:processing/grass --- In QGIS 3.4, 3.6 r.series is broken in Processing due to wrong "range=0,0" pre-definition. It should be empty but is pre-populated with 0,0. That leads to an *empty result* (basically a NO DATA raster map): [...] g.proj -c proj4="+proj=lcc +lat_1=36.16666666666666 +lat_2=34.33333333333334 +lat_0=33.75 +lon_0=-79 +x_0=609601.22 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs" r.external input="/home/mneteler/lsat7_2000.tif" band=1 output="rast_5c7bdac0189302" --overwrite -o g.region n=228513.0 s=214975.5 e=645012.0 w=629992.5 res=28.5 r.series input=rast_5c7bdac0189302 method="average" range="0,0" output=output28566faed0e94fdaa03a9cfb57fb94a9 --overwrite --> here range="0,0" must not be there unless the user enters values in the Processing form under "Advanced". *How to reproduce:* - test dataset (multilayer Landsat-7): lsat7_2000.tif attached (2MB) - open dataset - open r.series dialog from Processing - select lsat7_2000.tif - click "run" (default is "average" method which is fine to demonstrate the problem) I tried to fix that but probably QgsProcessingParameterRange() is somehow broken or wrongly used in Processing? It is likely that the suboptimal use of QgsProcessingParameterRange() affects also other GRASS GIS modules present in Processing. --- - [lsat7_2000.tif](https://issues.qgis.org/attachments/download/14506/lsat7_2000.tif) (Markus Neteler) --- Related issue(s): #29374 (relates) Redmine related issue(s): [21558](https://issues.qgis.org/issues/21558) ---
process
r series broken in processing due to wrong range pre definition author name markus neteler markus neteler original redmine issue affected qgis version master redmine category processing grass in qgis r series is broken in processing due to wrong range pre definition it should be empty but is pre populated with that leads to an empty result basically a no data raster map g proj c proj lcc lat lat lat lon x y ellps units m no defs r external input home mneteler tif band output rast overwrite o g region n s e w res r series input rast method average range output overwrite here range must not be there unless the user enters values in the processing form under advanced how to reproduce test dataset multilayer landsat tif attached open dataset open r series dialog from processing select tif click run default is average method which is fine to demonstrate the problem i tried to fix that but probably qgsprocessingparameterrange is somehow broken or wrongly used in processing it is likely that the suboptimal use of qgsprocessingparameterrange affects also other grass gis modules present in processing markus neteler related issue s relates redmine related issue s
1
10,202
13,066,528,356
IssuesEvent
2020-07-30 21:54:07
googleapis/python-bigquery
https://api.github.com/repos/googleapis/python-bigquery
closed
Testing: reduce / remove warning spew
api: bigquery testing type: process
```python $ nox -re unit-3.6 nox > Running session unit-3.6 nox > Re-using existing virtual environment at .nox/unit-3-6. nox > pip install mock pytest google-cloud-testutils pytest-cov freezegun nox > pip install grpcio nox > pip install -e .[all,fastparquet] nox > pip install ipython nox > py.test --quiet --cov=google.cloud.bigquery --cov=tests.unit --cov-append --cov-config=.coveragerc --cov-report= --cov-fail-under=0 tests/unit ........................................................................ [ 5%] ........................................................................ [ 10%] ........................................................................ [ 15%] ........................................................................ [ 20%] ........................................................................ [ 25%] ........................................................................ [ 30%] ..ss.....s.............................................................. [ 35%] ........................................................................ [ 40%] ........................................................................ [ 45%] ........................................................................ [ 50%] ........................................................................ [ 55%] ........................................................................ [ 60%] ........................................................................ [ 65%] ........................................................................ [ 70%] ........................................................................ [ 75%] ........................................................................ [ 80%] ........................................................................ [ 85%] ........................................................................ [ 90%] ........................................................................ 
[ 95%] ..................................................................... [100%] =============================== warnings summary =============================== tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[RECORD] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[record] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[STRUCT] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[struct] /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test__pandas_helpers.py:302: FutureWarning: num_children is deprecated, use num_fields assert actual.num_children == len(fields) tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[RECORD] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[record] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[STRUCT] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[struct] /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test__pandas_helpers.py:347: FutureWarning: num_children is deprecated, use num_fields assert actual.value_type.num_children == len(fields) tests/unit/test__pandas_helpers.py::test_bq_to_arrow_schema_w_unknown_type /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/_pandas_helpers.py:195: UserWarning: Unable to determine type for field 'field3'. 
warnings.warn("Unable to determine type for field '{}'.".format(bq_field.name)) tests/unit/test_client.py::TestClient::test__call_api_applying_custom_retry_on_timeout tests/unit/test_client.py::TestClient::test_create_bqstorage_client_missing_dependency tests/unit/test_magics.py::test_bigquery_magic_without_optional_arguments /home/tseaver/projects/agendaless/Google/src/python-bigquery/.nox/unit-3-6/lib/python3.6/site-packages/google/auth/_default.py:69: UserWarning: Your application has authenticated using end user credentials from Google Cloud SDK without a quota project. You might receive a "quota exceeded" or "API not enabled" error. We recommend you rerun `gcloud auth application-default login` and make sure a quota project is added. Or you can use service accounts instead. For more information about service accounts, see https://cloud.google.com/docs/authentication/ warnings.warn(_CLOUD_SDK_CREDENTIALS_WARNING) tests/unit/test_client.py::TestClientUpload::test_load_table_from_dataframe_wo_pyarrow_custom_compression /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_client.py:7477: PyarrowMissingWarning: Loading dataframe data without pyarrow installed is deprecated and will become unsupported in the future. Please install the pyarrow package. parquet_compression="gzip", tests/unit/test_client.py::TestClientUpload::test_load_table_from_dataframe_wo_pyarrow_custom_compression /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_client.py:7477: PendingDeprecationWarning: job_config.schema is set, but not used to assist in identifying correct types for data serialization. Please install the pyarrow package. 
parquet_compression="gzip", tests/unit/test_job.py::TestLoadJobConfig::test_time_partitioning_hit tests/unit/test_job.py::TestLoadJobConfig::test_time_partitioning_setter /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/table.py:2034: PendingDeprecationWarning: TimePartitioning.require_partition_filter will be removed in future versions. Please use Table.require_partition_filter instead. self.require_partition_filter = require_partition_filter tests/unit/test_job.py::TestQueryJob::test_to_dataframe_column_date_dtypes_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/job.py:3383: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. date_as_object=date_as_object, tests/unit/test_table.py::TestRowIterator::test_to_arrow_w_unknown_type /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/_pandas_helpers.py:195: UserWarning: Unable to determine type for field 'sport'. warnings.warn("Unable to determine type for field '{}'.".format(bq_field.name)) tests/unit/test_table.py::TestRowIterator::test_to_dataframe_concat_categorical_dtype_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:3500: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. 
categories=["low", "medium", "high"], ordered=False, tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar_wo_pyarrow tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar_wo_pyarrow tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:2373: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. df = row_iterator.to_dataframe(progress_bar_type=progress_bar_type) tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_bqstorage_v1beta1_no_streams /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/_pandas_helpers.py:645: DeprecationWarning: Support for BigQuery Storage v1beta1 clients is deprecated, please consider upgrading the client to BigQuery Storage v1 stable version. category=DeprecationWarning, tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_empty_results_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:2502: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. df = row_iterator.to_dataframe() tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_no_results_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:2525: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. df = row_iterator.to_dataframe() -- Docs: https://docs.pytest.org/en/latest/warnings.html 1434 passed, 3 skipped, 25 warnings in 37.53s ```
1.0
Testing: reduce / remove warning spew - ```python $ nox -re unit-3.6 nox > Running session unit-3.6 nox > Re-using existing virtual environment at .nox/unit-3-6. nox > pip install mock pytest google-cloud-testutils pytest-cov freezegun nox > pip install grpcio nox > pip install -e .[all,fastparquet] nox > pip install ipython nox > py.test --quiet --cov=google.cloud.bigquery --cov=tests.unit --cov-append --cov-config=.coveragerc --cov-report= --cov-fail-under=0 tests/unit ........................................................................ [ 5%] ........................................................................ [ 10%] ........................................................................ [ 15%] ........................................................................ [ 20%] ........................................................................ [ 25%] ........................................................................ [ 30%] ..ss.....s.............................................................. [ 35%] ........................................................................ [ 40%] ........................................................................ [ 45%] ........................................................................ [ 50%] ........................................................................ [ 55%] ........................................................................ [ 60%] ........................................................................ [ 65%] ........................................................................ [ 70%] ........................................................................ [ 75%] ........................................................................ [ 80%] ........................................................................ [ 85%] ........................................................................ [ 90%] ........................................................................ 
[ 95%] ..................................................................... [100%] =============================== warnings summary =============================== tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[RECORD] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[record] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[STRUCT] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_struct[struct] /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test__pandas_helpers.py:302: FutureWarning: num_children is deprecated, use num_fields assert actual.num_children == len(fields) tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[RECORD] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[record] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[STRUCT] tests/unit/test__pandas_helpers.py::test_bq_to_arrow_data_type_w_array_struct[struct] /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test__pandas_helpers.py:347: FutureWarning: num_children is deprecated, use num_fields assert actual.value_type.num_children == len(fields) tests/unit/test__pandas_helpers.py::test_bq_to_arrow_schema_w_unknown_type /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/_pandas_helpers.py:195: UserWarning: Unable to determine type for field 'field3'. 
warnings.warn("Unable to determine type for field '{}'.".format(bq_field.name)) tests/unit/test_client.py::TestClient::test__call_api_applying_custom_retry_on_timeout tests/unit/test_client.py::TestClient::test_create_bqstorage_client_missing_dependency tests/unit/test_magics.py::test_bigquery_magic_without_optional_arguments /home/tseaver/projects/agendaless/Google/src/python-bigquery/.nox/unit-3-6/lib/python3.6/site-packages/google/auth/_default.py:69: UserWarning: Your application has authenticated using end user credentials from Google Cloud SDK without a quota project. You might receive a "quota exceeded" or "API not enabled" error. We recommend you rerun `gcloud auth application-default login` and make sure a quota project is added. Or you can use service accounts instead. For more information about service accounts, see https://cloud.google.com/docs/authentication/ warnings.warn(_CLOUD_SDK_CREDENTIALS_WARNING) tests/unit/test_client.py::TestClientUpload::test_load_table_from_dataframe_wo_pyarrow_custom_compression /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_client.py:7477: PyarrowMissingWarning: Loading dataframe data without pyarrow installed is deprecated and will become unsupported in the future. Please install the pyarrow package. parquet_compression="gzip", tests/unit/test_client.py::TestClientUpload::test_load_table_from_dataframe_wo_pyarrow_custom_compression /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_client.py:7477: PendingDeprecationWarning: job_config.schema is set, but not used to assist in identifying correct types for data serialization. Please install the pyarrow package. 
parquet_compression="gzip", tests/unit/test_job.py::TestLoadJobConfig::test_time_partitioning_hit tests/unit/test_job.py::TestLoadJobConfig::test_time_partitioning_setter /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/table.py:2034: PendingDeprecationWarning: TimePartitioning.require_partition_filter will be removed in future versions. Please use Table.require_partition_filter instead. self.require_partition_filter = require_partition_filter tests/unit/test_job.py::TestQueryJob::test_to_dataframe_column_date_dtypes_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/job.py:3383: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. date_as_object=date_as_object, tests/unit/test_table.py::TestRowIterator::test_to_arrow_w_unknown_type /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/_pandas_helpers.py:195: UserWarning: Unable to determine type for field 'sport'. warnings.warn("Unable to determine type for field '{}'.".format(bq_field.name)) tests/unit/test_table.py::TestRowIterator::test_to_dataframe_concat_categorical_dtype_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:3500: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. 
categories=["low", "medium", "high"], ordered=False, tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar_wo_pyarrow tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar_wo_pyarrow tests/unit/test_table.py::TestRowIterator::test_to_dataframe_progress_bar_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:2373: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. df = row_iterator.to_dataframe(progress_bar_type=progress_bar_type) tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_bqstorage_v1beta1_no_streams /home/tseaver/projects/agendaless/Google/src/python-bigquery/google/cloud/bigquery/_pandas_helpers.py:645: DeprecationWarning: Support for BigQuery Storage v1beta1 clients is deprecated, please consider upgrading the client to BigQuery Storage v1 stable version. category=DeprecationWarning, tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_empty_results_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:2502: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. df = row_iterator.to_dataframe() tests/unit/test_table.py::TestRowIterator::test_to_dataframe_w_no_results_wo_pyarrow /home/tseaver/projects/agendaless/Google/src/python-bigquery/tests/unit/test_table.py:2525: PyarrowMissingWarning: Converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future. Please install the pyarrow package. df = row_iterator.to_dataframe() -- Docs: https://docs.pytest.org/en/latest/warnings.html 1434 passed, 3 skipped, 25 warnings in 37.53s ```
process
testing reduce remove warning spew python nox re unit nox running session unit nox re using existing virtual environment at nox unit nox pip install mock pytest google cloud testutils pytest cov freezegun nox pip install grpcio nox pip install e nox pip install ipython nox py test quiet cov google cloud bigquery cov tests unit cov append cov config coveragerc cov report cov fail under tests unit ss s warnings summary tests unit test pandas helpers py test bq to arrow data type w struct tests unit test pandas helpers py test bq to arrow data type w struct tests unit test pandas helpers py test bq to arrow data type w struct tests unit test pandas helpers py test bq to arrow data type w struct home tseaver projects agendaless google src python bigquery tests unit test pandas helpers py futurewarning num children is deprecated use num fields assert actual num children len fields tests unit test pandas helpers py test bq to arrow data type w array struct tests unit test pandas helpers py test bq to arrow data type w array struct tests unit test pandas helpers py test bq to arrow data type w array struct tests unit test pandas helpers py test bq to arrow data type w array struct home tseaver projects agendaless google src python bigquery tests unit test pandas helpers py futurewarning num children is deprecated use num fields assert actual value type num children len fields tests unit test pandas helpers py test bq to arrow schema w unknown type home tseaver projects agendaless google src python bigquery google cloud bigquery pandas helpers py userwarning unable to determine type for field warnings warn unable to determine type for field format bq field name tests unit test client py testclient test call api applying custom retry on timeout tests unit test client py testclient test create bqstorage client missing dependency tests unit test magics py test bigquery magic without optional arguments home tseaver projects agendaless google src python bigquery nox unit lib 
site packages google auth default py userwarning your application has authenticated using end user credentials from google cloud sdk without a quota project you might receive a quota exceeded or api not enabled error we recommend you rerun gcloud auth application default login and make sure a quota project is added or you can use service accounts instead for more information about service accounts see warnings warn cloud sdk credentials warning tests unit test client py testclientupload test load table from dataframe wo pyarrow custom compression home tseaver projects agendaless google src python bigquery tests unit test client py pyarrowmissingwarning loading dataframe data without pyarrow installed is deprecated and will become unsupported in the future please install the pyarrow package parquet compression gzip tests unit test client py testclientupload test load table from dataframe wo pyarrow custom compression home tseaver projects agendaless google src python bigquery tests unit test client py pendingdeprecationwarning job config schema is set but not used to assist in identifying correct types for data serialization please install the pyarrow package parquet compression gzip tests unit test job py testloadjobconfig test time partitioning hit tests unit test job py testloadjobconfig test time partitioning setter home tseaver projects agendaless google src python bigquery google cloud bigquery table py pendingdeprecationwarning timepartitioning require partition filter will be removed in future versions please use table require partition filter instead self require partition filter require partition filter tests unit test job py testqueryjob test to dataframe column date dtypes wo pyarrow home tseaver projects agendaless google src python bigquery google cloud bigquery job py pyarrowmissingwarning converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future please install the pyarrow package date as object date 
as object tests unit test table py testrowiterator test to arrow w unknown type home tseaver projects agendaless google src python bigquery google cloud bigquery pandas helpers py userwarning unable to determine type for field sport warnings warn unable to determine type for field format bq field name tests unit test table py testrowiterator test to dataframe concat categorical dtype wo pyarrow home tseaver projects agendaless google src python bigquery tests unit test table py pyarrowmissingwarning converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future please install the pyarrow package categories ordered false tests unit test table py testrowiterator test to dataframe progress bar wo pyarrow tests unit test table py testrowiterator test to dataframe progress bar wo pyarrow tests unit test table py testrowiterator test to dataframe progress bar wo pyarrow home tseaver projects agendaless google src python bigquery tests unit test table py pyarrowmissingwarning converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future please install the pyarrow package df row iterator to dataframe progress bar type progress bar type tests unit test table py testrowiterator test to dataframe w bqstorage no streams home tseaver projects agendaless google src python bigquery google cloud bigquery pandas helpers py deprecationwarning support for bigquery storage clients is deprecated please consider upgrading the client to bigquery storage stable version category deprecationwarning tests unit test table py testrowiterator test to dataframe w empty results wo pyarrow home tseaver projects agendaless google src python bigquery tests unit test table py pyarrowmissingwarning converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future please install the pyarrow package df row iterator to dataframe tests unit test table py 
testrowiterator test to dataframe w no results wo pyarrow home tseaver projects agendaless google src python bigquery tests unit test table py pyarrowmissingwarning converting to a dataframe without pyarrow installed is often slower and will become unsupported in the future please install the pyarrow package df row iterator to dataframe docs passed skipped warnings in
1
53,519
3,040,717,694
IssuesEvent
2015-08-07 16:56:40
scamille/simc_issue_test3
https://api.github.com/repos/scamille/simc_issue_test3
closed
Some minor things in sc_warlock.cpp
bug imported Priority-Medium
_From [niarbeht@gmx.de](https://code.google.com/u/niarbeht@gmx.de/) on March 09, 2009 19:38:02_ In warlock_spell_t::target_debuff: target_multiplier *= 1.01 + p -> talents.master_conjuror * 0.01; Master Conjuror is a 150/300% increase in 3.1. The multiplier is 1.02 with 1/2 (rounded down) and 1.04 with 2/2. And I wonder why you aren't doing the sim -> patch.before( 3, 1, 0 ) thing here. The talent doesn't modify the multiplier in 3.0.9 I think. In struct immolate_t: direct_power_mod += (1.0/3.0) * p -> talents.fire_and_brimstone * 0.01; tick_power_mod += (2.0/3.0) * p -> talents.fire_and_brimstone * 0.02 / num_ticks; Looks like you are doing the 1/3-2/3 splitting twice. Fire&Brimstone increases immolate's damage by 3% of your spell power per point. In struct searing_pain_t: base_crit += p -> talents.improved_searing_pain * 0.04; Improved Searing Pain increases the critchance by 4/7/10%. In warlock_t::init_base: attribute_multiplier_initial[ ATTR_STAMINA ] *= 1.0 + talents.demonic_embrace * 0.02; This is correct in 3.0.9, but Demonic Embrace is 4/7/10% in 3.1. _Original issue: http://code.google.com/p/simulationcraft/issues/detail?id=36_
1.0
Some minor things in sc_warlock.cpp - _From [niarbeht@gmx.de](https://code.google.com/u/niarbeht@gmx.de/) on March 09, 2009 19:38:02_ In warlock_spell_t::target_debuff: target_multiplier *= 1.01 + p -> talents.master_conjuror * 0.01; Master Conjuror is a 150/300% increase in 3.1. The multiplier is 1.02 with 1/2 (rounded down) and 1.04 with 2/2. And I wonder why you aren't doing the sim -> patch.before( 3, 1, 0 ) thing here. The talent doesn't modify the multiplier in 3.0.9 I think. In struct immolate_t: direct_power_mod += (1.0/3.0) * p -> talents.fire_and_brimstone * 0.01; tick_power_mod += (2.0/3.0) * p -> talents.fire_and_brimstone * 0.02 / num_ticks; Looks like you are doing the 1/3-2/3 splitting twice. Fire&Brimstone increases immolate's damage by 3% of your spell power per point. In struct searing_pain_t: base_crit += p -> talents.improved_searing_pain * 0.04; Improved Searing Pain increases the critchance by 4/7/10%. In warlock_t::init_base: attribute_multiplier_initial[ ATTR_STAMINA ] *= 1.0 + talents.demonic_embrace * 0.02; This is correct in 3.0.9, but Demonic Embrace is 4/7/10% in 3.1. _Original issue: http://code.google.com/p/simulationcraft/issues/detail?id=36_
non_process
some minor things in sc warlock cpp from on march in warlock spell t target debuff target multiplier p talents master conjuror master conjuror is a increase in the multiplier is with rounded down and with and i wonder why you aren t doing the sim patch before thing here the talent doesn t modify the multiplier in i think in struct immolate t direct power mod p talents fire and brimstone tick power mod p talents fire and brimstone num ticks looks like you are doing the splitting twice fire brimstone increases immolate s damage by of your spell power per point in struct searing pain t base crit p talents improved searing pain improved searing pain increases the critchance by in warlock t init base attribute multiplier initial talents demonic embrace this is correct in but demonic embrace is in original issue
0
105,595
11,455,022,490
IssuesEvent
2020-02-06 18:14:29
CarlaValverde/Tarjeta-
https://api.github.com/repos/CarlaValverde/Tarjeta-
closed
Implementación de una clase
documentation enhancement implementation
Implementación de la clase Tarjeta, esta clase contará con los atributos privados id, dniTitular, pin y saldo. Además, contará con un constructor y con los métodos de obtención y establecimiento necesarios (get y set).
1.0
Implementación de una clase - Implementación de la clase Tarjeta, esta clase contará con los atributos privados id, dniTitular, pin y saldo. Además, contará con un constructor y con los métodos de obtención y establecimiento necesarios (get y set).
non_process
implementación de una clase implementación de la clase tarjeta esta clase contará con los atributos privados id dnititular pin y saldo además contará con un constructor y con los métodos de obtención y establecimiento necesarios get y set
0
270,550
20,603,070,610
IssuesEvent
2022-03-06 15:16:55
SciML/NeuralOperators.jl
https://api.github.com/repos/SciML/NeuralOperators.jl
closed
Notebooks are broken
documentation question
I tried running the super resolution example notebook and it errors with: ```julia SystemError: opening file "/Desktop/dev/NeuralOperators.jl/example/SuperResolution/src/../model/model.jld2": No such file or directory ``` It appears that the ```model.jld2``` file is actually missing from the repo? This looks like a problem that occurs across all the notebooks.
1.0
Notebooks are broken - I tried running the super resolution example notebook and it errors with: ```julia SystemError: opening file "/Desktop/dev/NeuralOperators.jl/example/SuperResolution/src/../model/model.jld2": No such file or directory ``` It appears that the ```model.jld2``` file is actually missing from the repo? This looks like a problem that occurs across all the notebooks.
non_process
notebooks are broken i tried running the super resolution example notebook and it errors with julia systemerror opening file desktop dev neuraloperators jl example superresolution src model model no such file or directory it appears that the model file is actually missing from the repo this looks like a problem that occurs across all the notebooks
0
84,802
10,564,377,051
IssuesEvent
2019-10-05 01:19:49
ScoreWin/letsgoraiding
https://api.github.com/repos/ScoreWin/letsgoraiding
closed
"Cooldown" Fatigue
designer app feature
Abilities should be able to permanently increase their own cooldown. This will specifically be useful for balancing certain ultimate abilities which we want to be available early-on in the game.
1.0
"Cooldown" Fatigue - Abilities should be able to permanently increase their own cooldown. This will specifically be useful for balancing certain ultimate abilities which we want to be available early-on in the game.
non_process
cooldown fatigue abilities should be able to permanently increase their own cooldown this will specifically be useful for balancing certain ultimate abilities which we want to be available early on in the game
0
7,946
11,137,527,183
IssuesEvent
2019-12-20 19:36:16
openopps/openopps-platform
https://api.github.com/repos/openopps/openopps-platform
closed
Update What's next page to include content on updating application
Apply Process Approved Requirements Ready State Dept.
Who: Applicants What: View information on what happens next Why: In order to provide information about how to update and submit prior to closing Acceptance Criteria: Update the What's next page to include additional information related to updating the application, clicking submit, and all before 11:59 p.m. EST New text: Can I update my application? Yes, you can update your application by visiting your dashboard. You must make all updates before 11:59 p.m. EST on the closing date. Don't forget to click submit. Design page with updated text and placement: https://opm.invisionapp.com/share/ZEPNZR09Q54#/344480824_State_-_Confirmation_-_Open_-Desktop- Current Screen Shot: ![image.png](https://images.zenhubusercontent.com/59ee08f1a468affe6df7cd6f/2ac98591-3ce9-4b9a-9fa6-3bb876889e58)
1.0
Update What's next page to include content on updating application - Who: Applicants What: View information on what happens next Why: In order to provide information about how to update and submit prior to closing Acceptance Criteria: Update the What's next page to include additional information related to updating the application, clicking submit, and all before 11:59 p.m. EST New text: Can I update my application? Yes, you can update your application by visiting your dashboard. You must make all updates before 11:59 p.m. EST on the closing date. Don't forget to click submit. Design page with updated text and placement: https://opm.invisionapp.com/share/ZEPNZR09Q54#/344480824_State_-_Confirmation_-_Open_-Desktop- Current Screen Shot: ![image.png](https://images.zenhubusercontent.com/59ee08f1a468affe6df7cd6f/2ac98591-3ce9-4b9a-9fa6-3bb876889e58)
process
update what s next page to include content on updating application who applicants what view information on what happens next why in order to provide information about how to update and submit prior to closing acceptance criteria update the what s next page to include additional information related to updating the application clicking submit and all before p m est new text can i update my application yes you can update your application by visiting your dashboard you must make all updates before p m est on the closing date don t forget to click submit design page with updated text and placement current screen shot
1
58,396
7,136,223,990
IssuesEvent
2018-01-23 05:52:38
ruoklive/lottery_bug
https://api.github.com/repos/ruoklive/lottery_bug
closed
【后台】安全版块下的登录历史,条件搜索 所在地,需求是否存在?
UI-Design fixed
1、安全版块下的登录历史,条件搜索 中包括 所在地 输入搜索,但列表中并没有栏位显示 所在地,需确认搜索条件下拉框里 的 所在地是否有必要存在? ![46kwmay qq 8mp 0 pt_2g](https://user-images.githubusercontent.com/35476103/35204151-342f6e92-ff67-11e7-8bd1-86514be8e9db.jpg)
1.0
【后台】安全版块下的登录历史,条件搜索 所在地,需求是否存在? - 1、安全版块下的登录历史,条件搜索 中包括 所在地 输入搜索,但列表中并没有栏位显示 所在地,需确认搜索条件下拉框里 的 所在地是否有必要存在? ![46kwmay qq 8mp 0 pt_2g](https://user-images.githubusercontent.com/35476103/35204151-342f6e92-ff67-11e7-8bd1-86514be8e9db.jpg)
non_process
【后台】安全版块下的登录历史,条件搜索 所在地,需求是否存在? 、安全版块下的登录历史,条件搜索 中包括 所在地 输入搜索,但列表中并没有栏位显示 所在地,需确认搜索条件下拉框里 的 所在地是否有必要存在?
0
2,686
5,536,938,050
IssuesEvent
2017-03-21 20:51:33
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
Attach card ids to dataset query executions
Query Processor
For things like #609, and #2743 it would be very useful to be able to link a card to its executions.
1.0
Attach card ids to dataset query executions - For things like #609, and #2743 it would be very useful to be able to link a card to its executions.
process
attach card ids to dataset query executions for things like and it would be very useful to be able to link a card to its executions
1
136,816
5,289,019,774
IssuesEvent
2017-02-08 16:27:15
jeveloper/jayrock
https://api.github.com/repos/jeveloper/jayrock
closed
Have JsonRpcClient use .NET 4.0 ExpandoObject to dynamically generate API
auto-migrated Priority-Medium Type-Enhancement
``` Most JSON-RPC client implementations dynamically generate their API according to what is returned by system.listMethods. Therefore instead of this: Console.WriteLine(client.Invoke("sum", new JsonObject { { "a", 123 }, { "b", 456 } })); ... one simply does this: Console.WriteLine(client.sum(123, 456)); .NET now has the ExpandoObject facility to allow runtime generation of methods. Having JsonRpcClient make use of this would make life for user code much easier. Thanks, Niall ``` Original issue reported on code.google.com by `nialldouglas14` on 20 Jul 2011 at 3:09
1.0
Have JsonRpcClient use .NET 4.0 ExpandoObject to dynamically generate API - ``` Most JSON-RPC client implementations dynamically generate their API according to what is returned by system.listMethods. Therefore instead of this: Console.WriteLine(client.Invoke("sum", new JsonObject { { "a", 123 }, { "b", 456 } })); ... one simply does this: Console.WriteLine(client.sum(123, 456)); .NET now has the ExpandoObject facility to allow runtime generation of methods. Having JsonRpcClient make use of this would make life for user code much easier. Thanks, Niall ``` Original issue reported on code.google.com by `nialldouglas14` on 20 Jul 2011 at 3:09
non_process
have jsonrpcclient use net expandoobject to dynamically generate api most json rpc client implementations dynamically generate their api according to what is returned by system listmethods therefore instead of this console writeline client invoke sum new jsonobject a b one simply does this console writeline client sum net now has the expandoobject facility to allow runtime generation of methods having jsonrpcclient make use of this would make life for user code much easier thanks niall original issue reported on code google com by on jul at
0
178,440
21,509,400,489
IssuesEvent
2022-04-28 01:37:05
bsbtd/Teste
https://api.github.com/repos/bsbtd/Teste
closed
WS-2017-3740 (Medium) detected in elasticsearchv8.0.0-alpha1 - autoclosed
security vulnerability
## WS-2017-3740 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>elasticsearchv8.0.0-alpha1</b></p></summary> <p> <p>Free and Open, Distributed, RESTful Search Engine</p> <p>Library home page: <a href=https://github.com/elastic/elasticsearch.git>https://github.com/elastic/elasticsearch.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/bsbtd/Teste/commit/ca6b7844fdb48eb6ebef193223e1da056673528a">ca6b7844fdb48eb6ebef193223e1da056673528a</a></p> </p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/elasticsearch/x-pack/plugin/core/src/main/java/org/elasticsearch/xpack/core/security/user/XPackUser.java</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/elasticsearch/x-pack/plugin/core/src/main/java/org/elasticsearch/xpack/core/security/user/XPackUser.java</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Elasticsearch before v7.0.0-alpha1 has a weak default of giving internal XPack user write permissions on the repository files and images. 
<p>Publish Date: 2017-11-07 <p>URL: <a href=https://github.com/elastic/elasticsearch/commit/8e5855e62ebae5fe814367fa85f9bfdc3105213a>WS-2017-3740</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/elastic/elasticsearch/commit/8e5855e62ebae5fe814367fa85f9bfdc3105213a">https://github.com/elastic/elasticsearch/commit/8e5855e62ebae5fe814367fa85f9bfdc3105213a</a></p> <p>Release Date: 2019-10-24</p> <p>Fix Resolution: v7.0.0-alpha1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2017-3740 (Medium) detected in elasticsearchv8.0.0-alpha1 - autoclosed - ## WS-2017-3740 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>elasticsearchv8.0.0-alpha1</b></p></summary> <p> <p>Free and Open, Distributed, RESTful Search Engine</p> <p>Library home page: <a href=https://github.com/elastic/elasticsearch.git>https://github.com/elastic/elasticsearch.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/bsbtd/Teste/commit/ca6b7844fdb48eb6ebef193223e1da056673528a">ca6b7844fdb48eb6ebef193223e1da056673528a</a></p> </p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/elasticsearch/x-pack/plugin/core/src/main/java/org/elasticsearch/xpack/core/security/user/XPackUser.java</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/elasticsearch/x-pack/plugin/core/src/main/java/org/elasticsearch/xpack/core/security/user/XPackUser.java</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Elasticsearch before v7.0.0-alpha1 has a weak default of giving internal XPack user write permissions on the repository files and images. 
<p>Publish Date: 2017-11-07 <p>URL: <a href=https://github.com/elastic/elasticsearch/commit/8e5855e62ebae5fe814367fa85f9bfdc3105213a>WS-2017-3740</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/elastic/elasticsearch/commit/8e5855e62ebae5fe814367fa85f9bfdc3105213a">https://github.com/elastic/elasticsearch/commit/8e5855e62ebae5fe814367fa85f9bfdc3105213a</a></p> <p>Release Date: 2019-10-24</p> <p>Fix Resolution: v7.0.0-alpha1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
ws medium detected in autoclosed ws medium severity vulnerability vulnerable library free and open distributed restful search engine library home page a href found in head commit a href vulnerable source files elasticsearch x pack plugin core src main java org elasticsearch xpack core security user xpackuser java elasticsearch x pack plugin core src main java org elasticsearch xpack core security user xpackuser java vulnerability details elasticsearch before has a weak default of giving internal xpack user write permissions on the repository files and images publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
7,728
10,852,841,069
IssuesEvent
2019-11-13 13:37:44
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
obsolete precomposed defense response
multi-species process obsoletion
GO:1901244 positive regulation of transcription from RNA polymerase II promoter involved in defense response to fungus
1.0
obsolete precomposed defense response - GO:1901244 positive regulation of transcription from RNA polymerase II promoter involved in defense response to fungus
process
obsolete precomposed defense response go positive regulation of transcription from rna polymerase ii promoter involved in defense response to fungus
1
21,395
29,202,232,602
IssuesEvent
2023-05-21 00:37:48
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
[Hibrido / Belo Horizonte, Minas Gerais, Brazil] Backend Java Developer (Pleno - Híbrido em Belo Horizonte - MG) na Coodesh
SALVADOR BACK-END PJ JAVA MYSQL JAVASCRIPT PLENO PRIMEFACES JSF SPRING SQL GIT HIBERNATE MAVEN REST SOAP JSON ANGULAR REQUISITOS NGINX PROCESSOS INOVAÇÃO BACKEND GITHUB APACHE UMA C DOCUMENTAÇÃO WILDFLY HTTP MANUTENÇÃO HIBRIDO ALOCADO Stale
## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/backend-java-developer-pleno-hibrido-em-belo-horizonte-mg-144137266?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A<strong> Prime Results</strong> está em busca de <strong><ins>Backend Java Developer</ins></strong> para compor seu time!</p> <p>Acreditamos no poder de transformação social realizado pelas empresas. Acreditamos no poder transformador das pessoas, aliado à gestão e tecnologia. Compartilhamos nosso conhecimento para solucionar problemas complexos e gerar valor para nossos clientes.</p> <p><strong>Responsabilidades:</strong></p> <ul> <li>Desenvolvimento/implementação e manutenção de aplicações;</li> <li>Participar da análise e execução dos projetos e execução dos tickets;</li> <li>Definir as atividades necessárias para a realização de projetos, analisando os impactos em sistemas e processos através do entendimento da necessidade, conhecimento técnico e arquitetônico dos sistemas;</li> <li>Desenvolver códigos para atendimento às áreas e empresas clientes, proporcionando o esclarecimento de dúvidas relacionados ao projeto, contribuindo para uma melhor análise de impactos de processos e sistemas sob sua responsabilidade;&nbsp;</li> <li>Participar das atividades de planejamento para a liberação do produto para homologação e produção, por meio da validação de testes de aceite, assim como documentação de não conformidades avaliando e planejando a execução das correções reportadas;</li> <li>Participar da rotina de SQUADs.&nbsp;</li> </ul> ## Prime Results : <p>O Best Seller Simon Sinek, diz que a maioria das empresas sabem o que fazem, porém não sabem por que o fazem. Não é o nosso caso. 
A Prime Results é uma empresa especializada em gestão organizacional que usa seu potencial de transformação em empresas que geram impacto positivo na sociedade. Nossos clientes hoje, fazem a diferença na vida de mais de 250.000 brasileiros, nas áreas de proteção patrimonial, saúde e assistência 24 horas.&nbsp;</p> <p>Nosso objetivo central é criar um ambiente criativo, dinâmico e engajado, sempre aliados a métodos, processos inteligentes e muita inovação.</p><a href='https://coodesh.com/empresas/prime-results'>Veja mais no site</a> ## Habilidades: - Spring - Java - MySQL - Microsoft SQL Server - Angular - Apache - JSON - Hibernate ## Local: Belo Horizonte, Minas Gerais, Brazil ## Requisitos: - Experiência em Java: JSF, Spring, PrimeFaces, Hibernate, JasperReports; - Conhecimentos em modelagem e desenvolvimento de Bancos de Dados relacionais: MySQL, SQL Server; - Conhecimento de Arquiteturas Web e Serviços (HTTP, SOAP, REST ou JSON); - Conhecimentos nas ferramentas: GIT e Maven; - Conhecimentos técnicos em servidores de aplicação (Wildfly - J2EE), servidores web (Apache e NGINX) e Spring Boot. ## Diferenciais: - Conhecimentos em Tecnologias Web: HTML5, CSS e Frameworks JavaScript, Angular. ## Benefícios: - GymPass; - Assistência Médica após período de experiência. ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Backend Java Developer (Pleno - Híbrido em Belo Horizonte - MG) na Prime Results ](https://coodesh.com/vagas/backend-java-developer-pleno-hibrido-em-belo-horizonte-mg-144137266?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. 
## Labels #### Alocação Alocado #### Regime PJ #### Categoria Back-End
1.0
[Hibrido / Belo Horizonte, Minas Gerais, Brazil] Backend Java Developer (Pleno - Híbrido em Belo Horizonte - MG) na Coodesh - ## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/backend-java-developer-pleno-hibrido-em-belo-horizonte-mg-144137266?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A<strong> Prime Results</strong> está em busca de <strong><ins>Backend Java Developer</ins></strong> para compor seu time!</p> <p>Acreditamos no poder de transformação social realizado pelas empresas. Acreditamos no poder transformador das pessoas, aliado à gestão e tecnologia. Compartilhamos nosso conhecimento para solucionar problemas complexos e gerar valor para nossos clientes.</p> <p><strong>Responsabilidades:</strong></p> <ul> <li>Desenvolvimento/implementação e manutenção de aplicações;</li> <li>Participar da análise e execução dos projetos e execução dos tickets;</li> <li>Definir as atividades necessárias para a realização de projetos, analisando os impactos em sistemas e processos através do entendimento da necessidade, conhecimento técnico e arquitetônico dos sistemas;</li> <li>Desenvolver códigos para atendimento às áreas e empresas clientes, proporcionando o esclarecimento de dúvidas relacionados ao projeto, contribuindo para uma melhor análise de impactos de processos e sistemas sob sua responsabilidade;&nbsp;</li> <li>Participar das atividades de planejamento para a liberação do produto para homologação e produção, por meio da validação de testes de aceite, assim como documentação de não conformidades avaliando e planejando a execução das correções reportadas;</li> <li>Participar da rotina de SQUADs.&nbsp;</li> </ul> ## Prime Results : <p>O Best Seller Simon Sinek, diz que 
a maioria das empresas sabem o que fazem, porém não sabem por que o fazem. Não é o nosso caso. A Prime Results é uma empresa especializada em gestão organizacional que usa seu potencial de transformação em empresas que geram impacto positivo na sociedade. Nossos clientes hoje, fazem a diferença na vida de mais de 250.000 brasileiros, nas áreas de proteção patrimonial, saúde e assistência 24 horas.&nbsp;</p> <p>Nosso objetivo central é criar um ambiente criativo, dinâmico e engajado, sempre aliados a métodos, processos inteligentes e muita inovação.</p><a href='https://coodesh.com/empresas/prime-results'>Veja mais no site</a> ## Habilidades: - Spring - Java - MySQL - Microsoft SQL Server - Angular - Apache - JSON - Hibernate ## Local: Belo Horizonte, Minas Gerais, Brazil ## Requisitos: - Experiência em Java: JSF, Spring, PrimeFaces, Hibernate, JasperReports; - Conhecimentos em modelagem e desenvolvimento de Bancos de Dados relacionais: MySQL, SQL Server; - Conhecimento de Arquiteturas Web e Serviços (HTTP, SOAP, REST ou JSON); - Conhecimentos nas ferramentas: GIT e Maven; - Conhecimentos técnicos em servidores de aplicação (Wildfly - J2EE), servidores web (Apache e NGINX) e Spring Boot. ## Diferenciais: - Conhecimentos em Tecnologias Web: HTML5, CSS e Frameworks JavaScript, Angular. ## Benefícios: - GymPass; - Assistência Médica após período de experiência. ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Backend Java Developer (Pleno - Híbrido em Belo Horizonte - MG) na Prime Results ](https://coodesh.com/vagas/backend-java-developer-pleno-hibrido-em-belo-horizonte-mg-144137266?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. 
Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Alocado #### Regime PJ #### Categoria Back-End
process
backend java developer pleno híbrido em belo horizonte mg na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a prime results está em busca de backend java developer para compor seu time acreditamos no poder de transformação social realizado pelas empresas acreditamos no poder transformador das pessoas aliado à gestão e tecnologia compartilhamos nosso conhecimento para solucionar problemas complexos e gerar valor para nossos clientes responsabilidades desenvolvimento implementação e manutenção de aplicações participar da análise e execução dos projetos e execução dos tickets definir as atividades necessárias para a realização de projetos analisando os impactos em sistemas e processos através do entendimento da necessidade conhecimento técnico e arquitetônico dos sistemas desenvolver códigos para atendimento às áreas e empresas clientes proporcionando o esclarecimento de dúvidas relacionados ao projeto contribuindo para uma melhor análise de impactos de processos e sistemas sob sua responsabilidade nbsp participar das atividades de planejamento para a liberação do produto para homologação e produção por meio da validação de testes de aceite assim como documentação de não conformidades avaliando e planejando a execução das correções reportadas participar da rotina de squads nbsp prime results o best seller simon sinek diz que a maioria das empresas sabem o que fazem porém não sabem por que o fazem não é o nosso caso a prime results é uma empresa especializada em gestão organizacional que usa seu potencial de transformação em empresas que geram impacto positivo na sociedade nossos clientes hoje fazem a diferença na vida de mais de brasileiros nas áreas de proteção patrimonial saúde e assistência horas nbsp nosso objetivo central é criar um 
ambiente criativo dinâmico e engajado sempre aliados a métodos processos inteligentes e muita inovação habilidades spring java mysql microsoft sql server angular apache json hibernate local belo horizonte minas gerais brazil requisitos experiência em java jsf spring primefaces hibernate jasperreports conhecimentos em modelagem e desenvolvimento de bancos de dados relacionais mysql sql server conhecimento de arquiteturas web e serviços http soap rest ou json conhecimentos nas ferramentas git e maven conhecimentos técnicos em servidores de aplicação wildfly servidores web apache e nginx e spring boot diferenciais conhecimentos em tecnologias web css e frameworks javascript angular benefícios gympass assistência médica após período de experiência como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação alocado regime pj categoria back end
1
310,074
23,320,130,038
IssuesEvent
2022-08-08 15:38:21
atsign-foundation/at_libraries
https://api.github.com/repos/atsign-foundation/at_libraries
closed
Unlist at_server_status on Pub.dev
documentation
Unlist the following packages ~ @nickelskevin to provide further details - [ ] at_server_status
1.0
Unlist at_server_status on Pub.dev - Unlist the following packages ~ @nickelskevin to provide further details - [ ] at_server_status
non_process
unlist at server status on pub dev unlist the following packages nickelskevin to provide further details at server status
0
1,256
3,789,946,068
IssuesEvent
2016-03-21 19:42:57
opattison/olivermakes
https://api.github.com/repos/opattison/olivermakes
opened
Patterns: add configuration details
content maintenance process
- [ ] options - [ ] Liquid-specific implementation Link to code where appropriate.
1.0
Patterns: add configuration details - - [ ] options - [ ] Liquid-specific implementation Link to code where appropriate.
process
patterns add configuration details options liquid specific implementation link to code where appropriate
1
215,247
24,154,771,121
IssuesEvent
2022-09-22 06:36:02
opfab/operatorfabric-core
https://api.github.com/repos/opfab/operatorfabric-core
closed
CVE-2022-25857 (High) detected in snakeyaml-1.30.jar, snakeyaml-1.29.jar - autoclosed
security vulnerability
## CVE-2022-25857 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snakeyaml-1.30.jar</b>, <b>snakeyaml-1.29.jar</b></p></summary> <p> <details><summary><b>snakeyaml-1.30.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p> <p>Path to dependency file: /tools/spring/spring-oauth2-utilities/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.30/8fde7fe2586328ac3c68db92045e1c8759125000/snakeyaml-1.30.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-test-2.7.3.jar (Root Library) - spring-boot-starter-2.7.3.jar - :x: **snakeyaml-1.30.jar** (Vulnerable Library) </details> <details><summary><b>snakeyaml-1.29.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /src/test/api/karate/karateTests.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.29/6d0cdafb2010f1297e574656551d7145240f6e25/snakeyaml-1.29.jar</p> <p> Dependency Hierarchy: - 
karate-junit5-1.2.0.jar (Root Library) - karate-core-1.2.0.jar - :x: **snakeyaml-1.29.jar** (Vulnerable Library) </details> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package org.yaml:snakeyaml from 0 and before 1.31 is vulnerable to Denial of Service (DoS) due to missing nested depth limitation for collections. <p>Publish Date: 2022-08-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25857>CVE-2022-25857</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857</a></p> <p>Release Date: 2022-08-30</p> <p>Fix Resolution: org.yaml:snakeyaml:1.31</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-25857 (High) detected in snakeyaml-1.30.jar, snakeyaml-1.29.jar - autoclosed - ## CVE-2022-25857 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snakeyaml-1.30.jar</b>, <b>snakeyaml-1.29.jar</b></p></summary> <p> <details><summary><b>snakeyaml-1.30.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p> <p>Path to dependency file: /tools/spring/spring-oauth2-utilities/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.30/8fde7fe2586328ac3c68db92045e1c8759125000/snakeyaml-1.30.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-test-2.7.3.jar (Root Library) - spring-boot-starter-2.7.3.jar - :x: **snakeyaml-1.30.jar** (Vulnerable Library) </details> <details><summary><b>snakeyaml-1.29.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /src/test/api/karate/karateTests.gradle</p> <p>Path to vulnerable library: 
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.29/6d0cdafb2010f1297e574656551d7145240f6e25/snakeyaml-1.29.jar</p> <p> Dependency Hierarchy: - karate-junit5-1.2.0.jar (Root Library) - karate-core-1.2.0.jar - :x: **snakeyaml-1.29.jar** (Vulnerable Library) </details> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package org.yaml:snakeyaml from 0 and before 1.31 is vulnerable to Denial of Service (DoS) due to missing nested depth limitation for collections. <p>Publish Date: 2022-08-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25857>CVE-2022-25857</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857</a></p> <p>Release Date: 2022-08-30</p> <p>Fix Resolution: org.yaml:snakeyaml:1.31</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in snakeyaml jar snakeyaml jar autoclosed cve high severity vulnerability vulnerable libraries snakeyaml jar snakeyaml jar snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file tools spring spring utilities build gradle path to vulnerable library home wss scanner gradle caches modules files org yaml snakeyaml snakeyaml jar dependency hierarchy spring boot starter test jar root library spring boot starter jar x snakeyaml jar vulnerable library snakeyaml
jar yaml parser and emitter for java library home page a href path to dependency file src test api karate karatetests gradle path to vulnerable library home wss scanner gradle caches modules files org yaml snakeyaml snakeyaml jar dependency hierarchy karate jar root library karate core jar x snakeyaml jar vulnerable library found in base branch develop vulnerability details the package org yaml snakeyaml from and before are vulnerable to denial of service dos due to missing nested depth limitation for collections publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org yaml snakeyaml step up your open source security game with mend
0
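The CVE record above describes SnakeYAML's pre-1.31 lack of a nesting-depth limit: a tiny document made of deeply nested collections drives the recursive parse toward resource exhaustion. A minimal sketch of what such a payload looks like (the helper name is hypothetical, not from the record):

```python
def nested_yaml_payload(depth: int) -> str:
    """Build a YAML flow sequence nested `depth` levels deep, e.g. [[[]]].

    A document like this is only 2*depth characters, yet a parser without
    a nesting-depth limit (SnakeYAML before 1.31) recurses once per level
    while reading it, which is the DoS vector CVE-2022-25857 describes.
    """
    return "[" * depth + "]" * depth

# A 5-level payload is just ten characters:
print(nested_yaml_payload(5))  # [[[[[]]]]]
```

Upgrading to org.yaml:snakeyaml:1.31 or later, as the suggested fix states, is the remediation.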
11,447
14,265,099,695
IssuesEvent
2020-11-20 16:38:29
ORNL-AMO/AMO-Tools-Suite
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Suite
closed
ESC Further Backend
Calculator Process Heating
Will need to add some things to GasFlueGasMaterials.h GasFlueGasMaterials.cpp SolidLiquidFlueGasMaterials.h SolidLiquidFlueGasMaterials.cpp
1.0
ESC Further Backend - Will need to add some things to GasFlueGasMaterials.h GasFlueGasMaterials.cpp SolidLiquidFlueGasMaterials.h SolidLiquidFlueGasMaterials.cpp
process
esc further backend will need to add some things to gasfluegasmaterials h gasfluegasmaterials cpp solidliquidfluegasmaterials h solidliquidfluegasmaterials cpp
1
55,342
14,394,281,313
IssuesEvent
2020-12-03 00:55:02
scipy/scipy
https://api.github.com/repos/scipy/scipy
closed
Key appears twice in `test_optimize.test_show_options`
defect scipy.optimize
@AtsushiSakai, the maximize key appears twice in `test_optimize.test_show_options`, on line 2205. I think you just added this. Can you rejig the test as appropriate so that any key only appears once in the definition of a dict?
1.0
Key appears twice in `test_optimize.test_show_options` - @AtsushiSakai, the maximize key appears twice in `test_optimize.test_show_options`, on line 2205. I think you just added this. Can you rejig the test as appropriate so that any key only appears once in the definition of a dict?
non_process
key appears twice in test optimize test show options atsushisakai the maximize key appears twice in st optimize test show options on line i think you just added this can you rejig the test as appropriate so that any key only appears once in the definition of a dict
0
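The pitfall behind this scipy record is that Python dict literals accept duplicate keys without complaint: the later binding silently wins, so a doubled key never raises an error at runtime. A quick sketch (the option names here are illustrative, not the actual test contents):

```python
# Duplicate keys in a dict literal are legal Python: the second
# "maximize" silently overwrites the first, keeping the key's original
# position, so the mistake is invisible unless a reader or a static
# checker such as pyflakes spots it.
options = {
    "maximize": False,   # discarded without warning
    "disp": True,
    "maximize": True,    # this binding wins
}
print(options)  # {'maximize': True, 'disp': True}
```

This is why the issue asks for the test to be rejigged so each key appears only once in the dict definition.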
53,152
6,690,492,881
IssuesEvent
2017-10-09 09:20:10
WordPress/gutenberg
https://api.github.com/repos/WordPress/gutenberg
opened
Improve flow for inserting between blocks
Chrome Design Needs Design Feedback
There's an increasing need to improve the flow for inserting content between existing blocks. See #2752, #833, #2890, #2755, #2043. It becomes notably important when we start to look at nested blocks, or inserting images inline into paragraph blocks. A sticky inserter (as previously proposed) probably isn't accessible enough. Let's look at what a good way to do this is. A core design value of Gutenberg is that a block when _unselected_ is a preview of the contents itself. When the block is _selected_, we can show further options (see https://github.com/WordPress/gutenberg/blob/master/docs/design.md#block-design-checklist-dos-and-donts-and-examples). We have decided that the movers (up/down), configure block/trash (see also #2884) are important enough to show up on _hover_. But let's be extremely careful before we add more stuff on hover, it isn't a very mobile friendly pattern, and it also adds cognitive load when just editing or exploring a document. Dragging and dropping images works fine, that's good. A standard pattern for _editors_, online and offline, is to **put the cursor where you want to add new content, make linebreaks to add space, and insert content there**. This isn't immediately ported to Gutenberg since by design this block editor consists of _multiple editors_. But could we do this: ![insert between](https://user-images.githubusercontent.com/1204802/31331660-8c6177a8-ace3-11e7-946b-d1db16e85161.png) ![insert between block selected](https://user-images.githubusercontent.com/1204802/31331661-8de0133c-ace3-11e7-8bdf-62f86d92ab09.png) That is — when a block is selected, we: - show space to type below it - show the inserter below the empty line of text - perhaps even let Enter take you to this new line Remember, on a newline, the slash command works for inserting content. In a past design we had the Inserter icon shown on the side. Let's discuss further, and do more mockups. I'll try and put together an animatic of the above.
2.0
Improve flow for inserting between blocks - There's an increasing need to improve the flow for inserting content between existing blocks. See #2752, #833, #2890, #2755, #2043. It becomes notably important when we start to look at nested blocks, or inserting images inline into paragraph blocks. A sticky inserter (as previously proposed) probably isn't accessible enough. Let's look at what a good way to do this is. A core design value of Gutenberg is that a block when _unselected_ is a preview of the contents itself. When the block is _selected_, we can show further options (see https://github.com/WordPress/gutenberg/blob/master/docs/design.md#block-design-checklist-dos-and-donts-and-examples). We have decided that the movers (up/down), configure block/trash (see also #2884) are important enough to show up on _hover_. But let's be extremely careful before we add more stuff on hover, it isn't a very mobile friendly pattern, and it also adds cognitive load when just editing or exploring a document. Dragging and dropping images works fine, that's good. A standard pattern for _editors_, online and offline, is to **put the cursor where you want to add new content, make linebreaks to add space, and insert content there**. This isn't immediately ported to Gutenberg since by design this block editor consists of _multiple editors_. But could we do this: ![insert between](https://user-images.githubusercontent.com/1204802/31331660-8c6177a8-ace3-11e7-946b-d1db16e85161.png) ![insert between block selected](https://user-images.githubusercontent.com/1204802/31331661-8de0133c-ace3-11e7-8bdf-62f86d92ab09.png) That is — when a block is selected, we: - show space to type below it - show the inserter below the empty line of text - perhaps even let Enter take you to this new line Remember, on a newline, the slash command works for inserting content. In a past design we had the Inserter icon shown on the side. Let's discuss further, and do more mockups. 
I'll try and put together an animatic of the above.
non_process
improve flow for inserting between blocks there s an increasing need to improve the flow for inserting content between existing blocks see it becomes notably important when we start to look at nested blocks or inserting images inline into paragraph blocks a sticky inserter as previously proposed probably isn t accessible enough let s look at what a good way to do this is a core design value of gutenberg is that a block when unselected is a preview of the contents itself when the block is selected we can show further options see we have decided that the movers up down configure block trash see also are important enough to show up on hover but let s be extremely careful before we add more stuff on hover it isn t a very mobile friendly pattern and it also adds cognitive load when just editing or exploring a document dragging and dropping images works fine that s good a standard pattern for editors online and offline is to put the cursor where you want to add new content make linebreaks to add space and insert content there this isn t immediately ported to gutenberg since by design this block editor consists of multiple editors but could we do this that is — when a block is selected we show space to type below it show the inserter blow the empty line of text perhaps even let enter take you to this new line remember on a newline the slash command works for inserting content in a past design we had the inserter icon shown on the side let s discuss further and do more mockups i ll try and put together an animatic of the above
0
2,541
5,300,476,420
IssuesEvent
2017-02-10 05:08:36
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
"uri and file may not be null" when having a reference to a Markdown file in DITA Map + preprocessing plugin installed
bug P1 preprocess
Having the "com.elovirta.dita.markdown_1.1.0" plugin installed in DITA OT 2.4, with a small DITA Map like this: <!DOCTYPE map PUBLIC "-//OASIS//DTD DITA Map//EN" "map.dtd"> <map> <topicref format="markdown" href="markdowntest.md"/> </map> and a markdown file "markdowntest.md" like this: # Title Para converting to XHTML fails with this exception: C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-2.4.1\build.xml:45: The following error occurred while executing this line: C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-2.4.1\plugins\org.dita.base\build_preprocess.xml:47: Failed to run pipeline: uri and file may not be null at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:228) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at org.apache.tools.ant.Target.execute(Target.java:435) at org.apache.tools.ant.Target.performTasks(Target.java:456) at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405) at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38) at org.apache.tools.ant.Project.executeTargets(Project.java:1260) at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:441) at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at 
org.apache.tools.ant.Target.execute(Target.java:435) at org.apache.tools.ant.Target.performTasks(Target.java:456) at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405) at org.apache.tools.ant.Project.executeTarget(Project.java:1376) at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41) at org.apache.tools.ant.Project.executeTargets(Project.java:1260) at org.apache.tools.ant.Main.runBuild(Main.java:854) at org.apache.tools.ant.Main.startAnt(Main.java:236) at org.apache.tools.ant.launch.Launcher.run(Launcher.java:287) at org.apache.tools.ant.launch.Launcher.main(Launcher.java:113) Caused by: org.dita.dost.exception.DITAOTException: uri and file may not be null at org.dita.dost.module.GenMapAndTopicListModule.execute(GenMapAndTopicListModule.java:232) at org.dita.dost.pipeline.PipelineFacade.execute(PipelineFacade.java:70) at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:222) ... 29 more Caused by: java.lang.IllegalStateException: uri and file may not be null at org.dita.dost.util.Job$FileInfo$Builder.build(Job.java:780) at org.dita.dost.module.GenMapAndTopicListModule.categorizeCurrentFile(GenMapAndTopicListModule.java:606) at org.dita.dost.module.GenMapAndTopicListModule.processFile(GenMapAndTopicListModule.java:448) at org.dita.dost.module.GenMapAndTopicListModule.processWaitList(GenMapAndTopicListModule.java:378) at org.dita.dost.module.GenMapAndTopicListModule.execute(GenMapAndTopicListModule.java:224) ... 31 more So it would seem that referencing markdown converted topics no longer work with the latest DITA OT 2.4. At some point there were some changes made by @jelovirt in the FileInfo class, possibly this change caused the problem: https://github.com/dita-ot/dita-ot/commit/378c0981d14fa141987df9b2e0c2336a14efd429
1.0
"uri and file may not be null" when having a reference to a Markdown file in DITA Map + preprocessing plugin installed - Having the "com.elovirta.dita.markdown_1.1.0" plugin installed in DITA OT 2.4, with a small DITA Map like this: <!DOCTYPE map PUBLIC "-//OASIS//DTD DITA Map//EN" "map.dtd"> <map> <topicref format="markdown" href="markdowntest.md"/> </map> and a markdown file "markdowntest.md" like this: # Title Para converting to XHTML fails with this exception: C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-2.4.1\build.xml:45: The following error occurred while executing this line: C:\Users\radu_coravu\Desktop\DITA OT newest\dita-ot-2.4.1\plugins\org.dita.base\build_preprocess.xml:47: Failed to run pipeline: uri and file may not be null at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:228) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at org.apache.tools.ant.Target.execute(Target.java:435) at org.apache.tools.ant.Target.performTasks(Target.java:456) at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405) at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38) at org.apache.tools.ant.Project.executeTargets(Project.java:1260) at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:441) at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at 
org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at org.apache.tools.ant.Target.execute(Target.java:435) at org.apache.tools.ant.Target.performTasks(Target.java:456) at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405) at org.apache.tools.ant.Project.executeTarget(Project.java:1376) at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41) at org.apache.tools.ant.Project.executeTargets(Project.java:1260) at org.apache.tools.ant.Main.runBuild(Main.java:854) at org.apache.tools.ant.Main.startAnt(Main.java:236) at org.apache.tools.ant.launch.Launcher.run(Launcher.java:287) at org.apache.tools.ant.launch.Launcher.main(Launcher.java:113) Caused by: org.dita.dost.exception.DITAOTException: uri and file may not be null at org.dita.dost.module.GenMapAndTopicListModule.execute(GenMapAndTopicListModule.java:232) at org.dita.dost.pipeline.PipelineFacade.execute(PipelineFacade.java:70) at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:222) ... 29 more Caused by: java.lang.IllegalStateException: uri and file may not be null at org.dita.dost.util.Job$FileInfo$Builder.build(Job.java:780) at org.dita.dost.module.GenMapAndTopicListModule.categorizeCurrentFile(GenMapAndTopicListModule.java:606) at org.dita.dost.module.GenMapAndTopicListModule.processFile(GenMapAndTopicListModule.java:448) at org.dita.dost.module.GenMapAndTopicListModule.processWaitList(GenMapAndTopicListModule.java:378) at org.dita.dost.module.GenMapAndTopicListModule.execute(GenMapAndTopicListModule.java:224) ... 31 more So it would seem that referencing markdown converted topics no longer work with the latest DITA OT 2.4. At some point there were some changes made by @jelovirt in the FileInfo class, possibly this change caused the problem: https://github.com/dita-ot/dita-ot/commit/378c0981d14fa141987df9b2e0c2336a14efd429
process
uri and file may not be null when having a reference to a markdown file in dita map preprocessing plugin installed having the com elovirta dita markdown plugin installed in dita ot with a small dita map like this and a markdown file markdowntest md like this title para converting to xhtml fails with this exception c users radu coravu desktop dita ot newest dita ot build xml the following error occurred while executing this line c users radu coravu desktop dita ot newest dita ot plugins org dita base build preprocess xml failed to run pipeline uri and file may not be null at org dita dost invoker extensibleantinvoker execute extensibleantinvoker java at org apache tools ant unknownelement execute unknownelement java at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke unknown source at java lang reflect method invoke unknown source at org apache tools ant dispatch dispatchutils execute dispatchutils java at org apache tools ant task perform task java at org apache tools ant target execute target java at org apache tools ant target performtasks target java at org apache tools ant project executesortedtargets project java at org apache tools ant helper singlecheckexecutor executetargets singlecheckexecutor java at org apache tools ant project executetargets project java at org apache tools ant taskdefs ant execute ant java at org apache tools ant taskdefs calltarget execute calltarget java at org apache tools ant unknownelement execute unknownelement java at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke unknown source at java lang reflect method invoke unknown source at org apache tools ant dispatch dispatchutils execute dispatchutils java at org apache tools ant task perform task java at org apache tools ant target execute target java at org apache tools ant target performtasks target java at org apache tools ant project executesortedtargets project java at org apache tools ant project 
executetarget project java at org apache tools ant helper defaultexecutor executetargets defaultexecutor java at org apache tools ant project executetargets project java at org apache tools ant main runbuild main java at org apache tools ant main startant main java at org apache tools ant launch launcher run launcher java at org apache tools ant launch launcher main launcher java caused by org dita dost exception ditaotexception uri and file may not be null at org dita dost module genmapandtopiclistmodule execute genmapandtopiclistmodule java at org dita dost pipeline pipelinefacade execute pipelinefacade java at org dita dost invoker extensibleantinvoker execute extensibleantinvoker java more caused by java lang illegalstateexception uri and file may not be null at org dita dost util job fileinfo builder build job java at org dita dost module genmapandtopiclistmodule categorizecurrentfile genmapandtopiclistmodule java at org dita dost module genmapandtopiclistmodule processfile genmapandtopiclistmodule java at org dita dost module genmapandtopiclistmodule processwaitlist genmapandtopiclistmodule java at org dita dost module genmapandtopiclistmodule execute genmapandtopiclistmodule java more so it would seem that referencing markdown converted topics no longer work with the latest dita ot at some point there were some changes made by jelovirt in the fileinfo class possibly this change caused the problem
1
93,468
11,782,471,328
IssuesEvent
2020-03-17 02:03:34
greatnewcls/KNLWKKGOTE62XGFSVAZIMVUH
https://api.github.com/repos/greatnewcls/KNLWKKGOTE62XGFSVAZIMVUH
reopened
7,784
10,924,743,613
IssuesEvent
2019-11-22 10:48:38
hashicorp/packer
https://api.github.com/repos/hashicorp/packer
closed
Array of tags for `docker-tag` post processor.
enhancement good first issue post-processor/docker-tag
Hello, I created a new issue, a copy of #2097, to be able to tag a Docker image with several tags. #### Feature Description Be able to tag a Docker image with several tags using docker-tag post-processor. Currently, it's possible to use only one tag. #### Use Case(s) I want to tag a Docker image with several tags.
1.0
Array of tags for `docker-tag` post processor. - Hello, I created a new issue, a copy of #2097, to be able to tag a Docker image with several tags. #### Feature Description Be able to tag a Docker image with several tags using docker-tag post-processor. Currently, it's possible to use only one tag. #### Use Case(s) I want to tag a Docker image with several tags.
process
array of tags for docker tag post processor hello i create a new issue copy of to be able to tag a docker image with several tags feature description be able to tag a docker image with several tags using docker tag post processor currently it s possible to use only one tag use case s i want to tag a docker image with several tags
1
13,923
16,679,957,599
IssuesEvent
2021-06-07 21:40:57
google/android-fhir
https://api.github.com/repos/google/android-fhir
closed
Regarding android-fhir client
process
**Describe the Issue** [https://github.com/google/android-fhir/wiki/GSoC-Project-Ideas#android-fhir-client](url) **There are few doubts regarding the project.** 1. I think the project will act as a combination of reference app and dc gallery app? am I correct here. How should we get started with it? 2. Should we make a sample design for the app.
1.0
Regarding android-fhir client - **Describe the Issue** [https://github.com/google/android-fhir/wiki/GSoC-Project-Ideas#android-fhir-client](url) **There are few doubts regarding the project.** 1. I think the project will act as a combination of reference app and dc gallery app? am I correct here. How should we get started with it? 2. Should we make a sample design for the app.
process
regarding android fhir client describe the issue url there are few doubts regarding the project i think the project will act as a combination of reference app and dc gallery app am i correct here how should we get started with it should we make a sample design for the app
1
2,280
5,106,939,704
IssuesEvent
2017-01-05 13:22:47
Hurence/logisland
https://api.github.com/repos/Hurence/logisland
opened
design a facial recognition processor with Convolutional Neural Network
feature help wanted processor
need a lot of design https://opendatascience.com/blog/an-intuitive-explanation-of-convolutional-neural-networks/
1.0
design a facial recognition processor with Convolutional Neural Network - need a lot of design https://opendatascience.com/blog/an-intuitive-explanation-of-convolutional-neural-networks/
process
design a facial recognition processor with convolutional neural network need a lot of design
1
312,917
23,447,840,556
IssuesEvent
2022-08-15 21:40:50
Interacao-Humano-Computador/2022.1-Faculdade-de-Arquitetura-e-Urbanismo
https://api.github.com/repos/Interacao-Humano-Computador/2022.1-Faculdade-de-Arquitetura-e-Urbanismo
closed
Criar documento do planejamento da avaliação do protótipo de papel
documentation entrega5
Realizar criação do documento do planejamento da avaliação do protótipo de papel
1.0
Criar documento do planejamento da avaliação do protótipo de papel - Realizar criação do documento do planejamento da avaliação do protótipo de papel
non_process
criar documento do planejamento da avaliação do protótipo de papel realizar criação do documento do planejamento da avaliação do protótipo de papel
0
162,613
13,892,943,180
IssuesEvent
2020-10-19 12:54:29
GroupOneIncorporated/acme-infrastructure
https://api.github.com/repos/GroupOneIncorporated/acme-infrastructure
closed
Add README files.
documentation
- Add/fix README with instructions for terraglue script - Add README with instructions for Ansible - Add README with instructions for ssh config - Add README with instructions for rke - Add README with instructions for k8s
1.0
Add README files. - - Add/fix README with instructions for terraglue script - Add README with instructions for Ansible - Add README with instructions for ssh config - Add README with instructions for rke - Add README with instructions for k8s
non_process
add readme files add fix readme with instructions for terraglue script add readme with instructions for ansible add readme with instructions for ssh config add readme with instructions for rke add readme with instructions for
0
594,609
18,049,237,075
IssuesEvent
2021-09-19 12:56:58
AY2122S1-CS2103T-W17-1/tp
https://api.github.com/repos/AY2122S1-CS2103T-W17-1/tp
opened
Complete an application
type.Story :book: priority.Medium :2nd_place_medal:
As a user, I want to mark a specified entry in my application list as completed, so that I can be clear about the status of each application.
1.0
Complete an application - As a user, I want to mark a specified entry in my application list as completed, so that I can be clear about the status of each application.
non_process
complete an application as a user i want to mark a specified entry in my application list as completed so that i can be clear about the status of each application
0
552,714
16,253,765,573
IssuesEvent
2021-05-08 00:14:59
apcountryman/picolibrary-microchip-megaavr0
https://api.github.com/repos/apcountryman/picolibrary-microchip-megaavr0
closed
Add PORT peripheral based GPIO push-pull I/O pin
priority-normal status-awaiting_development type-feature
Add PORT peripheral based GPIO push-pull I/O pin (`::picolibrary::Microchip::megaAVR0::GPIO::Push_Pull_IO_Pin<::picolibrary::Microchip::megaAVR0::Peripheral::PORT>`). - [ ] The PORT peripheral based GPIO push-pull I/O pin should be defined in the `include/picolibrary/microchip/megaavr0/gpio.h`/`source/picolibrary/microchip/megaavr0/gpio.cc` header/source file pair. The PORT peripheral based GPIO push-pull I/O pin should support the following operations: - [ ] `constexpr Push_Pull_IO_Pin() noexcept` - [ ] `Push_Pull_IO_Pin( Peripheral::PORT & port, std::uint8_t mask ) noexcept` - [ ] `constexpr Push_Pull_IO_Pin( Push_Pull_IO_Pin && source ) noexcept` - [ ] `~Push_Pull_IO_Pin() noexcept` - [ ] `auto operator=( Push_Pull_IO_Pin && expression ) noexcept -> Push_Pull_IO_Pin &` - [ ] `auto initialize( ::picolibrary::GPIO::Initial_Pin_State initial_pin_state = ::picolibrary::GPIO::Initial_Pin_State::LOW ) noexcept -> Result<Void, Void>`: Initialize the pin's hardware - [ ] `auto state() const noexcept -> Result<::picolibrary::GPIO::Pin_State, Void>`: Get the state of the pin - [ ] `auto transition_to_high() noexcept -> Result<Void, Void>`: Transition the pin to the high state - [ ] `auto transition_to_low() noexcept -> Result<Void, Void>`: Transition the pin to the low state - [ ] `auto toggle() noexcept -> Result<Void, Void>`: Toggle the pin state
1.0
Add PORT peripheral based GPIO push-pull I/O pin - Add PORT peripheral based GPIO push-pull I/O pin (`::picolibrary::Microchip::megaAVR0::GPIO::Push_Pull_IO_Pin<::picolibrary::Microchip::megaAVR0::Peripheral::PORT>`). - [ ] The PORT peripheral based GPIO push-pull I/O pin should be defined in the `include/picolibrary/microchip/megaavr0/gpio.h`/`source/picolibrary/microchip/megaavr0/gpio.cc` header/source file pair. The PORT peripheral based GPIO push-pull I/O pin should support the following operations: - [ ] `constexpr Push_Pull_IO_Pin() noexcept` - [ ] `Push_Pull_IO_Pin( Peripheral::PORT & port, std::uint8_t mask ) noexcept` - [ ] `constexpr Push_Pull_IO_Pin( Push_Pull_IO_Pin && source ) noexcept` - [ ] `~Push_Pull_IO_Pin() noexcept` - [ ] `auto operator=( Push_Pull_IO_Pin && expression ) noexcept -> Push_Pull_IO_Pin &` - [ ] `auto initialize( ::picolibrary::GPIO::Initial_Pin_State initial_pin_state = ::picolibrary::GPIO::Initial_Pin_State::LOW ) noexcept -> Result<Void, Void>`: Initialize the pin's hardware - [ ] `auto state() const noexcept -> Result<::picolibrary::GPIO::Pin_State, Void>`: Get the state of the pin - [ ] `auto transition_to_high() noexcept -> Result<Void, Void>`: Transition the pin to the high state - [ ] `auto transition_to_low() noexcept -> Result<Void, Void>`: Transition the pin to the low state - [ ] `auto toggle() noexcept -> Result<Void, Void>`: Toggle the pin state
non_process
add port peripheral based gpio push pull i o pin add port peripheral based gpio push pull i o pin picolibrary microchip gpio push pull io pin the port peripheral based gpio push pull i o pin should be defined in the include picolibrary microchip gpio h source picolibrary microchip gpio cc header source file pair the port peripheral based gpio push pull i o pin should support the following operations constexpr push pull io pin noexcept push pull io pin peripheral port port std t mask noexcept constexpr push pull io pin push pull io pin source noexcept push pull io pin noexcept auto operator push pull io pin expression noexcept push pull io pin auto initialize picolibrary gpio initial pin state initial pin state picolibrary gpio initial pin state low noexcept result initialize the pin s hardware auto state const noexcept result get the state of the pin auto transition to high noexcept result transition the pin to the high state auto transition to low noexcept result transition the pin to the low state auto toggle noexcept result toggle the pin state
0
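The register-level behavior that the picolibrary checklist above describes (drive the pin high, drive it low, toggle it, read it back through a bit mask) can be sketched in plain Python. This is an illustrative sketch only, not picolibrary's actual C++ API: `FakePort` is a hypothetical stand-in for the PORT peripheral's OUT register, and the class and method names are simplified from the issue's checklist.

```python
class FakePort:
    """Hypothetical in-memory stand-in for a PORT peripheral's OUT register."""
    def __init__(self):
        self.out = 0  # 8-bit output value register

class PushPullIOPin:
    """Minimal sketch of a push-pull I/O pin bound to a port and a bit mask."""
    def __init__(self, port, mask):
        self.port = port
        self.mask = mask

    def transition_to_high(self):
        self.port.out |= self.mask   # set the pin's bit

    def transition_to_low(self):
        self.port.out &= ~self.mask  # clear the pin's bit

    def toggle(self):
        self.port.out ^= self.mask   # flip the pin's bit

    def state(self):
        return bool(self.port.out & self.mask)

port = FakePort()
pin = PushPullIOPin(port, 0x01)
pin.transition_to_high()
print(pin.state())  # True
pin.toggle()
print(pin.state())  # False
```

The real implementation additionally configures the direction register during `initialize()` and reports errors through `Result<Void, Void>`; the sketch keeps only the mask arithmetic.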
254,562
21,793,877,447
IssuesEvent
2022-05-15 10:29:05
suterma/replayer-pwa
https://api.github.com/repos/suterma/replayer-pwa
closed
Opening the linked compilations in the About page does not work
bug Regression Test
No compilation is actually loaded, although it should.
1.0
Opening the linked compilations in the About page does not work - No compilation is actually loaded, although it should.
non_process
opening the linked compilations in the about page does not work no compilation is actually loaded although it should
0
38,305
6,667,002,815
IssuesEvent
2017-10-03 10:40:09
facebook/react
https://api.github.com/repos/facebook/react
closed
[website] Buttons in Live Code sections have bad styling
Component: Documentation & Website Difficulty: beginner
On the new site, the button in the "todo" example looks strange (non-standard): ![](http://ss.dan.cx/2017/09/chrome_30-20.14.50.png) Generally this happens when border styling is overridden and no nice styling is applied to the button. This appears to be the case here. Removing `border-color: inherit` fixes the styling: ![](http://ss.dan.cx/2017/09/chrome_30-20.17.29.png) ![](http://ss.dan.cx/2017/09/chrome_30-20.15.35.png)
1.0
[website] Buttons in Live Code sections have bad styling - On the new site, the button in the "todo" example looks strange (non-standard): ![](http://ss.dan.cx/2017/09/chrome_30-20.14.50.png) Generally this happens when border styling is overridden and no nice styling is applied to the button. This appears to be the case here. Removing `border-color: inherit` fixes the styling: ![](http://ss.dan.cx/2017/09/chrome_30-20.17.29.png) ![](http://ss.dan.cx/2017/09/chrome_30-20.15.35.png)
non_process
buttons in live code sections have bad styling on the new site the button in the todo example looks strange non standard generally this happens when border styling is overridden and no nice styling is applied to the button this appears to be the case here removing border color inherit fixes the styling
0
91,729
3,862,260,022
IssuesEvent
2016-04-08 01:30:57
Captianrock/android_PV
https://api.github.com/repos/Captianrock/android_PV
closed
Restructure Database
High Priority
Database should have multiple tables to account for multiple users and multiple applications for multiple traces done on the web app
1.0
Restructure Database - Database should have multiple tables to account for multiple users and multiple applications for multiple traces done on the web app
non_process
restructure database database should have multiple tables to account for multiple users and multiple applications for multiple traces done on the web app
0
21,541
29,864,298,223
IssuesEvent
2023-06-20 01:25:15
cncf/tag-security
https://api.github.com/repos/cncf/tag-security
closed
Review existing frameworks wrt. cloud-native
assessment-process inactive
This has come up in the calls that referencing a standardized model and/or vocabulary might be useful. Here's a list of frameworks that we might consider in no particular order...none of these seems immediately applicable to CN projects, but can be mapped to CN or a derivative vocabulary/model can learn from these: - [MITRE ATT&CK](https://www.mitre.org/sites/default/files/publications/pr-18-0944-11-mitre-attack-design-and-philosophy.pdf) - [Parkerian Hexad/CIA Triad](https://en.wikipedia.org/wiki/Parkerian_Hexad) - [Lockheed Martin Cyber Kill Chain®](https://www.lockheedmartin.com/en-us/capabilities/cyber/cyber-kill-chain.html) - [Microsoft STRIDE](https://en.wikipedia.org/wiki/STRIDE_(security)) and/or DREAD - [CVSS](https://www.first.org/cvss/cvss-v30-specification-v1.7.pdf) - [Trike](http://www.octotrike.org/) - [MIL-STD-882E](https://www.system-safety.org/Documents/MIL-STD-882E.pdf) - OCTAVE - VAST - PASTA - [Others](https://insights.sei.cmu.edu/sei_blog/2018/12/threat-modeling-12-available-methods.html)
1.0
Review existing frameworks wrt. cloud-native - This has come up in the calls that referencing a standardized model and/or vocabulary might be useful. Here's a list of frameworks that we might consider in no particular order...none of these seems immediately applicable to CN projects, but can be mapped to CN or a derivative vocabulary/model can learn from these: - [MITRE ATT&CK](https://www.mitre.org/sites/default/files/publications/pr-18-0944-11-mitre-attack-design-and-philosophy.pdf) - [Parkerian Hexad/CIA Triad](https://en.wikipedia.org/wiki/Parkerian_Hexad) - [Lockheed Martin Cyber Kill Chain®](https://www.lockheedmartin.com/en-us/capabilities/cyber/cyber-kill-chain.html) - [Microsoft STRIDE](https://en.wikipedia.org/wiki/STRIDE_(security)) and/or DREAD - [CVSS](https://www.first.org/cvss/cvss-v30-specification-v1.7.pdf) - [Trike](http://www.octotrike.org/) - [MIL-STD-882E](https://www.system-safety.org/Documents/MIL-STD-882E.pdf) - OCTAVE - VAST - PASTA - [Others](https://insights.sei.cmu.edu/sei_blog/2018/12/threat-modeling-12-available-methods.html)
process
review existing frameworks wrt cloud native this has come up in the calls that referencing a standardized model and or vocabulary might be useful here s a list of frameworks that we might consider in no particular order none of these seems immediately applicable to cn projects but can be mapped to cn or a derivative vocabulary model can learn from these and or dread octave vast pasta
1
18,953
24,913,677,760
IssuesEvent
2022-10-30 06:04:29
osamhack2022-v2/APP_FreshPlus_TakeCareMyRefrigerator
https://api.github.com/repos/osamhack2022-v2/APP_FreshPlus_TakeCareMyRefrigerator
closed
AI camera security issues
question imgae process
군대내에서 카메라를 상시 설치해 놓는 것에 관하여 보안이슈가 제기될 수 있을 것으로 생각합니다. 개인적으로는 1. 냉장고 내부를 주로 볼 수 있도록 카메라가 설치되는 점 2. 카메라에서 찍히는 사진은 로컬외에 저장 및 처리되는 곳이 없다는 점 3. AI로 처리된 물품 위치만이 서버에 전송된다는 점 4. 정도로 대응을 하면 좋을 것 같습니다. 다른 좋은 아이디어 있으시면 제시해주시면 좋겠습니다!
1.0
AI camera security issues - 군대내에서 카메라를 상시 설치해 놓는 것에 관하여 보안이슈가 제기될 수 있을 것으로 생각합니다. 개인적으로는 1. 냉장고 내부를 주로 볼 수 있도록 카메라가 설치되는 점 2. 카메라에서 찍히는 사진은 로컬외에 저장 및 처리되는 곳이 없다는 점 3. AI로 처리된 물품 위치만이 서버에 전송된다는 점 4. 정도로 대응을 하면 좋을 것 같습니다. 다른 좋은 아이디어 있으시면 제시해주시면 좋겠습니다!
process
ai camera security issues 군대내에서 카메라를 상시 설치해 놓는 것에 관하여 보안이슈가 제기될 수 있을 것으로 생각합니다 개인적으로는 냉장고 내부를 주로 볼 수 있도록 카메라가 설치되는 점 카메라에서 찍히는 사진은 로컬외에 저장 및 처리되는 곳이 없다는 점 ai로 처리된 물품 위치만이 서버에 전송된다는 점 정도로 대응을 하면 좋을 것 같습니다 다른 좋은 아이디어 있으시면 제시해주시면 좋겠습니다
1
20,111
26,650,714,509
IssuesEvent
2023-01-25 13:29:35
MPMG-DCC-UFMG/C01
https://api.github.com/repos/MPMG-DCC-UFMG/C01
closed
Correção da exceção: '_UnixSelectorEventLoop' object has no attribute '_closed' em coletas dinâmicas
[1] Bug [2] Alta Prioridade [0] Desenvolvimento [3] Processamento Dinâmico
## Comportamento Esperado O sistema deve executar sem nenhum tipo de exceção, porém, como apontado na descrição da issue #782, em algumas coletas essa exceção é gerada. Embora seja ignorada pelo sistema. ## Comportamento atual A exceção citada ocorre em algumas coletas, conforme apontado na issue #782. ## Passos para reproduzir o erro Instancie uma das coletas apontadas na issue #782 onde ocorre essa exceção na solução distribuída. ## Sistema Branch `distributed-system`.
1.0
Correção da exceção: '_UnixSelectorEventLoop' object has no attribute '_closed' em coletas dinâmicas - ## Comportamento Esperado O sistema deve executar sem nenhum tipo de exceção, porém, como apontado na descrição da issue #782, em algumas coletas essa exceção é gerada. Embora seja ignorada pelo sistema. ## Comportamento atual A exceção citada ocorre em algumas coletas, conforme apontado na issue #782. ## Passos para reproduzir o erro Instancie uma das coletas apontadas na issue #782 onde ocorre essa exceção na solução distribuída. ## Sistema Branch `distributed-system`.
process
correção da exceção unixselectoreventloop object has no attribute closed em coletas dinâmicas comportamento esperado o sistema deve executar sem nenhum tipo de exceção porém como apontado na descrição da issue em algumas coletas essa exceção é gerada embora seja ignorada pelo sistema comportamento atual a exceção citada ocorre em algumas coletas conforme apontado na issue passos para reproduzir o erro instancie uma das coletas apontadas na issue onde ocorre essa exceção na solução distribuída sistema branch distributed system
1
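The `'_UnixSelectorEventLoop' object has no attribute '_closed'` message in the row above typically appears when an asyncio event loop object is finalized before its `__init__` completed, or when loops are created and abandoned without being closed. A common defensive pattern (a sketch only, not the project's actual fix; `run_coro_safely` and `fetch` are illustrative names) is to create, run, and close each loop explicitly:

```python
import asyncio

def run_coro_safely(coro):
    # Create a dedicated event loop, run the coroutine to completion,
    # and always close the loop in a finally block so no half-finalized
    # loop is left for the garbage collector to complain about.
    loop = asyncio.new_event_loop()
    try:
        return loop.run_until_complete(coro)
    finally:
        loop.close()

async def fetch():
    await asyncio.sleep(0)
    return "ok"

print(run_coro_safely(fetch()))  # ok
```

In newer Python versions, `asyncio.run(coro)` packages the same create/run/close discipline in one call.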
17,976
24,822,868,676
IssuesEvent
2022-10-25 17:57:32
elementor/elementor
https://api.github.com/repos/elementor/elementor
closed
[⌛ Awaiting Feedback] 🧐 Possible Bug: Icon Box not able to display its own uploaded SVG
compatibility/3rd_party status/needs_feedback compatibility/assets component/icon-box
### Prerequisites - [X] I have searched for similar issues in both open and closed tickets and cannot find a duplicate. - [X] The issue still exists against the latest stable version of Elementor. ### Description I am working on a few pages that use a grid of Icon Box widgets. Since Elementor has not updated to Font Awesome 6 (See: #18723) we are uploading SVGs of some of the FA6 glyphs, but then when we try to use them, subsequent Icon Box widgets don't render the uploaded SVG: ![CleanShot 2022-10-17 at 16 03 34@2x](https://user-images.githubusercontent.com/24182/196300205-8f0fc5f1-12f6-474a-9ef4-9412bc99c58b.png) Initially I thought maybe it was related to copy and paste of styles but each Icon Box on this test page was created manually. ### Steps to reproduce 1. Create a page 2. Add an Inner Section 3. In the first column, put an Icon Box 4. Set the first Icon Box icon to an uploaded SVG 5. In the second column, put an Icon Box 6. Set the second Icon Box icon to a different uploaded SVG **Expected Results:** Each Icon Box uses its own uploaded SVG **Actual Results:** SVG from the first Icon Box is used in the second Icon Box. (See screenshot above.) When I add a third Icon Box to the start of the list, _that_ Icon Box's image applies to _all_ the Icon Box widgets on the page: ![CleanShot 2022-10-17 at 16 14 59@2x](https://user-images.githubusercontent.com/24182/196301087-4bdc6985-056e-4bf4-af8d-8d3725a8d7a9.png) If I change an icon to a Font Awesome icon, the correct icon appears: ![CleanShot 2022-10-17 at 16 15 51@2x](https://user-images.githubusercontent.com/24182/196301157-b069ce92-d416-4935-9023-a2c20c9e1ad6.png) ### Isolating the problem - [ ] This bug happens with only Elementor plugin active (and Elementor Pro). - [ ] This bug happens with a Blank WordPress theme active ([Hello theme](https://wordpress.org/themes/hello-elementor/)). - [X] I can reproduce this bug consistently using the steps above. 
### System Info ``` == Server Environment == Operating System: Linux Software: Apache MySQL version: Percona Server (GPL), Release '42', Revision 'b0a7dc2da2e' v5.7.39-42 PHP Version: 8.0.23 PHP Memory Limit: 512M PHP Max Input Vars: 10000 PHP Max Post Size: 100M GD Installed: Yes ZIP Installed: Yes Write Permissions: All right Elementor Library: Connected == WordPress Environment == Version: 6.0.2 Site URL: https://kodehealthstg.wpengine.com Home URL: https://kodehealthstg.wpengine.com WP Multisite: No Max Upload Size: 50 MB Memory limit: 40M Max Memory limit: 512M Permalink Structure: /%postname%/ Language: en-US Timezone: 0 Admin Email: nate.miller@mutuallyhuman.com Debug Mode: Inactive == Theme == Name: Astra Version: 3.9.1 Author: Brainstorm Force Child Theme: No == User == Role: administrator WP Profile lang: en_US User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:106.0) Gecko/20100101 Firefox/106.0 == Active Plugins == ActiveCampaign Postmark (Official) Version: 1.17.2 Author: Andrew Yates & Matt Gibbs Disable XML-RPC Version: 1.0.1 Author: Philip Erb Elementor Version: 3.7.8 Author: Elementor.com Elementor Pro Version: 3.7.7 Author: Elementor.com Google Analytics for WordPress by MonsterInsights Version: 8.8.2 Author: MonsterInsights Limit Login Attempts Reloaded Version: 2.25.5 Author: Limit Login Attempts Reloaded Safe SVG Version: 2.0.3 Author: 10up Starter Templates Version: 3.1.15 Author: Brainstorm Force WPS Hide Login Version: 1.9.6 Author: WPServeur, NicolasKulka, wpformation Yoast SEO Version: 19.6 Author: Team Yoast Yoast SEO Premium Version: 19.2 Author: Team Yoast == Must-Use Plugins == Force Strong Passwords - WPE Edition Version: 1.8.0 Author: Jason Cosper WP Engine Cache Plugin Version: 1.0.11 Author: WP Engine WP Engine Seamless Login Plugin Version: 1.6.0 Author: WP Engine WP Engine Security Auditor Version: 1.0.10 Author: wpengine WP Engine System Version: 5.0.1 Author: WP Engine == Elementor Experiments == Optimized DOM Output: 
Active by default Improved Asset Loading: Active by default Improved CSS Loading: Active by default Inline Font Icons: Inactive by default Accessibility Improvements: Active by default Additional Custom Breakpoints: Active by default Import Export Template Kit: Active by default Hide native WordPress widgets from search results: Active by default admin_menu_rearrangement: Inactive by default Flexbox Container: Inactive by default Default to New Theme Builder: Active by default Landing Pages: Active by default Color Sampler: Active by default Favorite Widgets: Active by default Admin Top Bar: Active by default Page Transitions: Active by default Notes: Active by default Form Submissions: Active by default Scroll Snap: Active by default == Log == Log: showing 20 of 222022-08-29 20:09:20 [info] Elementor data updater process has been queued. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-08-29 20:12:26 [info] elementor-pro::elementor_pro_updater Started 2022-08-29 20:12:26 [info] Elementor Pro/Upgrades - _on_each_version Start 2022-08-29 20:12:26 [info] Elementor Pro/Upgrades - _on_each_version Finished 2022-08-29 20:12:26 [info] Elementor data updater process has been completed. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-08-29 20:12:28 [info] Elementor data updater process has been queued. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-08-29 20:12:43 [info] elementor-pro::elementor_pro_updater Started 2022-08-29 20:12:43 [info] Elementor Pro/Upgrades - _on_each_version Start 2022-08-29 20:12:43 [info] Elementor Pro/Upgrades - _on_each_version Finished 2022-08-29 20:12:43 [info] Elementor data updater process has been completed. 
[array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-10-10 21:28:25 [info] elementor-pro::elementor_pro_updater Started 2022-10-10 21:28:25 [info] Elementor Pro/Upgrades - _on_each_version Start 2022-10-10 21:28:25 [info] Elementor Pro/Upgrades - _on_each_version Finished 2022-10-10 21:28:25 [info] Elementor data updater process has been completed. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.4', 'to' => '3.7.7', )] 2022-10-10 21:28:25 [info] Elementor data updater process has been queued. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.4', 'to' => '3.7.7', )] 2022-10-17 23:26:52 [info] elementor::elementor_updater Started 2022-10-17 23:26:52 [info] Elementor/Upgrades - _on_each_version Start 2022-10-17 23:26:52 [info] Elementor/Upgrades - _on_each_version Finished 2022-10-17 23:26:52 [info] Elementor data updater process has been completed. [array ( 'plugin' => 'Elementor', 'from' => '3.7.3', 'to' => '3.7.8', )] 2022-10-17 23:26:52 [info] Elementor data updater process has been queued. 
[array ( 'plugin' => 'Elementor', 'from' => '3.7.3', 'to' => '3.7.8', )] PHP: showing 11 of 11PHP: 2022-08-29 20:13:47 [warning X 956][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1483] Trying to access array offset on value of type null [array ( 'trace' => ' #0: Elementor\Core\Logger\Manager -> shutdown() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1446] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1446): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1459] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1459): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1470] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1470): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1483] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1483): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1446] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1446): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1459] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1459): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1470] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1470): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1483] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1483): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )]
PHP: 2022-10-08 06:05:11 [warning X 1][/nas/content/live/kodehealthstg/wp-content/plugins/elementor/core/common/modules/ajax/module.php::171] Undefined array key "data" [array ( 'trace' => ' #0: Elementor\Core\Logger\Manager -> shutdown() ', )]
PHP: 2022-10-10 17:53:14 [warning X 1][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/assets-manager/asset-types/icons/custom-icons.php::195] unlink(Font Awesome 6 Pro-Solid-900.otf): No such file or directory [array ( 'trace' => ' #0: Elementor\Core\Logger\Manager -> shutdown() ', )]
JS: showing 10 of 10
JS: 2022-09-21 17:17:30 [error X 23][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/common-modules.min.js?ver=3.7.3:2:51367] undefined is not an object (evaluating 'q.functionName.includes')
JS: 2022-09-21 17:17:43 [error X 2][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:646671] this.model.isValidChild is not a function. (In 'this.model.isValidChild(C)', 'this.model.isValidChild' is undefined)
JS: 2022-09-21 17:59:01 [error X 14][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:1060433] null is not an object (evaluating '$.getBoundingClientRect')
JS: 2022-09-21 19:18:34 [error X 5][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:691806] T.getContainer is not a function. (In 'T.getContainer()', 'T.getContainer' is undefined)
JS: 2022-09-30 17:36:27 [error X 41][https://kodehealthstg.wpengine.com/wp-includes/js/jquery/jquery.min.js?ver=3.6.0:2:40874] Attempted to assign to readonly property.
JS: 2022-10-10 18:04:20 [error X 819][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/lib/pickr/pickr.min.js?ver=1.5.0:2:19643] can't access property "toLowerCase", this.getColorRepresentation() is null
JS: 2022-10-10 22:15:30 [error X 5][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:1060432] can't access property "getBoundingClientRect", $ is null
JS: 2022-10-12 06:07:38 [error X 1][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:905570] can't access property "settings", T is undefined
JS: 2022-10-12 06:09:36 [error X 5][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/frontend.min.js?ver=3.7.3:2:30926] can't access property "model", elementor.settings.page is undefined
JS: 2022-10-17 22:59:40 [error X 2][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:766747] can't access property "toLowerCase", T.originalEvent.key is undefined
== Elementor - Compatibility Tag ==
Elementor Pro: Compatibility not specified
== Elementor Pro - Compatibility Tag == ```
True
[⌛ Awaiting Feedback] 🧐 Possible Bug: Icon Box not able to display its own uploaded SVG

### Prerequisites

- [X] I have searched for similar issues in both open and closed tickets and cannot find a duplicate.
- [X] The issue still exists against the latest stable version of Elementor.

### Description

I am working on a few pages that use a grid of Icon Box widgets. Since Elementor has not updated to Font Awesome 6 (See: #18723) we are uploading SVGs of some of the FA6 glyphs, but then when we try to use them, subsequent Icon Box widgets don't render the uploaded SVG:

![CleanShot 2022-10-17 at 16 03 34@2x](https://user-images.githubusercontent.com/24182/196300205-8f0fc5f1-12f6-474a-9ef4-9412bc99c58b.png)

Initially I thought maybe it was related to copy and paste of styles but each Icon Box on this test page was created manually.

### Steps to reproduce

1. Create a page
2. Add an Inner Section
3. In the first column, put an Icon Box
4. Set the first Icon Box icon to an uploaded SVG
5. In the second column, put an Icon Box
6. Set the second Icon Box icon to a different uploaded SVG

**Expected Results:** Each Icon Box uses its own uploaded SVG

**Actual Results:** SVG from the first Icon Box is used in the second Icon Box. (See screenshot above.)

When I add a third Icon Box to the start of the list, _that_ Icon Box's image applies to _all_ the Icon Box widgets on the page:

![CleanShot 2022-10-17 at 16 14 59@2x](https://user-images.githubusercontent.com/24182/196301087-4bdc6985-056e-4bf4-af8d-8d3725a8d7a9.png)

If I change an icon to a Font Awesome icon, the correct icon appears:

![CleanShot 2022-10-17 at 16 15 51@2x](https://user-images.githubusercontent.com/24182/196301157-b069ce92-d416-4935-9023-a2c20c9e1ad6.png)

### Isolating the problem

- [ ] This bug happens with only Elementor plugin active (and Elementor Pro).
- [ ] This bug happens with a Blank WordPress theme active ([Hello theme](https://wordpress.org/themes/hello-elementor/)).
- [X] I can reproduce this bug consistently using the steps above. ### System Info ``` == Server Environment == Operating System: Linux Software: Apache MySQL version: Percona Server (GPL), Release '42', Revision 'b0a7dc2da2e' v5.7.39-42 PHP Version: 8.0.23 PHP Memory Limit: 512M PHP Max Input Vars: 10000 PHP Max Post Size: 100M GD Installed: Yes ZIP Installed: Yes Write Permissions: All right Elementor Library: Connected == WordPress Environment == Version: 6.0.2 Site URL: https://kodehealthstg.wpengine.com Home URL: https://kodehealthstg.wpengine.com WP Multisite: No Max Upload Size: 50 MB Memory limit: 40M Max Memory limit: 512M Permalink Structure: /%postname%/ Language: en-US Timezone: 0 Admin Email: nate.miller@mutuallyhuman.com Debug Mode: Inactive == Theme == Name: Astra Version: 3.9.1 Author: Brainstorm Force Child Theme: No == User == Role: administrator WP Profile lang: en_US User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:106.0) Gecko/20100101 Firefox/106.0 == Active Plugins == ActiveCampaign Postmark (Official) Version: 1.17.2 Author: Andrew Yates & Matt Gibbs Disable XML-RPC Version: 1.0.1 Author: Philip Erb Elementor Version: 3.7.8 Author: Elementor.com Elementor Pro Version: 3.7.7 Author: Elementor.com Google Analytics for WordPress by MonsterInsights Version: 8.8.2 Author: MonsterInsights Limit Login Attempts Reloaded Version: 2.25.5 Author: Limit Login Attempts Reloaded Safe SVG Version: 2.0.3 Author: 10up Starter Templates Version: 3.1.15 Author: Brainstorm Force WPS Hide Login Version: 1.9.6 Author: WPServeur, NicolasKulka, wpformation Yoast SEO Version: 19.6 Author: Team Yoast Yoast SEO Premium Version: 19.2 Author: Team Yoast == Must-Use Plugins == Force Strong Passwords - WPE Edition Version: 1.8.0 Author: Jason Cosper WP Engine Cache Plugin Version: 1.0.11 Author: WP Engine WP Engine Seamless Login Plugin Version: 1.6.0 Author: WP Engine WP Engine Security Auditor Version: 1.0.10 Author: wpengine WP Engine System Version: 5.0.1 
Author: WP Engine == Elementor Experiments == Optimized DOM Output: Active by default Improved Asset Loading: Active by default Improved CSS Loading: Active by default Inline Font Icons: Inactive by default Accessibility Improvements: Active by default Additional Custom Breakpoints: Active by default Import Export Template Kit: Active by default Hide native WordPress widgets from search results: Active by default admin_menu_rearrangement: Inactive by default Flexbox Container: Inactive by default Default to New Theme Builder: Active by default Landing Pages: Active by default Color Sampler: Active by default Favorite Widgets: Active by default Admin Top Bar: Active by default Page Transitions: Active by default Notes: Active by default Form Submissions: Active by default Scroll Snap: Active by default == Log == Log: showing 20 of 222022-08-29 20:09:20 [info] Elementor data updater process has been queued. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-08-29 20:12:26 [info] elementor-pro::elementor_pro_updater Started 2022-08-29 20:12:26 [info] Elementor Pro/Upgrades - _on_each_version Start 2022-08-29 20:12:26 [info] Elementor Pro/Upgrades - _on_each_version Finished 2022-08-29 20:12:26 [info] Elementor data updater process has been completed. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-08-29 20:12:28 [info] Elementor data updater process has been queued. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-08-29 20:12:43 [info] elementor-pro::elementor_pro_updater Started 2022-08-29 20:12:43 [info] Elementor Pro/Upgrades - _on_each_version Start 2022-08-29 20:12:43 [info] Elementor Pro/Upgrades - _on_each_version Finished 2022-08-29 20:12:43 [info] Elementor data updater process has been completed. 
[array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.3', 'to' => '3.7.4', )] 2022-10-10 21:28:25 [info] elementor-pro::elementor_pro_updater Started 2022-10-10 21:28:25 [info] Elementor Pro/Upgrades - _on_each_version Start 2022-10-10 21:28:25 [info] Elementor Pro/Upgrades - _on_each_version Finished 2022-10-10 21:28:25 [info] Elementor data updater process has been completed. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.4', 'to' => '3.7.7', )] 2022-10-10 21:28:25 [info] Elementor data updater process has been queued. [array ( 'plugin' => 'Elementor Pro', 'from' => '3.7.4', 'to' => '3.7.7', )] 2022-10-17 23:26:52 [info] elementor::elementor_updater Started 2022-10-17 23:26:52 [info] Elementor/Upgrades - _on_each_version Start 2022-10-17 23:26:52 [info] Elementor/Upgrades - _on_each_version Finished 2022-10-17 23:26:52 [info] Elementor data updater process has been completed. [array ( 'plugin' => 'Elementor', 'from' => '3.7.3', 'to' => '3.7.8', )] 2022-10-17 23:26:52 [info] Elementor data updater process has been queued. 
[array ( 'plugin' => 'Elementor', 'from' => '3.7.3', 'to' => '3.7.8', )] PHP: showing 11 of 11PHP: 2022-08-29 20:13:47 [warning X 956][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1483] Trying to access array offset on value of type null [array ( 'trace' => ' #0: Elementor\Core\Logger\Manager -> shutdown() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1446] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1446): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1459] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1459): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1470] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1470): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-19 16:45:39 [warning X 285][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1483] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1483): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(733): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(534): Elementor\Widget_Base -> get_raw_data() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1446] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1446): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1459] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1459): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1470] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1470): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )] PHP: 2022-09-21 16:04:11 [warning X 70][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php::1483] Trying to access array offset on value of type null [array ( 'trace' => ' #0: /nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/nav-menu/widgets/nav-menu.php(1483): Elementor\Core\Logger\Manager -> rest_error_handler() #1: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/controls-stack.php(2223): ElementorPro\Modules\NavMenu\Widgets\Nav_Menu -> render() #2: 
/nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(609): Elementor\Controls_Stack -> render_by_mode() #3: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/widget-base.php(750): Elementor\Widget_Base -> render_content() #4: /nas/content/live/kodehealthstg/wp-content/plugins/elementor/includes/base/element-base.php(447): Elementor\Widget_Base -> print_content() ', )]
PHP: 2022-10-08 06:05:11 [warning X 1][/nas/content/live/kodehealthstg/wp-content/plugins/elementor/core/common/modules/ajax/module.php::171] Undefined array key "data" [array ( 'trace' => ' #0: Elementor\Core\Logger\Manager -> shutdown() ', )]
PHP: 2022-10-10 17:53:14 [warning X 1][/nas/content/live/kodehealthstg/wp-content/plugins/elementor-pro/modules/assets-manager/asset-types/icons/custom-icons.php::195] unlink(Font Awesome 6 Pro-Solid-900.otf): No such file or directory [array ( 'trace' => ' #0: Elementor\Core\Logger\Manager -> shutdown() ', )]
JS: showing 10 of 10
JS: 2022-09-21 17:17:30 [error X 23][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/common-modules.min.js?ver=3.7.3:2:51367] undefined is not an object (evaluating 'q.functionName.includes')
JS: 2022-09-21 17:17:43 [error X 2][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:646671] this.model.isValidChild is not a function. (In 'this.model.isValidChild(C)', 'this.model.isValidChild' is undefined)
JS: 2022-09-21 17:59:01 [error X 14][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:1060433] null is not an object (evaluating '$.getBoundingClientRect')
JS: 2022-09-21 19:18:34 [error X 5][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:691806] T.getContainer is not a function. (In 'T.getContainer()', 'T.getContainer' is undefined)
JS: 2022-09-30 17:36:27 [error X 41][https://kodehealthstg.wpengine.com/wp-includes/js/jquery/jquery.min.js?ver=3.6.0:2:40874] Attempted to assign to readonly property.
JS: 2022-10-10 18:04:20 [error X 819][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/lib/pickr/pickr.min.js?ver=1.5.0:2:19643] can't access property "toLowerCase", this.getColorRepresentation() is null
JS: 2022-10-10 22:15:30 [error X 5][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:1060432] can't access property "getBoundingClientRect", $ is null
JS: 2022-10-12 06:07:38 [error X 1][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:905570] can't access property "settings", T is undefined
JS: 2022-10-12 06:09:36 [error X 5][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/frontend.min.js?ver=3.7.3:2:30926] can't access property "model", elementor.settings.page is undefined
JS: 2022-10-17 22:59:40 [error X 2][https://kodehealthstg.wpengine.com/wp-content/plugins/elementor/assets/js/editor.min.js?ver=3.7.3:3:766747] can't access property "toLowerCase", T.originalEvent.key is undefined
== Elementor - Compatibility Tag ==
Elementor Pro: Compatibility not specified
== Elementor Pro - Compatibility Tag == ```
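Most of the JS log entries above are the same failure mode: a lookup yields `null` or `undefined` and a later property access throws (e.g. `can't access property "getBoundingClientRect", $ is null`). As an illustrative sketch only, and not Elementor's actual code, optional chaining turns that class of crash into a recoverable `null`; `getWidgetRect` is a hypothetical helper name:

```javascript
// Illustrative sketch: the logged errors come from code that assumes a
// DOM lookup succeeded. Without the `?.` guard below, calling
// element.getBoundingClientRect() throws a TypeError when element is null.
// With the guard, the expression short-circuits to undefined, and `?? null`
// normalizes that to null.
function getWidgetRect(element) {
  return element?.getBoundingClientRect() ?? null;
}

console.log(getWidgetRect(null)); // prints: null (no TypeError)
```

Callers can then branch on a `null` result instead of having the whole editor script abort at the first missing element.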
non_process
0
15,436
19,648,405,837
IssuesEvent
2022-01-10 01:38:33
plazi/community
https://api.github.com/repos/plazi/community
closed
to be processed: 10.1080/00305316.2021.2023056
process request
* please process, including holotype and GBIF First record of the ant genus Agraulomyrmex Prins, 1983 (Formicidae: Formicinae) from India, with description of a new species [oritentalInsects.56.2021.2023056.pdf](https://github.com/plazi/community/files/7835576/oritentalInsects.56.2021.2023056.pdf)
1.0
to be processed: 10.1080/00305316.2021.2023056 - * please process, including holotype and GBIF First record of the ant genus Agraulomyrmex Prins, 1983 (Formicidae: Formicinae) from India, with description of a new species [oritentalInsects.56.2021.2023056.pdf](https://github.com/plazi/community/files/7835576/oritentalInsects.56.2021.2023056.pdf)
process
to be processed please process including holotype and gbif first record of the ant genus agraulomyrmex prins formicidae formicinae from india with description of a new species
1
19,204
25,338,614,036
IssuesEvent
2022-11-18 19:11:37
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
How to get valgrind xml report
under investigation type: support / not a bug (process) team-OSS
Bazel version is 5.3.1. Here is my test job `bazel test --test_verbose_timeout_warnings --run_under='valgrind --tool=memcheck --leak-check=full --xml=yes --xml-file=valgrind.xml' --test_output=streamed //...` The tests running is ok, but where is the valgrind.xml that should be produced? Anyone can help? Thanks.
1.0
How to get valgrind xml report - Bazel version is 5.3.1. Here is my test job `bazel test --test_verbose_timeout_warnings --run_under='valgrind --tool=memcheck --leak-check=full --xml=yes --xml-file=valgrind.xml' --test_output=streamed //...` The tests running is ok, but where is the valgrind.xml that should be produced? Anyone can help? Thanks.
process
how to get valgrind xml report bazel version is here is my test job bazel test test verbose timeout warnings run under valgrind tool memcheck leak check full xml yes xml file valgrind xml test output streamed the tests running is ok but where is the valgrind xml that should be produced anyone can help thanks
1
108,666
11,597,868,741
IssuesEvent
2020-02-24 21:47:13
MinimineLP/dart-gson
https://api.github.com/repos/MinimineLP/dart-gson
closed
Documentation Error
documentation
In your documentation you put decode where I think you meant encode. ![image](https://user-images.githubusercontent.com/43582537/75190245-bcf6d480-5704-11ea-823e-d673e9a1b688.png)
1.0
Documentation Error - In your documentation you put decode where I think you meant encode. ![image](https://user-images.githubusercontent.com/43582537/75190245-bcf6d480-5704-11ea-823e-d673e9a1b688.png)
non_process
documentation error in your documentation you put decode where i think you meant encode
0
17,913
3,647,512,622
IssuesEvent
2016-02-16 01:18:10
EyeSeeTea/dhis2-android-sdk
https://api.github.com/repos/EyeSeeTea/dhis2-android-sdk
reopened
Retrieve OrgUnit levels with the SDK
high priority testing
There is an [api call](https://hnqis-dev-staging.psi-mis.org/api/organisationUnitLevels) to get the OrgUnit levels. The idea would be to get those OrgUnit levels, combine that information with the downloaded OrgUnits looking at the "ancestors information" of each [OrgUnit api call](https://hnqis-dev-staging.psi-mis.org/api/organisationUnits/noqKF88cFyi) to build the dependencies between levels and build the tree in our DB
1.0
Retrieve OrgUnit levels with the SDK - There is an [api call](https://hnqis-dev-staging.psi-mis.org/api/organisationUnitLevels) to get the OrgUnit levels. The idea would be to get those OrgUnit levels, combine that information with the downloaded OrgUnits looking at the "ancestors information" of each [OrgUnit api call](https://hnqis-dev-staging.psi-mis.org/api/organisationUnits/noqKF88cFyi) to build the dependencies between levels and build the tree in our DB
non_process
retrieve orgunit levels with the sdk there is an to get the orgunit levels the idea would be to get those orgunit levels combine that information with the downloaded orgunits looking at the ancestors information of each to build the dependencies between levels and build the tree in our db
0
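The DHIS2 record above describes the algorithm: fetch the org-unit levels from `/api/organisationUnitLevels`, then use each downloaded org unit's "ancestors" information to build the parent/child dependencies and assemble the tree. A minimal Python sketch of that combination step, assuming payload shapes like the API responses mentioned in the issue; `build_org_unit_tree` and all field handling here are illustrative, not the SDK's actual code:

```python
def build_org_unit_tree(levels, org_units):
    """Combine level metadata with per-unit ancestor lists into a tree.

    levels:    [{"level": int, "name": str}, ...]  (from organisationUnitLevels)
    org_units: [{"id": str, "name": str, "ancestors": [{"id": str}, ...]}, ...]
    Returns the list of root nodes; each node carries its children.
    """
    level_names = {lvl["level"]: lvl["name"] for lvl in levels}
    nodes = {}
    for ou in org_units:
        # An org unit with N ancestors sits at level N + 1.
        depth = len(ou.get("ancestors", [])) + 1
        nodes[ou["id"]] = {
            "id": ou["id"],
            "name": ou["name"],
            "level": depth,
            "level_name": level_names.get(depth),
            "children": [],
        }
    roots = []
    for ou in org_units:
        ancestors = ou.get("ancestors", [])
        # The last ancestor in the list is the closest one, i.e. the parent.
        parent_id = ancestors[-1]["id"] if ancestors else None
        if parent_id in nodes:
            nodes[parent_id]["children"].append(nodes[ou["id"]])
        else:
            roots.append(nodes[ou["id"]])
    return roots
```

The result can then be persisted to the local DB as parent-id rows or as the nested structure itself, whichever the schema prefers.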
18,685
24,594,929,192
IssuesEvent
2022-10-14 07:28:25
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[DID] Questionnaire responses > Responses are not getting mapped into DID datastore even though DID is Enabled
Bug Blocker P0 Response datastore Process: Fixed Process: Tested dev
Steps: 1. Signup or sign in to mobile app 2. Enroll to the study 3. Submit the responses only for active tasks 4. Go to the Response server 5. Go to DID datastore 6. Click on QuestionnaireResponse and Patient and observe AR: Questionnaire responses > Responses are not getting mapped into DID datastore ER: Questionnaire responses > Responses should get mapped into DID datastore
2.0
[DID] Questionnaire responses > Responses are not getting mapped into DID datastore even though DID is Enabled - Steps: 1. Signup or sign in to mobile app 2. Enroll to the study 3. Submit the responses only for active tasks 4. Go to the Response server 5. Go to DID datastore 6. Click on QuestionnaireResponse and Patient and observe AR: Questionnaire responses > Responses are not getting mapped into DID datastore ER: Questionnaire responses > Responses should get mapped into DID datastore
process
questionnaire responses responses are not getting mapped into did datastore even though did is enabled steps signup or sign in to mobile app enroll to the study submit the responses only for active tasks go to the response server go to did datastore click on questionnaireresponse and patient and observe ar questionnaire responses responses are not getting mapped into did datastore er questionnaire responses responses should get mapped into did datastore
1
96,815
28,018,980,332
IssuesEvent
2023-03-28 02:42:48
wikimedia/mediawiki-libs-node-cssjanus
https://api.github.com/repos/wikimedia/mediawiki-libs-node-cssjanus
closed
Supporting testing border-radius with three values
Type: Enhancement Category: Build
For #20 I would like to have some test with only three values. Example: ``` [ "border-radius: 5px 9px 7px", "border-radius: 9px 5px 9px 7px" ], [ "border-radius: 5px 9px 7px / 3px 4px", "border-radius: 9px 5px 9px 7px / 4px 3px" ] ``` But since cf8f1e0c434c132f2dd5cd3eb804a9fdf656e406 tests are performing round-trip assertions too, which doesn't work for the cases above. Can we add a per-case option to disable these types of tests?
1.0
Supporting testing border-radius with three values - For #20 I would like to have some test with only three values. Example: ``` [ "border-radius: 5px 9px 7px", "border-radius: 9px 5px 9px 7px" ], [ "border-radius: 5px 9px 7px / 3px 4px", "border-radius: 9px 5px 9px 7px / 4px 3px" ] ``` But since cf8f1e0c434c132f2dd5cd3eb804a9fdf656e406 tests are performing round-trip assertions too, which doesn't work for the cases above. Can we add a per-case option to disable these types of tests?
non_process
supporting testing border radius with three values for i would like to have some test with only three values example border radius border radius border radius border radius but since tests are performing round trip assertions too which doesn t work for the cases above can we add a per case option to disable these types of tests
0
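The cssjanus record above hinges on how a three-value `border-radius` shorthand expands and then mirrors for RTL. A hedged Python sketch (not cssjanus's actual implementation) of the corner flip, reproducing the expected pairs from the issue; note that a three-value input flips to a four-value result, which is exactly why a round-trip assertion cannot hold for these cases:

```python
def flip_border_radius(values):
    """Mirror a border-radius corner list (1-4 values) for RTL.

    CSS shorthand expansion to (top-left, top-right, bottom-right, bottom-left):
      [a]          -> (a, a, a, a)
      [a, b]       -> (a, b, a, b)
      [a, b, c]    -> (a, b, c, b)
      [a, b, c, d] -> (a, b, c, d)
    An RTL flip swaps left and right corners: tl<->tr and bl<->br.
    """
    tl, tr, br, bl = {
        1: lambda v: (v[0], v[0], v[0], v[0]),
        2: lambda v: (v[0], v[1], v[0], v[1]),
        3: lambda v: (v[0], v[1], v[2], v[1]),
        4: lambda v: (v[0], v[1], v[2], v[3]),
    }[len(values)](values)
    flipped = [tr, tl, bl, br]
    # Re-shorten where the four-value form is redundant.
    if flipped[3] == flipped[1]:
        flipped = flipped[:3]
        if flipped[2] == flipped[0]:
            flipped = flipped[:2]
            if flipped[1] == flipped[0]:
                flipped = flipped[:1]
    return flipped
```

A per-case flag to skip the round-trip assertion (as the issue asks) is therefore reasonable: flipping `5px 9px 7px` yields `9px 5px 9px 7px`, and flipping that back gives the four-value `5px 9px 7px 9px`, not the original three-value string.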
12,881
15,278,653,774
IssuesEvent
2021-02-23 02:00:43
DevExpress/testcafe-hammerhead
https://api.github.com/repos/DevExpress/testcafe-hammerhead
closed
Links in google bar are not proxied on https://calendar.google.com/calendar/
AREA: client AREA: server STATE: Stale SYSTEM: URL processing TYPE: enhancement
Go to https://calendar.google.com/calendar/ and try links in GoogleBar (menu with links to other Google services) - they are not proxied
1.0
Links in google bar are not proxied on https://calendar.google.com/calendar/ - Go to https://calendar.google.com/calendar/ and try links in GoogleBar (menu with links to other Google services) - they are not proxied
process
links in google bar are not proxied on go to and try links in googlebar menu with links to other google services they are not proxied
1
86,580
3,727,266,905
IssuesEvent
2016-03-06 05:34:04
markfinger/unfort
https://api.github.com/repos/markfinger/unfort
closed
Proxy-based hot swaps
Difficulty: Medium Priority Requires long walks and musing
Currently we're using prototype manipulation for ES modules. Seems to work well in practice as we're pretty defensive as to when we apply it. It does have some caveats though: - doesn't work with circular deps (the proxy is only injected after the initial binding) - firefox spits out warnings (probably justifiably) about prototype manipulation Chrome should be landing `Proxy` support in another week or two, so that's probably worth investigating as an alternative mechanism. Something like ``` const exports = Proxy({}, { set(target, prop, value) { target[prop] = value; }, get(target, prop) { if (!mod.executed) { executeModule(mod.name); } return target[prop]; } }); ``` Babel sets `__esModule` before any imports are run, so we should be able to use that as a flag. Might be messy as it uses `Object.defineProperty`. Using proxies will also allow lazy evaluation of hot swapped modules, similarly to how erlang handles hot swaps. This would work well in combination with `module.hot.changes` as you would evaluate the new modules once everything's been swapped. It is a bit more opaque though, might need to consider the edge-cases and provide some hooks (maybe something to force evaluation on swap?).
1.0
Proxy-based hot swaps - Currently we're using prototype manipulation for ES modules. Seems to work well in practice as we're pretty defensive as to when we apply it. It does have some caveats though: - doesn't work with circular deps (the proxy is only injected after the initial binding) - firefox spits out warnings (probably justifiably) about prototype manipulation Chrome should be landing `Proxy` support in another week or two, so that's probably worth investigating as an alternative mechanism. Something like ``` const exports = Proxy({}, { set(target, prop, value) { target[prop] = value; }, get(target, prop) { if (!mod.executed) { executeModule(mod.name); } return target[prop]; } }); ``` Babel sets `__esModule` before any imports are run, so we should be able to use that as a flag. Might be messy as it uses `Object.defineProperty`. Using proxies will also allow lazy evaluation of hot swapped modules, similarly to how erlang handles hot swaps. This would work well in combination with `module.hot.changes` as you would evaluate the new modules once everything's been swapped. It is a bit more opaque though, might need to consider the edge-cases and provide some hooks (maybe something to force evaluation on swap?).
non_process
proxy based hot swaps currently we re using prototype manipulation for es modules seems to work well in practice as we re pretty defensive as to when we apply it it does have some caveats though doesn t work with circular deps the proxy is only injected after the initial binding firefox spits out warnings probably justifiably about prototype manipulation chrome should be landing proxy support in another week or two so that s probably worth investigating as an alternative mechanism something like const exports proxy set target prop value target value get target prop if mod executed executemodule mod name return target babel sets esmodule before any imports are run so we should be able to use that as a flag might be messy as it uses object defineproperty using proxies will also allow lazy evaluation of hot swapped modules similarly to how erlang handles hot swaps this would work well in combination with module hot changes as you would evaluate the new modules once everything s been swapped it is a bit more opaque though might need to consider the edge cases and provide some hooks maybe something to force evaluation on swap
0
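The hot-swap record above proposes a `Proxy` whose `get` trap lazily executes a module on first access, so swapped modules are only re-evaluated when actually used. A rough Python analogue of that idea, for illustration only; the original sketch is JavaScript, and the class and method names here are made up:

```python
class LazyModuleProxy:
    """Python analogue of the JS Proxy sketch: attribute reads trigger
    evaluation of the module the first time they happen, and a hot swap
    simply invalidates the cached exports so the next read re-evaluates."""

    def __init__(self, name, execute):
        # `execute` runs the module body and returns its exports dict.
        object.__setattr__(self, "_name", name)
        object.__setattr__(self, "_execute", execute)
        object.__setattr__(self, "_exports", None)

    def _ensure(self):
        # Run the module body lazily, once, on first attribute access.
        if object.__getattribute__(self, "_exports") is None:
            execute = object.__getattribute__(self, "_execute")
            object.__setattr__(self, "_exports", execute())

    def __getattr__(self, attr):
        # Only called for names not found normally, i.e. exported bindings.
        self._ensure()
        return object.__getattribute__(self, "_exports")[attr]

    def invalidate(self, new_execute):
        # Called on hot swap: the next attribute read re-runs the new body.
        object.__setattr__(self, "_execute", new_execute)
        object.__setattr__(self, "_exports", None)
```

This mirrors the "lazy evaluation of hot swapped modules" behaviour described in the issue: after a swap, nothing runs until a consumer touches an export.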
217,124
24,312,798,349
IssuesEvent
2022-09-30 01:20:24
peterwkc85/Spring_Boot
https://api.github.com/repos/peterwkc85/Spring_Boot
opened
CVE-2022-38749 (Medium) detected in snakeyaml-1.17.jar
security vulnerability
## CVE-2022-38749 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.17.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /spring-boot-soap-client/spring-boot-soap-client/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/org/yaml/snakeyaml/1.17/snakeyaml-1.17.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-services-1.5.7.RELEASE.jar (Root Library) - spring-boot-starter-1.5.7.RELEASE.jar - :x: **snakeyaml-1.17.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stackoverflow. <p>Publish Date: 2022-09-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-38749>CVE-2022-38749</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027">https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027</a></p> <p>Release Date: 2022-09-05</p> <p>Fix Resolution: org.yaml:snakeyaml:1.31</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-38749 (Medium) detected in snakeyaml-1.17.jar - ## CVE-2022-38749 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.17.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /spring-boot-soap-client/spring-boot-soap-client/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/org/yaml/snakeyaml/1.17/snakeyaml-1.17.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-services-1.5.7.RELEASE.jar (Root Library) - spring-boot-starter-1.5.7.RELEASE.jar - :x: **snakeyaml-1.17.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stackoverflow. <p>Publish Date: 2022-09-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-38749>CVE-2022-38749</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027">https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027</a></p> <p>Release Date: 2022-09-05</p> <p>Fix Resolution: org.yaml:snakeyaml:1.31</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in snakeyaml jar cve medium severity vulnerability vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file spring boot soap client spring boot soap client pom xml path to vulnerable library root repository org yaml snakeyaml snakeyaml jar dependency hierarchy spring boot starter web services release jar root library spring boot starter release jar x snakeyaml jar vulnerable library vulnerability details using snakeyaml to parse untrusted yaml files may be vulnerable to denial of service attacks dos if the parser is running on user supplied input an attacker may supply content that causes the parser to crash by stackoverflow publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org yaml snakeyaml step up your open source security game with mend
0
423,677
12,300,279,187
IssuesEvent
2020-05-11 13:45:24
bisq-network/bisq
https://api.github.com/repos/bisq-network/bisq
closed
Cannot edit offer: after a few edits of the "sell BTC" offer, the BTC amount has increased
a:bug is:critical is:priority
From https://bisq.community/t/glitch-fixed-amount-of-btc-increased-from-0-0624-to-0-0626-after-a-few-edits-of-the-sell-btc-offer-editing-is-no-longer-available/7565 Summary: after a few edits, the BTC amount in a "sell" offer increased from 0.0624 to 0.0626 and editing became unavailable. I noticed that when I enter the first four digits in the “fixed price in EUR for 1 BTC” field and then click outside of the text field, the amount of BTC shifts. It shifts in both directions and I haven't noticed a pattern. Still, the “confirm” button is greyed out and I disabled the offer for the time being. It’s starting to look more and more like a bug to me. ![Screenshot from 2019-05-02 15-46-53](https://user-images.githubusercontent.com/45173186/57076325-0d262580-6cf2-11e9-8daf-9508930572e6.png) ![Screenshot from 2019-05-02 15-46-59](https://user-images.githubusercontent.com/45173186/57076326-0dbebc00-6cf2-11e9-8b5d-a90677c97d5d.png) ![Screenshot from 2019-05-02 15-47-38](https://user-images.githubusercontent.com/45173186/57076327-0dbebc00-6cf2-11e9-80fa-9bd4f29fcd66.png)
1.0
Cannot edit offer: after a few edits of the "sell BTC" offer, the BTC amount has increased - From https://bisq.community/t/glitch-fixed-amount-of-btc-increased-from-0-0624-to-0-0626-after-a-few-edits-of-the-sell-btc-offer-editing-is-no-longer-available/7565 Summary: after a few edits, the BTC amount in a "sell" offer increased from 0.0624 to 0.0626 and editing became unavailable. I noticed that when I enter the first four digits in the “fixed price in EUR for 1 BTC” field and then click outside of the text field, the amount of BTC shifts. It shifts in both directions and I haven't noticed a pattern. Still, the “confirm” button is greyed out and I disabled the offer for the time being. It’s starting to look more and more like a bug to me. ![Screenshot from 2019-05-02 15-46-53](https://user-images.githubusercontent.com/45173186/57076325-0d262580-6cf2-11e9-8daf-9508930572e6.png) ![Screenshot from 2019-05-02 15-46-59](https://user-images.githubusercontent.com/45173186/57076326-0dbebc00-6cf2-11e9-8b5d-a90677c97d5d.png) ![Screenshot from 2019-05-02 15-47-38](https://user-images.githubusercontent.com/45173186/57076327-0dbebc00-6cf2-11e9-80fa-9bd4f29fcd66.png)
non_process
cannot edit offer after a few edits of the sell btc offer the btc amount has increased from summary after a few edits the btc amount in a sell offer increased from to and editing became unavailable i noticed that when i enter the first four digits in the “fixed price in eur for btc” field and then click outside of the text field the amount of btc shifts it shifts in both directions and i haven t noticed a pattern still the “confirm” button is greyed out and i disabled the offer for the time being it’s starting to look more and more like a bug to me
0
4,220
7,179,255,685
IssuesEvent
2018-01-31 19:04:37
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
Process class doesn't expose some memory usage properties on Linux/MacOs
area-System.Diagnostics.Process enhancement os-linux tenet-compatibility
Tried on Ubuntu in a docker image and MacOs without docker, and I get this when running a simple web app: ``` WorkingSet64: 70 PeakWorkingSet64: VirtualMemorySize64: 23,789 PeakVirtualMemorySize64: PagedSystemMemorySize64: PeakPagedMemorySize64: ``` The values on Windows however are ``` WorkingSet64: 121 PeakWorkingSet64: 121 VirtualMemorySize64: 2,112,855 PeakVirtualMemorySize64: 2,112,856 PagedSystemMemorySize64: PeakPagedMemorySize64: 179 ``` Some __Peak__ values are not provided, and the list differs between windows and linux/mac. The code to repro is here: https://github.com/sebastienros/memoryusage PS: If I start a new `Process` dynamically, the value of `WorkingSet64` of the returned instance doesn't change over time even if the application is used intensively with memory allocations. The value remains very low (something like 3 MB) though the system reports more than 400 MB of used memory for the app. I'll file an issue based on your feedback, with a repro if necessary.
1.0
Process class doesn't expose some memory usage properties on Linux/MacOs - Tried on Ubuntu in a docker image and MacOs without docker, and I get this when running a simple web app: ``` WorkingSet64: 70 PeakWorkingSet64: VirtualMemorySize64: 23,789 PeakVirtualMemorySize64: PagedSystemMemorySize64: PeakPagedMemorySize64: ``` The values on Windows however are ``` WorkingSet64: 121 PeakWorkingSet64: 121 VirtualMemorySize64: 2,112,855 PeakVirtualMemorySize64: 2,112,856 PagedSystemMemorySize64: PeakPagedMemorySize64: 179 ``` Some __Peak__ values are not provided, and the list differs between windows and linux/mac. The code to repro is here: https://github.com/sebastienros/memoryusage PS: If I start a new `Process` dynamically, the value of `WorkingSet64` of the returned instance doesn't change over time even if the application is used intensively with memory allocations. The value remains very low (something like 3 MB) though the system reports more than 400 MB of used memory for the app. I'll file an issue based on your feedback, with a repro if necessary.
process
process class doesn t expose some memory usage properties on linux macos tried on ubuntu in a docker image and macos without docker and i get this when running a simple web app the values on windows however are some peak values are not provided and the list differs between windows and linux mac the code to repro is here ps if i start a new process dynamically the value of of the returned instance doesn t change overtime even if the application is used intensively with memory allocations the value remains very low something like mb though the system reports more than mb of used memory for the app i ll file an issue based on your feedback with a repro if necessary
1
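The record above reports that the Peak* properties come back empty on Linux. The kernel does expose those figures: `/proc/<pid>/status` carries `VmPeak` (peak virtual size) and `VmHWM` (peak resident "high water mark"), which is the data a Linux implementation of those properties would read. A minimal parser for that file format, shown here against a sample dump so it runs on any platform:

```python
def parse_proc_status(text):
    """Return the Vm* fields of a /proc/<pid>/status dump, in bytes.

    Lines look like "VmPeak:  2163712 kB"; non-memory lines are ignored.
    """
    fields = {}
    for line in text.splitlines():
        if line.startswith("Vm") and line.rstrip().endswith("kB"):
            key, rest = line.split(":", 1)
            fields[key.strip()] = int(rest.split()[0]) * 1024
    return fields

# Sample /proc/<pid>/status content (values are illustrative).
sample = """\
Name:   myapp
VmPeak:  2163712 kB
VmSize:  2111232 kB
VmHWM:    123904 kB
VmRSS:    121856 kB
"""
peaks = parse_proc_status(sample)
```

On a real Linux box the same function can be fed `open("/proc/self/status").read()`; `VmHWM` would back `PeakWorkingSet64` and `VmPeak` would back `PeakVirtualMemorySize64`.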
7,496
10,583,516,408
IssuesEvent
2019-10-08 13:51:42
kubeflow/pipelines
https://api.github.com/repos/kubeflow/pipelines
opened
Verify KFP 0.31 in Kubeflow 0.7
area/pipelines kind/process priority/p0
Kubeflow has been updated on master to ship the KFP 0.31. See kubeflow/pipelines#2239. Opening this issue to track if there is any automatic or manually testing that should be done to verify everything works. One big change is that Kubeflow 0.7 will be enabling workload identity on GKE (see also #1962). For backwards compatibility the secrets should still be created in the Kubeflow namespace which I believe is where pipelines is running but this will change in subsequent releases. /cc @jessiezcc @IronPan
1.0
Verify KFP 0.31 in Kubeflow 0.7 - Kubeflow has been updated on master to ship the KFP 0.31. See kubeflow/pipelines#2239. Opening this issue to track if there is any automatic or manually testing that should be done to verify everything works. One big change is that Kubeflow 0.7 will be enabling workload identity on GKE (see also #1962). For backwards compatibility the secrets should still be created in the Kubeflow namespace which I believe is where pipelines is running but this will change in subsequent releases. /cc @jessiezcc @IronPan
process
verify kfp in kubeflow kubeflow has been updated on master to ship the kfp see kubeflow pipelines opening this issue to track if there is any automatic or manually testing that should be done to verify everything works one big change is that kubeflow will be enabling workload identity on gke see also for backwards compatibility the secrets should still be created in the kubeflow namespace which i believe is where pipelines is running but this will change in subsequent releases cc jessiezcc ironpan
1
104,828
9,011,108,445
IssuesEvent
2019-02-05 13:54:58
enonic/app-contentstudio
https://api.github.com/repos/enonic/app-contentstudio
closed
Browse Panel - Publish button is not refreshed after publishing Modified content
Bug Not in Changelog Test is Failing
@sgauruseu commented on [Tue Dec 18 2018](https://github.com/enonic/app-contentstudio-old/issues/631) **Case 1:** 1. Select a **Modified** folder in root (CLICK on checkbox!) 2. Click on Publish button and publish the folder Bug the folder is published, but Publish button is not refreshed (still enabled) ![image](https://user-images.githubusercontent.com/3728712/50158198-82193580-02e4-11e9-9062-a227d59a04f4.png) Click on the button , publishing wizard appears: ![image](https://user-images.githubusercontent.com/3728712/50158157-63b33a00-02e4-11e9-9d5a-f0f78d72ef00.png) **Case 2** - Select an `Online` content (Click on the **checkbox** )and press Delete button on the toolbar , confirm it . Content is `Deleted` now Bug : Publish button should be enabled and buttons on the toolbar should update its statuses ![image](https://user-images.githubusercontent.com/3728712/50159700-62840c00-02e8-11e9-8265-230bfebe72ea.png) Open the context-menu for the content. Bug - context menu is not correct for the `deleted` content: ![image](https://user-images.githubusercontent.com/3728712/50279257-4607e100-045a-11e9-94e9-19df4a257ae2.png)
1.0
Browse Panel - Publish button is not refreshed after publishing Modified content - @sgauruseu commented on [Tue Dec 18 2018](https://github.com/enonic/app-contentstudio-old/issues/631) **Case 1:** 1. Select a **Modified** folder in root (CLICK on checkbox!) 2. Click on Publish button and publish the folder Bug the folder is published, but Publish button is not refreshed (still enabled) ![image](https://user-images.githubusercontent.com/3728712/50158198-82193580-02e4-11e9-9062-a227d59a04f4.png) Click on the button , publishing wizard appears: ![image](https://user-images.githubusercontent.com/3728712/50158157-63b33a00-02e4-11e9-9d5a-f0f78d72ef00.png) **Case 2** - Select an `Online` content (Click on the **checkbox** )and press Delete button on the toolbar , confirm it . Content is `Deleted` now Bug : Publish button should be enabled and buttons on the toolbar should update its statuses ![image](https://user-images.githubusercontent.com/3728712/50159700-62840c00-02e8-11e9-8265-230bfebe72ea.png) Open the context-menu for the content. Bug - context menu is not correct for the `deleted` content: ![image](https://user-images.githubusercontent.com/3728712/50279257-4607e100-045a-11e9-94e9-19df4a257ae2.png)
non_process
browse panel publish button is not refreshed after publishing modified content sgauruseu commented on case select a modified folder in root click on checkbox click on publish button and publish the folder bug the folder is published but publish button is not refreshed still enabled click on the button publishing wizard appears case select an online content click on the checkbox and press delete button on the toolbar confirm it content is deleted now bug publish button should be enabled and buttons on the toolbar should update its statuses open the context menu for the content bug context menu is not correct for the deleted content
0
1,936
4,763,973,844
IssuesEvent
2016-10-25 15:45:19
opentrials/opentrials
https://api.github.com/repos/opentrials/opentrials
closed
"sync_text_from_documentcloud" is skipping all documents, including public ones
4. Ready for Review bug Processors
On https://github.com/opentrials/processors/blob/2d0a23c/processors/sync_text_from_documentcloud/processor.py#L42-L47, we try to do `doc.get_full_text()` and skip the document if it raises a `NotImplementedError`. When developing that, this error was thrown when trying to extract text from a private document in DocumentCloud (their API only allows extracting from public documents). However, now all our documents are public, but we're still getting this error. Not sure if it's an issue with our code or their API.
1.0
"sync_text_from_documentcloud" is skipping all documents, including public ones - On https://github.com/opentrials/processors/blob/2d0a23c/processors/sync_text_from_documentcloud/processor.py#L42-L47, we try to do `doc.get_full_text()` and skip the document if it raises a `NotImplementedError`. When developing that, this error was thrown when trying to extract text from a private document in DocumentCloud (their API only allows extracting from public documents). However, now all our documents are public, but we're still getting this error. Not sure if it's an issue with our code or their API.
process
sync text from documentcloud is skipping all documents including public ones on we try to do doc get full text and skip the document if it raises a notimplementederror when developing that this error was thrown when trying to extract text from a private document in documentcloud their api only allows extracting from public documents however now all our documents are public but we re still getting this error not sure if it s an issue with our code or their api
1
3,463
6,545,301,838
IssuesEvent
2017-09-04 03:31:10
nion-software/nionswift
https://api.github.com/repos/nion-software/nionswift
opened
Converting to scalar needs to give complex conversion options
f - processing type - enhancement w3 - waiting
This requires the processing descriptors to be able to present a list of choices in a combo box.
1.0
Converting to scalar needs to give complex conversion options - This requires the processing descriptors to be able to present a list of choices in a combo box.
process
converting to scalar needs to give complex conversion options this requires the processing descriptors to be able to present a list of choices in a combo box
1
11,764
14,596,115,378
IssuesEvent
2020-12-20 14:32:04
symfony/symfony
https://api.github.com/repos/symfony/symfony
closed
[Process] add a process manager allowing to run commands in parallel (queueing up if needed)
Feature Process Stalled
A common pattern when writing batch scripts importing huge number of existing data into an app is to write the batch script so that it can execute its work in parallel. Running the script with eg. 8 instances in parallel allows to finish the import task in a fraction of the time. Depending on the platform in use, the developer might use forking, threading, pnctl and a myriad other techniques to achieve parallelism. There's no rocket science in that code, but it still quite a chore and bug-prone. I suggest that the Sf Process component gets extended with a process manager class, which can be used to execute multiple processes in parallel. Sample code - not based on Sf currently - is available at https://gist.github.com/gggeek/5956177 for discussion
1.0
[Process] add a process manager allowing to run commands in parallel (queueing up if needed) - A common pattern when writing batch scripts importing huge number of existing data into an app is to write the batch script so that it can execute its work in parallel. Running the script with eg. 8 instances in parallel allows to finish the import task in a fraction of the time. Depending on the platform in use, the developer might use forking, threading, pnctl and a myriad other techniques to achieve parallelism. There's no rocket science in that code, but it still quite a chore and bug-prone. I suggest that the Sf Process component gets extended with a process manager class, which can be used to execute multiple processes in parallel. Sample code - not based on Sf currently - is available at https://gist.github.com/gggeek/5956177 for discussion
process
add a process manager allowing to run commands in parallel queueing up if needed a common pattern when writing batch scripts importing huge number of existing data into an app is to write the batch script so that it can execute its work in parallel running the script with eg instances in parallel allows to finish the import task in a fraction of the time depending on the platform in use the developer might use forking threading pnctl and a myriad other techniques to achieve parallelism there s no rocket science in that code but it still quite a chore and bug prone i suggest that the sf process component gets extended with a process manager class which can be used to execute multiple processes in parallel sample code not based on sf currently is available at for discussion
1
14,463
17,569,595,983
IssuesEvent
2021-08-14 11:58:19
oasis-tcs/csaf
https://api.github.com/repos/oasis-tcs/csaf
closed
Support TC admin to publish CSAF CSD 01 / CSDPR 01
csaf 2.0 editorial oasis_tc_process
As a follow-up to the successful resolution of #331 the created package (zip archive) as GitHub pre-release may trigger non-material / formal changes per TC process and TC admin publication rules. This issue shall track the modification in a publicly accessible and transparent way. The starting package is at [csd-01-20210805-rc1](https://github.com/oasis-tcs/csaf/tree/csd-01-20210805-rc1) or per direct link: * [csd-01-20210805-rc1.zip](https://github.com/oasis-tcs/csaf/releases/download/csd-01-20210805-rc1/csd-01-20210805-rc1.zip)
1.0
Support TC admin to publish CSAF CSD 01 / CSDPR 01 - As a follow-up to the successful resolution of #331 the created package (zip archive) as GitHub pre-release may trigger non-material / formal changes per TC process and TC admin publication rules. This issue shall track the modification in a publicly accessible and transparent way. The starting package is at [csd-01-20210805-rc1](https://github.com/oasis-tcs/csaf/tree/csd-01-20210805-rc1) or per direct link: * [csd-01-20210805-rc1.zip](https://github.com/oasis-tcs/csaf/releases/download/csd-01-20210805-rc1/csd-01-20210805-rc1.zip)
process
support tc admin to publish csaf csd csdpr as a follow up to the successful resolution of the created package zip archive as github pre release may trigger non material formal changes per tc process and tc admin publication rules this issue shall track the modification in a publicly accessible and transparent way the starting package is at or per direct link
1
18,401
3,054,873,271
IssuesEvent
2015-08-13 07:32:52
testing-cabal/mock
https://api.github.com/repos/testing-cabal/mock
closed
Unexpected behaviour of assert_has_calls with empty list as argument
auto-migrated Priority-Medium Type-Defect upstream
``` What steps will reproduce the problem? Example: import mock mmock = mock.MagicMock() mmock.foobar("baz") mmock.assert_has_calls([]) # No exception raised. Why? mmock.assert_has_calls(['x']) # Exception raised as expected. What is the expected output? What do you see instead? Expected an exception when called assert_has_calls with empty list. What version of the product are you using? On what operating system? mock==1.0.1 Ubuntu 14.04.1 LTS Please provide any additional information below. _CallList -> __contains__ might check for empty list as value? ``` Original issue reported on code.google.com by `andras.s...@gmail.com` on 9 Jan 2015 at 1:47
1.0
Unexpected behaviour of assert_has_calls with empty list as argument - ``` What steps will reproduce the problem? Example: import mock mmock = mock.MagicMock() mmock.foobar("baz") mmock.assert_has_calls([]) # No exception raised. Why? mmock.assert_has_calls(['x']) # Exception raised as expected. What is the expected output? What do you see instead? Expected an exception when called assert_has_calls with empty list. What version of the product are you using? On what operating system? mock==1.0.1 Ubuntu 14.04.1 LTS Please provide any additional information below. _CallList -> __contains__ might check for empty list as value? ``` Original issue reported on code.google.com by `andras.s...@gmail.com` on 9 Jan 2015 at 1:47
non_process
unexpected behaviour of assert has calls with empty list as argument what steps will reproduce the problem example import mock mmock mock magicmock mmock foobar baz mmock assert has calls no exception raised why mmock assert has calls exception raised as expected what is the expected output what do you see instead expected an exception when called assert has calls with empty list what version of the product are you using on what operating system mock ubuntu lts please provide any additional information below calllist contains might check for empty list as value original issue reported on code google com by andras s gmail com on jan at
0
282,320
21,315,482,079
IssuesEvent
2022-04-16 07:37:19
e0543517/pe
https://api.github.com/repos/e0543517/pe
opened
DG: Missing extension for UC 08
severity.Low type.DocumentationBug
![image.png](https://raw.githubusercontent.com/e0543517/pe/main/files/472543ea-64c5-4077-ba6f-8509d4cccdb5.png) ![image.png](https://raw.githubusercontent.com/e0543517/pe/main/files/2c4ae660-265c-4c16-8d9e-4190298b5af0.png) UC08 in the DG does not state an important extension when the user tries to pass an interview where the position openings is less than the offered. <!--session: 1650086529810-4a6ab9b1-a0e0-42c8-82e5-120c135e3283--> <!--Version: Web v3.4.2-->
1.0
DG: Missing extension for UC 08 - ![image.png](https://raw.githubusercontent.com/e0543517/pe/main/files/472543ea-64c5-4077-ba6f-8509d4cccdb5.png) ![image.png](https://raw.githubusercontent.com/e0543517/pe/main/files/2c4ae660-265c-4c16-8d9e-4190298b5af0.png) UC08 in the DG does not state an important extension when the user tries to pass an interview where the position openings is less than the offered. <!--session: 1650086529810-4a6ab9b1-a0e0-42c8-82e5-120c135e3283--> <!--Version: Web v3.4.2-->
non_process
dg missing extension for uc in the dg does not state an important extension when the user tries to pass an interview where the position openings is less than the offered
0
22,104
30,635,276,941
IssuesEvent
2023-07-24 17:16:32
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
reopened
Terminal process title not updating
bug upstream important macos confirmed regression unreleased terminal-process
<!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ --> <!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ --> <!-- 🕮 Read our guide about submitting issues: https://github.com/microsoft/vscode/wiki/Submitting-Bugs-and-Suggestions --> <!-- 🔎 Search existing issues to avoid creating duplicates. --> <!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://code.visualstudio.com/insiders/ --> <!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in VS Code to pre-fill useful information. --> <!-- 🔧 Launch with `code --disable-extensions` to check. --> Does this issue occur when all extensions are disabled?: Yes <!-- 🪓 If you answered No above, use 'Help: Start Extension Bisect' from Command Palette to try to identify the cause. --> <!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. --> - VS Code Version: 1.74.3 (Universal) - OS Version: Darwin arm64 22.2.0 Steps to Reproduce: 1. Launch a terminal tab from the integrated terminal 2. Start a process (e.g. node REPL) I started noticing recently that the terminal tab titles don't update with new processes anymore: ![SCR-20230130-gtn](https://user-images.githubusercontent.com/134339/215546088-eccb3f19-040c-433e-910d-33b63c6f9909.png) The title on the right remains `zsh` and not `node`. This occurs in both zsh and bash terminals. My `terminal.integrated.tabs.title` is set to `${process}`. Changing it to `${process} test` adds the word `test` correctly but doesn't update the process.
1.0
Terminal process title not updating - <!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ --> <!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ --> <!-- 🕮 Read our guide about submitting issues: https://github.com/microsoft/vscode/wiki/Submitting-Bugs-and-Suggestions --> <!-- 🔎 Search existing issues to avoid creating duplicates. --> <!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://code.visualstudio.com/insiders/ --> <!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in VS Code to pre-fill useful information. --> <!-- 🔧 Launch with `code --disable-extensions` to check. --> Does this issue occur when all extensions are disabled?: Yes <!-- 🪓 If you answered No above, use 'Help: Start Extension Bisect' from Command Palette to try to identify the cause. --> <!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. --> - VS Code Version: 1.74.3 (Universal) - OS Version: Darwin arm64 22.2.0 Steps to Reproduce: 1. Launch a terminal tab from the integrated terminal 2. Start a process (e.g. node REPL) I started noticing recently that the terminal tab titles don't update with new processes anymore: ![SCR-20230130-gtn](https://user-images.githubusercontent.com/134339/215546088-eccb3f19-040c-433e-910d-33b63c6f9909.png) The title on the right remains `zsh` and not `node`. This occurs in both zsh and bash terminals. My `terminal.integrated.tabs.title` is set to `${process}`. Changing it to `${process} test` adds the word `test` correctly but doesn't update the process.
process
terminal process title not updating does this issue occur when all extensions are disabled yes report issue dialog can assist with this vs code version universal os version darwin steps to reproduce launch a terminal tab from the integrated terminal start a process e g node repl i started noticing recently that the terminal tab titles don t update with new processes anymore the title on the right remains zsh and not node this occurs in both zsh and bash terminals my terminal integrated tabs title is set to process changing it to process test adds the word test correctly but doesn t update the process
1
291,793
8,949,635,619
IssuesEvent
2019-01-25 08:22:22
conan-io/conan
https://api.github.com/repos/conan-io/conan
closed
system_requirements files not shared between host/docker
complex: low priority: medium stage: review type: feature
I have an application depends on a library libA. libA's recipe has a system_requirements() method which installs a system library. libA's artifact is stored remotely. When I built the applications, it failed to find the system library installed by the libA's recipe. Is system_requirements() transitive? If so, how to specify in the conanfile.txt or conanfile.py of application? By the way, I don't want to build libA from source.
1.0
system_requirements files not shared between host/docker - I have an application depends on a library libA. libA's recipe has a system_requirements() method which installs a system library. libA's artifact is stored remotely. When I built the applications, it failed to find the system library installed by the libA's recipe. Is system_requirements() transitive? If so, how to specify in the conanfile.txt or conanfile.py of application? By the way, I don't want to build libA from source.
non_process
system requirements files not shared between host docker i have an application depends on a library liba liba s recipe has a system requirements method which installs a system library liba s artifact is stored remotely when i built the applications it failed to find the system library installed by the liba s recipe is system requirements transitive if so how to specify in the conanfile txt or conanfile py of application by the way i don t want to build liba from source
0
10,331
13,162,979,480
IssuesEvent
2020-08-10 22:59:08
googleapis/google-cloud-ruby
https://api.github.com/repos/googleapis/google-cloud-ruby
closed
Migrate google-cloud-error_reporting to the microgenerator
type: process
Migrate google-cloud-error_reporting to the microgenerator. This involves the following steps: * [x] Write synth file and generate `google-cloud-error_reporting-v1beta1` * [x] Make sure the new libraries are configured in kokoro * [x] Release `google-cloud-error_reporting-v1beta1` * [ ] Switch `google-cloud-error_reporting` backend to the versioned gems. That is: * Rip out synth and all the generated code * Add `google-cloud-error_reporting-v1beta1` as a dependency * Update the veneer code to the microgenerator usage * [ ] Release `google-cloud-error_reporting` update, but remain pre-1.0. We are not GA-ing this library. I do not believe samples need to be updated, unless they invoke the low-level interface directly.
1.0
Migrate google-cloud-error_reporting to the microgenerator - Migrate google-cloud-error_reporting to the microgenerator. This involves the following steps: * [x] Write synth file and generate `google-cloud-error_reporting-v1beta1` * [x] Make sure the new libraries are configured in kokoro * [x] Release `google-cloud-error_reporting-v1beta1` * [ ] Switch `google-cloud-error_reporting` backend to the versioned gems. That is: * Rip out synth and all the generated code * Add `google-cloud-error_reporting-v1beta1` as a dependency * Update the veneer code to the microgenerator usage * [ ] Release `google-cloud-error_reporting` update, but remain pre-1.0. We are not GA-ing this library. I do not believe samples need to be updated, unless they invoke the low-level interface directly.
process
migrate google cloud error reporting to the microgenerator migrate google cloud error reporting to the microgenerator this involves the following steps write synth file and generate google cloud error reporting make sure the new libraries are configured in kokoro release google cloud error reporting switch google cloud error reporting backend to the versioned gems that is rip out synth and all the generated code add google cloud error reporting as a dependency update the veneer code to the microgenerator usage release google cloud error reporting update but remain pre we are not ga ing this library i do not believe samples need to be updated unless they invoke the low level interface directly
1
7,105
10,261,038,385
IssuesEvent
2019-08-22 08:58:39
dzhw/zofar
https://api.github.com/repos/dzhw/zofar
closed
the key for encrypted exports has to be repared
1 category: secondary.exporter category: service.processes category: zofar prio: 2 status: development status: discussion type: bug
the export doesn't work if I want it to be encrypted.
1.0
the key for encrypted exports has to be repared - the export doesn't work if I want it to be encrypted.
process
the key for encrypted exports has to be repared the export doesn t work if i want it to be encrypted
1
125,053
17,795,928,707
IssuesEvent
2021-08-31 22:13:11
ghc-dev/Ashley-Campos
https://api.github.com/repos/ghc-dev/Ashley-Campos
opened
CVE-2017-1000228 (High) detected in Nodev0.11, ejs-0.8.8.tgz
security vulnerability
## CVE-2017-1000228 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>Nodev0.11</b>, <b>ejs-0.8.8.tgz</b></p></summary> <p> <details><summary><b>ejs-0.8.8.tgz</b></p></summary> <p>Embedded JavaScript templates</p> <p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-0.8.8.tgz">https://registry.npmjs.org/ejs/-/ejs-0.8.8.tgz</a></p> <p>Path to dependency file: Ashley-Campos/package.json</p> <p>Path to vulnerable library: Ashley-Campos/node_modules/ejs-locals/node_modules/ejs/package.json</p> <p> Dependency Hierarchy: - ejs-locals-1.0.2.tgz (Root Library) - :x: **ejs-0.8.8.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Ashley-Campos/commit/93bde7a441bf435e2b02a4e5c2c7e6e2e2ebb8e0">93bde7a441bf435e2b02a4e5c2c7e6e2e2ebb8e0</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> nodejs ejs versions older than 2.5.3 is vulnerable to remote code execution due to weak input validation in ejs.renderFile() function <p>Publish Date: 2017-11-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-1000228>CVE-2017-1000228</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-1000228">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-1000228</a></p> <p>Release Date: 2017-11-17</p> <p>Fix Resolution: 2.5.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ejs","packageVersion":"0.8.8","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"ejs-locals:1.0.2;ejs:0.8.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.5.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2017-1000228","vulnerabilityDetails":"nodejs ejs versions older than 2.5.3 is vulnerable to remote code execution due to weak input validation in ejs.renderFile() function","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-1000228","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2017-1000228 (High) detected in Nodev0.11, ejs-0.8.8.tgz - ## CVE-2017-1000228 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>Nodev0.11</b>, <b>ejs-0.8.8.tgz</b></p></summary> <p> <details><summary><b>ejs-0.8.8.tgz</b></p></summary> <p>Embedded JavaScript templates</p> <p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-0.8.8.tgz">https://registry.npmjs.org/ejs/-/ejs-0.8.8.tgz</a></p> <p>Path to dependency file: Ashley-Campos/package.json</p> <p>Path to vulnerable library: Ashley-Campos/node_modules/ejs-locals/node_modules/ejs/package.json</p> <p> Dependency Hierarchy: - ejs-locals-1.0.2.tgz (Root Library) - :x: **ejs-0.8.8.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Ashley-Campos/commit/93bde7a441bf435e2b02a4e5c2c7e6e2e2ebb8e0">93bde7a441bf435e2b02a4e5c2c7e6e2e2ebb8e0</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> nodejs ejs versions older than 2.5.3 is vulnerable to remote code execution due to weak input validation in ejs.renderFile() function <p>Publish Date: 2017-11-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-1000228>CVE-2017-1000228</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-1000228">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-1000228</a></p> <p>Release Date: 2017-11-17</p> <p>Fix Resolution: 2.5.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ejs","packageVersion":"0.8.8","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"ejs-locals:1.0.2;ejs:0.8.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.5.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2017-1000228","vulnerabilityDetails":"nodejs ejs versions older than 2.5.3 is vulnerable to remote code execution due to weak input validation in ejs.renderFile() function","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-1000228","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in ejs tgz cve high severity vulnerability vulnerable libraries ejs tgz ejs tgz embedded javascript templates library home page a href path to dependency file ashley campos package json path to vulnerable library ashley campos node modules ejs locals node modules ejs package json dependency hierarchy ejs locals tgz root library x ejs tgz vulnerable library found in head commit a href found in base branch master vulnerability details nodejs ejs versions older than is vulnerable to remote code execution due to weak input validation in ejs renderfile function publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree ejs locals ejs isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails nodejs ejs versions older than is vulnerable to remote code execution due to weak input validation in ejs renderfile function vulnerabilityurl
0
10,155
13,044,162,611
IssuesEvent
2020-07-29 03:47:34
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `Database` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `Database` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @iosmanthus ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `Database` from TiDB - ## Description Port the scalar function `Database` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @iosmanthus ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function database from tidb description port the scalar function database from tidb to coprocessor score mentor s iosmanthus recommended skills rust programming learning materials already implemented expressions ported from tidb
1
4,086
7,043,020,132
IssuesEvent
2017-12-30 21:44:31
brucemiller/LaTeXML
https://api.github.com/repos/brucemiller/LaTeXML
closed
Bloated dbfile on macOS High Sierra
postprocessing question
I am experiencing a weird issue with bloated database files on macOS High Sierra (10.13)... Using a minimal laTeX file `HelloLaTeXML.tex` containing ``` \documentclass{article} \begin{document} Hello LaTeXML! \end{document} ``` and executing `latexml --dest=HelloLaTeXML.xml HelloLaTeXML.tex` followed by `latexmlpost --prescan --dbfile=HelloLaTeXML.db --dest=HelloLaTeXML.xhtml HelloLaTeXML.xml ` creates a database file `HelloLaTeXML.db` of over 40MB! Adding two more words to the tex-file results in over 70MB... As a consequence, for any reasonable tex-file the database quickly grows to GB's and the scan eventually needs to be aborted. Everything works as expected on my other machine with macOS Sierra (10.12). In the above example the database amounts to about 16kB and the size does not change by adding two more words. Just in case, I have installed lateXML 0.8.2 through macports and get the same results with TeXLive installed through macports or a separate installation of MacTeX (with the appropriate variant `port install latexml +mactex`). This is a mystery to me...
1.0
Bloated dbfile on macOS High Sierra - I am experiencing a weird issue with bloated database files on macOS High Sierra (10.13)... Using a minimal laTeX file `HelloLaTeXML.tex` containing ``` \documentclass{article} \begin{document} Hello LaTeXML! \end{document} ``` and executing `latexml --dest=HelloLaTeXML.xml HelloLaTeXML.tex` followed by `latexmlpost --prescan --dbfile=HelloLaTeXML.db --dest=HelloLaTeXML.xhtml HelloLaTeXML.xml ` creates a database file `HelloLaTeXML.db` of over 40MB! Adding two more words to the tex-file results in over 70MB... As a consequence, for any reasonable tex-file the database quickly grows to GB's and the scan eventually needs to be aborted. Everything works as expected on my other machine with macOS Sierra (10.12). In the above example the database amounts to about 16kB and the size does not change by adding two more words. Just in case, I have installed lateXML 0.8.2 through macports and get the same results with TeXLive installed through macports or a separate installation of MacTeX (with the appropriate variant `port install latexml +mactex`). This is a mystery to me...
process
bloated dbfile on macos high sierra i am experiencing a weird issue with bloated database files on macos high sierra using a minimal latex file hellolatexml tex containing documentclass article begin document hello latexml end document and executing latexml dest hellolatexml xml hellolatexml tex followed by latexmlpost prescan dbfile hellolatexml db dest hellolatexml xhtml hellolatexml xml creates a database file hellolatexml db of over adding two more words to the tex file results in over as a consequence for any reasonable tex file the database quickly grows to gb s and the scan eventually needs to be aborted everything works as expected on my other machine with macos sierra in the above example the database amounts to about and the size does not change by adding two more words just in case i have installed latexml through macports and get the same results with texlive installed through macports or a separate installation of mactex with the appropriate variant port install latexml mactex this is a mystery to me
1
276,220
23,976,224,034
IssuesEvent
2022-09-13 11:50:15
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
closed
[CI] SnapshotStressTestsIT testRandomActivities failing
:Distributed/Snapshot/Restore >test-failure Team:Distributed
Failed twice with the same error since Aug. 26th. Both times on 7.17. **Build scan:** https://gradle-enterprise.elastic.co/s/s7eg5nc4oyurw/tests/:server:internalClusterTest/org.elasticsearch.snapshots.SnapshotStressTestsIT/testRandomActivities **Reproduction line:** `gradlew ':server:internalClusterTest' --tests "org.elasticsearch.snapshots.SnapshotStressTestsIT.testRandomActivities" -Dtests.seed=6A9AA38A189498BB -Dtests.locale=ru-RU -Dtests.timezone=Pacific/Tarawa -Druntime.java=8` **Applicable branches:** 7.17 **Reproduces locally?:** Didn't try **Failure history:** https://gradle-enterprise.elastic.co/scans/tests?tests.container=org.elasticsearch.snapshots.SnapshotStressTestsIT&tests.test=testRandomActivities **Failure excerpt:** ``` org.elasticsearch.discovery.MasterNotDiscoveredException: (No message provided) at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$2.onTimeout(TransportMasterNodeAction.java:297) at org.elasticsearch.cluster.ClusterStateObserver$ContextPreservingListener.onTimeout(ClusterStateObserver.java:345) at org.elasticsearch.cluster.ClusterStateObserver$ObserverClusterStateListener.onTimeout(ClusterStateObserver.java:263) at org.elasticsearch.cluster.service.ClusterApplierService$NotifyTimeout.run(ClusterApplierService.java:660) at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:718) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) ```
1.0
[CI] SnapshotStressTestsIT testRandomActivities failing - Failed twice with the same error since Aug. 26th. Both times on 7.17.

**Build scan:** https://gradle-enterprise.elastic.co/s/s7eg5nc4oyurw/tests/:server:internalClusterTest/org.elasticsearch.snapshots.SnapshotStressTestsIT/testRandomActivities

**Reproduction line:** `gradlew ':server:internalClusterTest' --tests "org.elasticsearch.snapshots.SnapshotStressTestsIT.testRandomActivities" -Dtests.seed=6A9AA38A189498BB -Dtests.locale=ru-RU -Dtests.timezone=Pacific/Tarawa -Druntime.java=8`

**Applicable branches:** 7.17

**Reproduces locally?:** Didn't try

**Failure history:** https://gradle-enterprise.elastic.co/scans/tests?tests.container=org.elasticsearch.snapshots.SnapshotStressTestsIT&tests.test=testRandomActivities

**Failure excerpt:**

```
org.elasticsearch.discovery.MasterNotDiscoveredException: (No message provided)
  at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$2.onTimeout(TransportMasterNodeAction.java:297)
  at org.elasticsearch.cluster.ClusterStateObserver$ContextPreservingListener.onTimeout(ClusterStateObserver.java:345)
  at org.elasticsearch.cluster.ClusterStateObserver$ObserverClusterStateListener.onTimeout(ClusterStateObserver.java:263)
  at org.elasticsearch.cluster.service.ClusterApplierService$NotifyTimeout.run(ClusterApplierService.java:660)
  at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:718)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  at java.lang.Thread.run(Thread.java:748)
```
non_process
snapshotstresstestsit testrandomactivities failing failed twice with the same error since aug both times on build scan reproduction line gradlew server internalclustertest tests org elasticsearch snapshots snapshotstresstestsit testrandomactivities dtests seed dtests locale ru ru dtests timezone pacific tarawa druntime java applicable branches reproduces locally didn t try failure history failure excerpt org elasticsearch discovery masternotdiscoveredexception no message provided at org elasticsearch action support master transportmasternodeaction asyncsingleaction ontimeout transportmasternodeaction java at org elasticsearch cluster clusterstateobserver contextpreservinglistener ontimeout clusterstateobserver java at org elasticsearch cluster clusterstateobserver observerclusterstatelistener ontimeout clusterstateobserver java at org elasticsearch cluster service clusterapplierservice notifytimeout run clusterapplierservice java at org elasticsearch common util concurrent threadcontext contextpreservingrunnable run threadcontext java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java
0