Column summary (name, dtype, value range or number of distinct values):

Unnamed: 0 (int64): 0 to 832k
id (float64): 2.49B to 32.1B
type (stringclasses): 1 value
created_at (stringlengths): 19 to 19
repo (stringlengths): 7 to 112
repo_url (stringlengths): 36 to 141
action (stringclasses): 3 values
title (stringlengths): 1 to 744
labels (stringlengths): 4 to 574
body (stringlengths): 9 to 211k
index (stringclasses): 10 values
text_combine (stringlengths): 96 to 211k
label (stringclasses): 2 values
text (stringlengths): 96 to 188k
binary_label (int64): 0 to 1
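A minimal sketch of how one row in this schema can be assembled with pandas, using values from the first example row. The derivations shown for `text_combine` (title + " - " + body) and `binary_label` (1 for "process", 0 for "non_process") are inferred from the example rows, not stated by the source:

```python
import pandas as pd

# One row in the schema above (values taken from the first example record).
# NOTE: deriving `text_combine` and `binary_label` this way is an inference
# from the example rows, not documented behavior of the dataset.
row = {
    "Unnamed: 0": 16406,                  # int64 row counter
    "id": 21189410565.0,                  # float64 event id
    "type": "IssuesEvent",                # only 1 distinct value
    "created_at": "2022-04-08 15:43:55",  # fixed-length string (19 chars)
    "repo": "cypress-io/cypress",
    "repo_url": "https://api.github.com/repos/cypress-io/cypress",
    "action": "closed",                   # one of 3 classes
    "title": "Remove bundled typescript from packages/server",
    "labels": "stage: needs investigating type: performance process: dependencies",
    "body": "### Current behavior: TypeScript is bundled in `packages/server` ...",
    "index": "1.0",                       # one of 10 string classes
    "label": "process",                   # "process" or "non_process"
}
df = pd.DataFrame([row])
df["text_combine"] = df["title"] + " - " + df["body"]            # title + " - " + body
df["binary_label"] = (df["label"] == "process").astype("int64")  # process -> 1
print(df[["label", "binary_label"]])
```

Every "process" record in the examples below carries binary_label 1 and every "non_process" record carries 0, which is what the last line encodes.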
Unnamed: 0: 16,406
id: 21,189,410,565
type: IssuesEvent
created_at: 2022-04-08 15:43:55
repo: cypress-io/cypress
repo_url: https://api.github.com/repos/cypress-io/cypress
action: closed
title: Remove bundled typescript from packages/server
labels: stage: needs investigating type: performance 🏃‍♀️ process: dependencies
body: ### Current behavior: TypeScript is bundled in `packages/server` because of `dependency-tree`. It takes about 35MB of space. I should be removed to reduce the bundle size. ### Desired behavior: TypeScript should be removed from the bundle. We need to find ways to replace `dependency-tree`. ### Steps to reproduce: (app code and test code) `npm ls typescript` in `packages/server`. ### Versions develop branch.
index: 1.0
text_combine: Remove bundled typescript from packages/server - ### Current behavior: TypeScript is bundled in `packages/server` because of `dependency-tree`. It takes about 35MB of space. I should be removed to reduce the bundle size. ### Desired behavior: TypeScript should be removed from the bundle. We need to find ways to replace `dependency-tree`. ### Steps to reproduce: (app code and test code) `npm ls typescript` in `packages/server`. ### Versions develop branch.
label: process
text: remove bundled typescript from packages server current behavior typescript is bundled in packages server because of dependency tree it takes about of space i should be removed to reduce the bundle size desired behavior typescript should be removed from the bundle we need to find ways to replace dependency tree steps to reproduce app code and test code npm ls typescript in packages server versions develop branch
binary_label: 1
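Comparing the `text_combine` and `text` fields of this record suggests the cleaning step behind the `text` column: lowercase, URLs stripped, tokens containing digits dropped (note "35MB" vanishing from "takes about of space"), and punctuation collapsed to spaces. A rough sketch of one plausible cleaning function; the dataset's actual pipeline is not shown here, so this is an approximation:

```python
import re

def clean(s: str) -> str:
    """Approximate the `text` column: lowercase, strip URLs, drop tokens
    containing digits, turn punctuation into spaces, collapse whitespace.
    Reverse-engineered from the example rows; not the dataset's real code."""
    s = re.sub(r"https?://\S+", " ", s)      # remove URLs
    s = re.sub(r"\S*\d\S*", " ", s)          # drop tokens with digits, e.g. "35MB"
    s = re.sub(r"[^a-z\s]", " ", s.lower())  # punctuation -> space
    return re.sub(r"\s+", " ", s).strip()    # collapse runs of whitespace

print(clean("Remove bundled typescript from packages/server - It takes about 35MB of space."))
# -> "remove bundled typescript from packages server it takes about of space"
```

This reproduces the record's `text` prefix, though it does not capture every quirk of the real pipeline (for instance, some records below show emoji surviving into `text`, which this sketch would strip).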
Unnamed: 0: 13,958
id: 16,738,408,552
type: IssuesEvent
created_at: 2021-06-11 06:41:53
repo: largecats/blog
repo_url: https://api.github.com/repos/largecats/blog
action: opened
title: Stream Processing 101
labels: /blog/2021/06/10/stream-processing-101/ Gitalk
body: https://largecats.github.io/blog/2021/06/10/stream-processing-101/ This post is based on a sharing I did on general concepts in stream processing.
index: 1.0
text_combine: Stream Processing 101 - https://largecats.github.io/blog/2021/06/10/stream-processing-101/ This post is based on a sharing I did on general concepts in stream processing.
label: process
text: stream processing this post is based on a sharing i did on general concepts in stream processing
binary_label: 1
Unnamed: 0: 21,968
id: 30,463,807,084
type: IssuesEvent
created_at: 2023-07-17 08:56:49
repo: open-telemetry/opentelemetry-collector-contrib
repo_url: https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
action: closed
title: [processor/resourcedetection] Docker detector doesn't et any attributes
labels: bug priority:p1 processor/resourcedetection
body: ### What happened? After migrating the detection processor to the new config interface in https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/23253, the docker detector stopped setting any attributes ### Collector version 0.81.0
index: 1.0
text_combine: [processor/resourcedetection] Docker detector doesn't et any attributes - ### What happened? After migrating the detection processor to the new config interface in https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/23253, the docker detector stopped setting any attributes ### Collector version 0.81.0
label: process
text: docker detector doesn t et any attributes what happened after migrating the detection processor to the new config interface in the docker detector stopped setting any attributes collector version
binary_label: 1
Unnamed: 0: 489,683
id: 14,111,391,483
type: IssuesEvent
created_at: 2020-11-07 00:05:28
repo: AY2021S1-CS2113-T13-3/tp
repo_url: https://api.github.com/repos/AY2021S1-CS2113-T13-3/tp
action: closed
title: Improve CompareCommand such that same user should not be able to compare with himself
labels: priority.High severity.High type.Enhancement
body: Currently, `CompareCommand` allows same user to compare with him- or herself, which should not be the case. Also, the current user should not appear in the userlist for comparison.
index: 1.0
text_combine: Improve CompareCommand such that same user should not be able to compare with himself - Currently, `CompareCommand` allows same user to compare with him- or herself, which should not be the case. Also, the current user should not appear in the userlist for comparison.
label: non_process
text: improve comparecommand such that same user should not be able to compare with himself currently comparecommand allows same user to compare with him or herself which should not be the case also the current user should not appear in the userlist for comparison
binary_label: 0
Unnamed: 0: 15,496
id: 19,703,235,530
type: IssuesEvent
created_at: 2022-01-12 18:50:13
repo: googleapis/google-auth-library-python-httplib2
repo_url: https://api.github.com/repos/googleapis/google-auth-library-python-httplib2
action: opened
title: Your .repo-metadata.json file has a problem 🤒
labels: type: process repo-metadata: lint
body: You have a problem with your .repo-metadata.json file: Result of scan 📈: * client_documentation must match pattern "^https://.*" in .repo-metadata.json ☝️ Once you correct these problems, you can close this issue. Reach out to **go/github-automation** if you have any questions.
index: 1.0
text_combine: Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file: Result of scan 📈: * client_documentation must match pattern "^https://.*" in .repo-metadata.json ☝️ Once you correct these problems, you can close this issue. Reach out to **go/github-automation** if you have any questions.
label: process
text: your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 client documentation must match pattern in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
binary_label: 1
Unnamed: 0: 4,921
id: 3,104,778,026
type: IssuesEvent
created_at: 2015-08-31 17:36:54
repo: teeworlds/teeworlds
repo_url: https://api.github.com/repos/teeworlds/teeworlds
action: closed
title: game layer tile set picker + brush manipulation
labels: code review editor
body: In `editor.cpp` in `DoMapEditor` where the tile set picker gets active it should also copy the current layer's `m_Game` to have a proper rotate and flip behaviour: `m_TilesetPicker.m_Game = t->m_Game;`
index: 1.0
text_combine: game layer tile set picker + brush manipulation - In `editor.cpp` in `DoMapEditor` where the tile set picker gets active it should also copy the current layer's `m_Game` to have a proper rotate and flip behaviour: `m_TilesetPicker.m_Game = t->m_Game;`
label: non_process
text: game layer tile set picker brush manipulation in editor cpp in domapeditor where the tile set picker gets active it should also copy the current layer s m game to have a proper rotate and flip behaviour m tilesetpicker m game t m game
binary_label: 0
Unnamed: 0: 352,244
id: 10,533,836,035
type: IssuesEvent
created_at: 2019-10-01 13:47:51
repo: jenkins-x/jx
repo_url: https://api.github.com/repos/jenkins-x/jx
action: closed
title: Only release the jx binary once versions has fully passed
labels: area/fox area/quality area/versions backlog kind/enhancement priority/critical
body: Right now we release the jx binary on every PR. We should make these "pre-releases" and do the final release once versions has passed. This will need to update github, chocolatey, the website, brew dockerhub and dev-env
index: 1.0
text_combine: Only release the jx binary once versions has fully passed - Right now we release the jx binary on every PR. We should make these "pre-releases" and do the final release once versions has passed. This will need to update github, chocolatey, the website, brew dockerhub and dev-env
label: non_process
text: only release the jx binary once versions has fully passed right now we release the jx binary on every pr we should make these pre releases and do the final release once versions has passed this will need to update github chocolatey the website brew dockerhub and dev env
binary_label: 0
Unnamed: 0: 14,421
id: 17,470,591,885
type: IssuesEvent
created_at: 2021-08-07 03:58:35
repo: qgis/QGIS
repo_url: https://api.github.com/repos/qgis/QGIS
action: closed
title: Raster Layer Zonal Stats columns issue when raster has no CRS
labels: Processing Bug
body: ### What is the bug or the crash? When a raster without any defined CRS is used, the second column normally used to store the area of each region is not always well defined (obviously...) which causes problem depending on the output format: - **Temporary output**: the _area_ column name is not present thus all folowing column names are shift to the left and the values of the last column (mean) are lost. - **CSV output**: correctly loaded with a placeholder _field_2_ in the second column for _area_. The file itself however miss the second column name (colnames stored as `zone,,sum,count,min,max,mean`). - **GPKG output**: the _area_ column name is not present but this time it doesn't cause any shift, all values are where they should be. ### Steps to reproduce the issue ```python from osgeo import gdal, osr import numpy as np import tempfile tempdir = tempfile.gettempdir() # Create raster without any CRS array = np.array([[8, 8, 3, 3], [7, 7, 3, 3], [7, 4, 4, 4]]) raster_noCRS = os.path.join(tempdir, 'raster_no_CRS.tif') driver = gdal.GetDriverByName('GTiff') ds = driver.Create(raster_noCRS, xsize=4, ysize=3, bands=1, eType=gdal.GDT_Byte) ds.GetRasterBand(1).WriteArray(array) ds = None # --- # Run Raster Layer Zonal Stats tool # with TEMPORARY output params = { 'INPUT':raster_noCRS, 'BAND':1, 'ZONES':raster_noCRS, 'ZONES_BAND':1, 'REF_LAYER':0, 'OUTPUT_TABLE':'TEMPORARY_OUTPUT'} table = processing.run("native:rasterlayerzonalstats", params)['OUTPUT_TABLE'] # Resulting column names and values of first feature table.fields().names() table.getFeature(1).attributes() # ['zone', 'sum', 'count', 'min', 'max', 'mean'] # [ 8.0, 2.0, 16, 2.0, 8.0, 8.0] # [ 8.0, 16, 2.0, 8.0, 8.0, 8.0] (expected) # --- # Run Raster Layer Zonal Stats tool # with CSV output stats_csv = os.path.join(tempdir, 'stats.csv') params['OUTPUT_TABLE'] = stats_csv table = processing.run("native:rasterlayerzonalstats", params) table = QgsVectorLayer(stats_csv) # Resulting column names and values of first feature table.fields().names() table.getFeature(1).attributes() # [ 'zone', 'field_2', 'sum', 'count', 'min', 'max', 'mean'] # ['8.00000000', '2.00000000', '16.00000000', '2', '8.00000000', '8.00000000', '8.00000000'] # In plain text, the CSV looks like that # zone,,sum,count,min,max,mean # 8.00000000,2.00000000,16.00000000,"2",8.00000000,8.00000000,8.00000000 # 4.00000000,3.00000000,12.00000000,"3",4.00000000,4.00000000,4.00000000 # 7.00000000,3.00000000,21.00000000,"3",7.00000000,7.00000000,7.00000000 # 3.00000000,4.00000000,12.00000000,"4",3.00000000,3.00000000,3.00000000 # --- # Run Raster Layer Zonal Stats tool # with GPKG output stats_gpkg = os.path.join(tempdir, 'stats.gpkg') params['OUTPUT_TABLE'] = stats_gpkg processing.run("native:rasterlayerzonalstats", params) table = QgsVectorLayer(stats_gpkg) # Resulting column names and values of first feature table.fields().names() table.getFeature(1).attributes() # ['fid', 'zone', 'sum', 'count', 'min', 'max', 'mean'] # [ 1, 8.0, 16.0, 2, 8.0, 8.0, 8.0] ``` ### Versions QGIS version 3.20.0-Odense QGIS code revision decaadb Qt version 5.15.2 Python version 3.9.5 GDAL/OGR version 3.3.0 GEOS version 3.9.1-CAPI-1.14.2 Windows 10 Version 1809
index: 1.0
text_combine: Raster Layer Zonal Stats columns issue when raster has no CRS - ### What is the bug or the crash? When a raster without any defined CRS is used, the second column normally used to store the area of each region is not always well defined (obviously...) which causes problem depending on the output format: - **Temporary output**: the _area_ column name is not present thus all folowing column names are shift to the left and the values of the last column (mean) are lost. - **CSV output**: correctly loaded with a placeholder _field_2_ in the second column for _area_. The file itself however miss the second column name (colnames stored as `zone,,sum,count,min,max,mean`). - **GPKG output**: the _area_ column name is not present but this time it doesn't cause any shift, all values are where they should be. ### Steps to reproduce the issue ```python from osgeo import gdal, osr import numpy as np import tempfile tempdir = tempfile.gettempdir() # Create raster without any CRS array = np.array([[8, 8, 3, 3], [7, 7, 3, 3], [7, 4, 4, 4]]) raster_noCRS = os.path.join(tempdir, 'raster_no_CRS.tif') driver = gdal.GetDriverByName('GTiff') ds = driver.Create(raster_noCRS, xsize=4, ysize=3, bands=1, eType=gdal.GDT_Byte) ds.GetRasterBand(1).WriteArray(array) ds = None # --- # Run Raster Layer Zonal Stats tool # with TEMPORARY output params = { 'INPUT':raster_noCRS, 'BAND':1, 'ZONES':raster_noCRS, 'ZONES_BAND':1, 'REF_LAYER':0, 'OUTPUT_TABLE':'TEMPORARY_OUTPUT'} table = processing.run("native:rasterlayerzonalstats", params)['OUTPUT_TABLE'] # Resulting column names and values of first feature table.fields().names() table.getFeature(1).attributes() # ['zone', 'sum', 'count', 'min', 'max', 'mean'] # [ 8.0, 2.0, 16, 2.0, 8.0, 8.0] # [ 8.0, 16, 2.0, 8.0, 8.0, 8.0] (expected) # --- # Run Raster Layer Zonal Stats tool # with CSV output stats_csv = os.path.join(tempdir, 'stats.csv') params['OUTPUT_TABLE'] = stats_csv table = processing.run("native:rasterlayerzonalstats", params) table = QgsVectorLayer(stats_csv) # Resulting column names and values of first feature table.fields().names() table.getFeature(1).attributes() # [ 'zone', 'field_2', 'sum', 'count', 'min', 'max', 'mean'] # ['8.00000000', '2.00000000', '16.00000000', '2', '8.00000000', '8.00000000', '8.00000000'] # In plain text, the CSV looks like that # zone,,sum,count,min,max,mean # 8.00000000,2.00000000,16.00000000,"2",8.00000000,8.00000000,8.00000000 # 4.00000000,3.00000000,12.00000000,"3",4.00000000,4.00000000,4.00000000 # 7.00000000,3.00000000,21.00000000,"3",7.00000000,7.00000000,7.00000000 # 3.00000000,4.00000000,12.00000000,"4",3.00000000,3.00000000,3.00000000 # --- # Run Raster Layer Zonal Stats tool # with GPKG output stats_gpkg = os.path.join(tempdir, 'stats.gpkg') params['OUTPUT_TABLE'] = stats_gpkg processing.run("native:rasterlayerzonalstats", params) table = QgsVectorLayer(stats_gpkg) # Resulting column names and values of first feature table.fields().names() table.getFeature(1).attributes() # ['fid', 'zone', 'sum', 'count', 'min', 'max', 'mean'] # [ 1, 8.0, 16.0, 2, 8.0, 8.0, 8.0] ``` ### Versions QGIS version 3.20.0-Odense QGIS code revision decaadb Qt version 5.15.2 Python version 3.9.5 GDAL/OGR version 3.3.0 GEOS version 3.9.1-CAPI-1.14.2 Windows 10 Version 1809
label: process
text: raster layer zonal stats columns issue when raster has no crs what is the bug or the crash when a raster without any defined crs is used the second column normally used to store the area of each region is not always well defined obviously which causes problem depending on the output format temporary output the area column name is not present thus all folowing column names are shift to the left and the values of the last column mean are lost csv output correctly loaded with a placeholder field in the second column for area the file itself however miss the second column name colnames stored as zone sum count min max mean gpkg output the area column name is not present but this time it doesn t cause any shift all values are where they should be steps to reproduce the issue python from osgeo import gdal osr import numpy as np import tempfile tempdir tempfile gettempdir create raster without any crs array np array raster nocrs os path join tempdir raster no crs tif driver gdal getdriverbyname gtiff ds driver create raster nocrs xsize ysize bands etype gdal gdt byte ds getrasterband writearray array ds none run raster layer zonal stats tool with temporary output params input raster nocrs band zones raster nocrs zones band ref layer output table temporary output table processing run native rasterlayerzonalstats params resulting column names and values of first feature table fields names table getfeature attributes expected run raster layer zonal stats tool with csv output stats csv os path join tempdir stats csv params stats csv table processing run native rasterlayerzonalstats params table qgsvectorlayer stats csv resulting column names and values of first feature table fields names table getfeature attributes in plain text the csv looks like that zone sum count min max mean run raster layer zonal stats tool with gpkg output stats gpkg os path join tempdir stats gpkg params stats gpkg processing run native rasterlayerzonalstats params table qgsvectorlayer stats gpkg resulting column names and values of first feature table fields names table getfeature attributes versions qgis version odense qgis code revision decaadb qt version python version gdal ogr version geos version capi windows version
binary_label: 1
Unnamed: 0: 26,037
id: 4,553,048,623
type: IssuesEvent
created_at: 2016-09-13 02:17:10
repo: cakephp/cakephp
repo_url: https://api.github.com/repos/cakephp/cakephp
action: closed
title: getMockForModel uses default datasource when cacheMethods are disabled
labels: Defect testing
body: When cacheMethods are disabled in CakePHP 2.7.* getMockForModel makes atleast once use of default datasource instead of test datasource.
index: 1.0
text_combine: getMockForModel uses default datasource when cacheMethods are disabled - When cacheMethods are disabled in CakePHP 2.7.* getMockForModel makes atleast once use of default datasource instead of test datasource.
label: non_process
text: getmockformodel uses default datasource when cachemethods are disabled when cachemethods are disabled in cakephp getmockformodel makes atleast once use of default datasource instead of test datasource
binary_label: 0
Unnamed: 0: 7,412
id: 10,534,345,177
type: IssuesEvent
created_at: 2019-10-01 14:40:13
repo: codurance/open-space
repo_url: https://api.github.com/repos/codurance/open-space
action: closed
title: Create tslint and .idea code style config
labels: 🐁 1 📊 process
body: Don't spend too much time just pick default for intellij idea and utilize a standard for tslint (like AirBnB)
index: 1.0
text_combine: Create tslint and .idea code style config - Don't spend too much time just pick default for intellij idea and utilize a standard for tslint (like AirBnB)
label: process
text: create tslint and idea code style config don t spend too much time just pick default for intellij idea and utilize a standard for tslint like airbnb
binary_label: 1
Unnamed: 0: 12,188
id: 3,588,208,773
type: IssuesEvent
created_at: 2016-01-30 21:24:07
repo: JayNewstrom/ScreenSwitcher
repo_url: https://api.github.com/repos/JayNewstrom/ScreenSwitcher
action: closed
title: Readme updates
labels: Documentation
body: - Document `Screen` lifecycle - Comparison to fragments - What the goal of the project is (small, focused, back stack, configurable) (no presenter, etc) - Document that you can do awesome custom transitions using the Animations api
index: 1.0
text_combine: Readme updates - - Document `Screen` lifecycle - Comparison to fragments - What the goal of the project is (small, focused, back stack, configurable) (no presenter, etc) - Document that you can do awesome custom transitions using the Animations api
label: non_process
text: readme updates document screen lifecycle comparison to fragments what the goal of the project is small focused back stack configurable no presenter etc document that you can do awesome custom transitions using the animations api
binary_label: 0
Unnamed: 0: 19,557
id: 25,878,528,262
type: IssuesEvent
created_at: 2022-12-14 09:39:52
repo: pytorch/pytorch
repo_url: https://api.github.com/repos/pytorch/pytorch
action: closed
title: DISABLED test_terminate_exit (__main__.ForkTest)
labels: module: multiprocessing triaged module: flaky-tests skipped
body: Platforms: linux This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_terminate_exit&suite=ForkTest&file=test_multiprocessing_spawn.py) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/7313915078). Over the past 3 hours, it has been determined flaky in 1 workflow(s) with 1 red and 1 green. cc @VitalyFedyunin
index: 1.0
text_combine: DISABLED test_terminate_exit (__main__.ForkTest) - Platforms: linux This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_terminate_exit&suite=ForkTest&file=test_multiprocessing_spawn.py) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/7313915078). Over the past 3 hours, it has been determined flaky in 1 workflow(s) with 1 red and 1 green. cc @VitalyFedyunin
label: process
text: disabled test terminate exit main forktest platforms linux this test was disabled because it is failing in ci see and the most recent trunk over the past hours it has been determined flaky in workflow s with red and green cc vitalyfedyunin
binary_label: 1
Unnamed: 0: 290,214
id: 21,872,322,505
type: IssuesEvent
created_at: 2022-05-19 06:58:42
repo: fragcolor-xyz/chainblocks
repo_url: https://api.github.com/repos/fragcolor-xyz/chainblocks
action: opened
title: Document how to set up and use the Chainblocks-Unity integration plugin
labels: Project: Documentation
body: **Pitch** As a game developer interested in using Fragnova assets in the Unity game engine, I want to know how to set up and use the Chainblocks-Unity plugin so that I may be able to load/save Fragnova game assets in my Unity projects. **Acceptance criteria** A brief guide on how to set up and use the Chainblocks-Unity plugin. **Additional context** Document should be hosted at `https://docs.fragcolor.xyz/how-to/use-unity-plugin.
index: 1.0
text_combine: Document how to set up and use the Chainblocks-Unity integration plugin - **Pitch** As a game developer interested in using Fragnova assets in the Unity game engine, I want to know how to set up and use the Chainblocks-Unity plugin so that I may be able to load/save Fragnova game assets in my Unity projects. **Acceptance criteria** A brief guide on how to set up and use the Chainblocks-Unity plugin. **Additional context** Document should be hosted at `https://docs.fragcolor.xyz/how-to/use-unity-plugin.
label: non_process
text: document how to set up and use the chainblocks unity integration plugin pitch as a game developer interested in using fragnova assets in the unity game engine i want to know how to set up and use the chainblocks unity plugin so that i may be able to load save fragnova game assets in my unity projects acceptance criteria a brief guide on how to set up and use the chainblocks unity plugin additional context document should be hosted at
binary_label: 0
Unnamed: 0: 373,007
id: 11,031,739,987
type: IssuesEvent
created_at: 2019-12-06 18:28:32
repo: PREreview/rapid-prereview
repo_url: https://api.github.com/repos/PREreview/rapid-prereview
action: closed
title: data for chrome and firefox webstores
labels: priority
body: - [x] icon (@halmos) must respect the exact width and height. See screenshot below and https://developer.chrome.com/webstore/images#icons <img width="1213" alt="Screen Shot 2019-12-01 at 11 56 44 AM" src="https://user-images.githubusercontent.com/1141327/69919456-ccca8980-1431-11ea-8d8a-f0a9da0df144.png"> - [x] detailed description (@dasaderi or @majohansson ) see screenshot below for instructions <img width="1162" alt="Screen Shot 2019-12-01 at 11 56 37 AM" src="https://user-images.githubusercontent.com/1141327/69919457-ccca8980-1431-11ea-9ae3-7eef454983c4.png"> - [x] provide info to chrome and firefox stores and resubmit the application (@sballesteros) ----- ## Reviewer comments: ### Chrome > Your item did not comply with the following section of our [Program Policies](https://developer.chrome.com/webstore/program_policies): > "Spam and Placement in the Store" > Item has a blank description field, or missing icons or screenshots, and appears to be suspicious. ### Firefox > Jack Thompson wrote: > Please expand this add-on listing. Describe the purpose and features/interface changes/actions of this add-on in more detail and/or have images demonstrating the features. For suggestions on how to create a better listing see https://developer.mozilla.org/Add-ons/Listing . > Thank you.
index: 1.0
text_combine: data for chrome and firefox webstores - - [x] icon (@halmos) must respect the exact width and height. See screenshot below and https://developer.chrome.com/webstore/images#icons <img width="1213" alt="Screen Shot 2019-12-01 at 11 56 44 AM" src="https://user-images.githubusercontent.com/1141327/69919456-ccca8980-1431-11ea-8d8a-f0a9da0df144.png"> - [x] detailed description (@dasaderi or @majohansson ) see screenshot below for instructions <img width="1162" alt="Screen Shot 2019-12-01 at 11 56 37 AM" src="https://user-images.githubusercontent.com/1141327/69919457-ccca8980-1431-11ea-9ae3-7eef454983c4.png"> - [x] provide info to chrome and firefox stores and resubmit the application (@sballesteros) ----- ## Reviewer comments: ### Chrome > Your item did not comply with the following section of our [Program Policies](https://developer.chrome.com/webstore/program_policies): > "Spam and Placement in the Store" > Item has a blank description field, or missing icons or screenshots, and appears to be suspicious. ### Firefox > Jack Thompson wrote: > Please expand this add-on listing. Describe the purpose and features/interface changes/actions of this add-on in more detail and/or have images demonstrating the features. For suggestions on how to create a better listing see https://developer.mozilla.org/Add-ons/Listing . > Thank you.
label: non_process
text: data for chrome and firefox webstores icon halmos must respect the exact width and height see screenshot below and img width alt screen shot at am src detailed description dasaderi or majohansson see screenshot below for instructions img width alt screen shot at am src provide info to chrome and firefox stores and resubmit the application sballesteros reviewer comments chrome your item did not comply with the following section of our spam and placement in the store item has a blank description field or missing icons or screenshots and appears to be suspicious firefox jack thompson wrote please expand this add on listing describe the purpose and features interface changes actions of this add on in more detail and or have images demonstrating the features for suggestions on how to create a better listing see thank you
binary_label: 0
Unnamed: 0: 3,542
id: 6,583,927,215
type: IssuesEvent
created_at: 2017-09-13 08:16:56
repo: DynareTeam/dynare
repo_url: https://api.github.com/repos/DynareTeam/dynare
action: closed
title: Block histval from accepting positive lags
labels: bug preprocessor
body: `histval` serves for setting past values, not future ones. The Matlab-files are not able to handle this case. The code ``` % Test whether preprocessor recognizes state variables introduced by optimal policy Github #1193 var pai, c, n, r, a; varexo u; parameters beta, rho, epsilon, omega, phi, gamma; beta=0.99; gamma=3; omega=17; epsilon=8; phi=1; rho=0.95; model; a = rho*a(-1)+u; 1/c = beta*r/(c(+1)*pai(+1)); pai*(pai-1)/c = beta*pai(+1)*(pai(+1)-1)/c(+1)+epsilon*phi*n^(gamma+1)/omega -exp(a)*n*(epsilon-1)/(omega*c); exp(a)*n = c+(omega/2)*(pai-1)^2; end; initval; r=1; end; histval; a(5)=1; end; steady_state_model; a = 0; pai = beta*r; c = find_c(0.96,pai,beta,epsilon,phi,gamma,omega); n = c+(omega/2)*(pai-1)^2; end; shocks; var u; stderr 0.008; var u; periods 1; values 1; end; options_.dr_display_tol=0; planner_objective(ln(c)-phi*((n^(1+gamma))/(1+gamma))); ramsey_policy(planner_discount=0.99,order=1,instruments=(r),periods=500); ``` causes a crash in `simult_` because of the `a(5)` in the `histval`-block
index: 1.0
text_combine: Block histval from accepting positive lags - `histval` serves for setting past values, not future ones. The Matlab-files are not able to handle this case. The code ``` % Test whether preprocessor recognizes state variables introduced by optimal policy Github #1193 var pai, c, n, r, a; varexo u; parameters beta, rho, epsilon, omega, phi, gamma; beta=0.99; gamma=3; omega=17; epsilon=8; phi=1; rho=0.95; model; a = rho*a(-1)+u; 1/c = beta*r/(c(+1)*pai(+1)); pai*(pai-1)/c = beta*pai(+1)*(pai(+1)-1)/c(+1)+epsilon*phi*n^(gamma+1)/omega -exp(a)*n*(epsilon-1)/(omega*c); exp(a)*n = c+(omega/2)*(pai-1)^2; end; initval; r=1; end; histval; a(5)=1; end; steady_state_model; a = 0; pai = beta*r; c = find_c(0.96,pai,beta,epsilon,phi,gamma,omega); n = c+(omega/2)*(pai-1)^2; end; shocks; var u; stderr 0.008; var u; periods 1; values 1; end; options_.dr_display_tol=0; planner_objective(ln(c)-phi*((n^(1+gamma))/(1+gamma))); ramsey_policy(planner_discount=0.99,order=1,instruments=(r),periods=500); ``` causes a crash in `simult_` because of the `a(5)` in the `histval`-block
label: process
text: block histval from accepting positive lags histval serves for setting past values not future ones the matlab files are not able to handle this case the code test whether preprocessor recognizes state variables introduced by optimal policy github var pai c n r a varexo u parameters beta rho epsilon omega phi gamma beta gamma omega epsilon phi rho model a rho a u c beta r c pai pai pai c beta pai pai c epsilon phi n gamma omega exp a n epsilon omega c exp a n c omega pai end initval r end histval a end steady state model a pai beta r c find c pai beta epsilon phi gamma omega n c omega pai end shocks var u stderr var u periods values end options dr display tol planner objective ln c phi n gamma gamma ramsey policy planner discount order instruments r periods causes a crash in simult because of the a in the histval block
binary_label: 1
Unnamed: 0: 294,914
id: 9,050,393,574
type: IssuesEvent
created_at: 2019-02-12 08:33:29
repo: Activiti/Activiti
repo_url: https://api.github.com/repos/Activiti/Activiti
action: closed
title: modeling UI does not show all process variables
labels: priority1
body: I loaded some [example process variable json](https://github.com/Activiti/activiti-cloud-runtime-bundle-service/blob/develop/activiti-cloud-starter-runtime-bundle/src/test/resources/processes/variable-mapping-extensions.json#L11) into the modeling UI. After [changing all the values of the variables to be strings (which I shouldn't have to do)](https://github.com/Activiti/Activiti/issues/2488) then I was able to set up the variables and save them. But I can't then see them all in the UI, only a subset: ![image](https://user-images.githubusercontent.com/22754674/52584288-8f0c5e00-2e29-11e9-9da2-8761a8cfac6b.png) If I then use the '+' button to add another then that one appears in the json but also not in the table... it only shows the first three
index: 1.0
text_combine: modeling UI does not show all process variables - I loaded some [example process variable json](https://github.com/Activiti/activiti-cloud-runtime-bundle-service/blob/develop/activiti-cloud-starter-runtime-bundle/src/test/resources/processes/variable-mapping-extensions.json#L11) into the modeling UI. After [changing all the values of the variables to be strings (which I shouldn't have to do)](https://github.com/Activiti/Activiti/issues/2488) then I was able to set up the variables and save them. But I can't then see them all in the UI, only a subset: ![image](https://user-images.githubusercontent.com/22754674/52584288-8f0c5e00-2e29-11e9-9da2-8761a8cfac6b.png) If I then use the '+' button to add another then that one appears in the json but also not in the table... it only shows the first three
label: non_process
text: modeling ui does not show all process variables i loaded some into the modeling ui after then i was able to set up the variables and save them but i can t then see them all in the ui only a subset if i then use the button to add another then that one appears in the json but also not in the table it only shows the first three
binary_label: 0
Unnamed: 0: 20,949
id: 27,808,697,648
type: IssuesEvent
created_at: 2023-03-17 23:23:03
repo: hsmusic/hsmusic-data
repo_url: https://api.github.com/repos/hsmusic/hsmusic-data
action: opened
title: Three in the Morning (4 1/3 Hours Late Remix)/(Ngame's Bowmix) fixes
labels: scope: fandom type: data fix type: involved process what: art tags what: lyrics what: references & leitmotifs/samples
body: (4 1/3 Hours Late Remix): - missing reference to Michael Bowman Remix (Ngame's Bowmix): - missing lyrics - missing reference to Three in the Morning (4 1/3 Hours Late Remix; CaNon edit) - missing reference/sample to newly included material Both: - remove Snowman art tag Prereq (makes things easier): - #140
index: 1.0
text_combine: Three in the Morning (4 1/3 Hours Late Remix)/(Ngame's Bowmix) fixes - (4 1/3 Hours Late Remix): - missing reference to Michael Bowman Remix (Ngame's Bowmix): - missing lyrics - missing reference to Three in the Morning (4 1/3 Hours Late Remix; CaNon edit) - missing reference/sample to newly included material Both: - remove Snowman art tag Prereq (makes things easier): - #140
label: process
text: three in the morning hours late remix ngame s bowmix fixes hours late remix missing reference to michael bowman remix ngame s bowmix missing lyrics missing reference to three in the morning hours late remix canon edit missing reference sample to newly included material both remove snowman art tag prereq makes things easier
binary_label: 1
4,518
7,360,194,192
IssuesEvent
2018-03-10 16:05:35
ODiogoSilva/assemblerflow
https://api.github.com/repos/ODiogoSilva/assemblerflow
opened
Add better error detection and handling for assembler processes
bug process
As of now, failure in the assembler processes may trigger an error in the `.status` file but it does not trigger the error for the nextflow process. This leads to the assembly not being performed due to some reason and the resume cannot repeat the process, since it terminated successfully as far as nextflow knows. The `optional true` tag should be removed from these processes to force the error in these processes and the error strategy could be changed to `ignore`, so that it does not compromise the rest of the pipeline. Coupled with the adaptive RAM directive in #26 , this should allow for a more robust and informative approach.
1.0
Add better error detection and handling for assembler processes - As of now, failure in the assembler processes may trigger an error in the `.status` file but it does not trigger the error for the nextflow process. This leads to the assembly not being performed due to some reason and the resume cannot repeat the process, since it terminated successfully as far as nextflow knows. The `optional true` tag should be removed from these processes to force the error in these processes and the error strategy could be changed to `ignore`, so that it does not compromise the rest of the pipeline. Coupled with the adaptive RAM directive in #26 , this should allow for a more robust and informative approach.
process
add better error detection and handling for assembler processes as of now failure in the assembler processes may trigger an error in the status file but it does not trigger the error for the nextflow process this leads to the assembly not being performed due to some reason and the resume cannot repeat the process since it terminated successfully as far as nextflow knows the optional true tag should be removed from these processes to force the error in these processes and the error strategy could be changed to ignore so that it does not compromise the rest of the pipeline coupled with the adaptive ram directive in this should allow for a more robust and informative approach
1
10,970
12,992,127,640
IssuesEvent
2020-07-23 06:02:10
ValveSoftware/Proton
https://api.github.com/repos/ValveSoftware/Proton
closed
BATTLETECH (637090)
.NET Game compatibility - Unofficial
I wish to throw this out there to see if anyone knows why this Unity3D game fails to launch when forced to use proton? At present it crashes and spits out a error log similar to the one linked below. I know there is a native version but it uses OpenGL and has significant performance issues, so I want to test the d3d11 version to do some comparisons. [steam-637090.log](https://github.com/ValveSoftware/Proton/files/3766307/steam-637090.log)
True
BATTLETECH (637090) - I wish to throw this out there to see if anyone knows why this Unity3D game fails to launch when forced to use proton? At present it crashes and spits out a error log similar to the one linked below. I know there is a native version but it uses OpenGL and has significant performance issues, so I want to test the d3d11 version to do some comparisons. [steam-637090.log](https://github.com/ValveSoftware/Proton/files/3766307/steam-637090.log)
non_process
battletech i wish to throw this out there to see if anyone knows why this game fails to launch when forced to use proton at present it crashes and spits out a error log similar to the one linked below i know there is a native version but it uses opengl and has significant performance issues so i want to test the version to do some comparisons
0
584,145
17,407,536,261
IssuesEvent
2021-08-03 08:10:26
ballerina-platform/ballerina-dev-website
https://api.github.com/repos/ballerina-platform/ballerina-dev-website
closed
Enable Pushing Selected Changes to the Ballerina Live Site
Area/Backend Area/Docs Priority/Highest Type/NewFeature
**Description:** Need to find and implement the optimal way to enable pushing selected changes to the Ballerina live site. **Describe your problem(s)** **Describe your solution(s)** **Related Issues (optional):** <!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> **Suggested Labels (optional):** <!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels--> **Suggested Assignees (optional):** <!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
1.0
Enable Pushing Selected Changes to the Ballerina Live Site - **Description:** Need to find and implement the optimal way to enable pushing selected changes to the Ballerina live site. **Describe your problem(s)** **Describe your solution(s)** **Related Issues (optional):** <!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> **Suggested Labels (optional):** <!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels--> **Suggested Assignees (optional):** <!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
non_process
enable pushing selected changes to the ballerina live site description need to find and implement the optimal way to enable pushing selected changes to the ballerina live site describe your problem s describe your solution s related issues optional suggested labels optional suggested assignees optional
0
137,007
20,027,412,224
IssuesEvent
2022-02-01 23:11:28
Hyperobjekt/crs
https://api.github.com/repos/Hyperobjekt/crs
opened
Mockup map with client edits
Design
**Copy changes** “Schoolboard” → “School District” “Severity” → “Progress” “Action” → “Activity” **Shading + Key** Binary: introduced, passed (change terminology of severity) Don't have the same shape for school board and state level in key **Federal Icon** Incorporate, have a line connecting it to D.C.
1.0
Mockup map with client edits - **Copy changes** “Schoolboard” → “School District” “Severity” → “Progress” “Action” → “Activity” **Shading + Key** Binary: introduced, passed (change terminology of severity) Don't have the same shape for school board and state level in key **Federal Icon** Incorporate, have a line connecting it to D.C.
non_process
mockup map with client edits copy changes “schoolboard” → “school district” “severity” → “progress” “action” → “activity” shading key binary introduced passed change terminology of severity don t have the same shape for school board and state level in key federal icon incorporate have a line connecting it to d c
0
17,804
23,728,978,274
IssuesEvent
2022-08-30 22:49:54
googleapis/gapic-generator-python
https://api.github.com/repos/googleapis/gapic-generator-python
opened
Implement Showcase integration tests for transport=rest
type: process priority: p1
Implement Showcase integration tests and add them to CI for transport=REST: * with numeric enums turned on * with numeric enums turned off This is arguably a GA blocker for GA'ing REGAPIC. (This issue filed as a refinement of #1404)
1.0
Implement Showcase integration tests for transport=rest - Implement Showcase integration tests and add them to CI for transport=REST: * with numeric enums turned on * with numeric enums turned off This is arguably a GA blocker for GA'ing REGAPIC. (This issue filed as a refinement of #1404)
process
implement showcase integration tests for transport rest implement showcase integration tests and add them to ci for transport rest with numeric enums turned on with numeric enums turned off this is arguably a ga blocker for ga ing regapic this issue filed as a refinement of
1
272
2,700,749,578
IssuesEvent
2015-04-04 14:39:00
cs2103jan2015-f10-3c/main
https://api.github.com/repos/cs2103jan2015-f10-3c/main
closed
data processing support to show dates on top
component.DataProcessor priority.high type.task
e.g: display this month Tuesday, 24 Mar 2015 1. 07:00-08:00 goes to school 2. 12:00-13:00 lunch w OG mates Wednesday, 25 Mar 2015 1. 18:00-19:00 early dinner w besties 2. 21:00-22:00 night movies
1.0
data processing support to show dates on top - e.g: display this month Tuesday, 24 Mar 2015 1. 07:00-08:00 goes to school 2. 12:00-13:00 lunch w OG mates Wednesday, 25 Mar 2015 1. 18:00-19:00 early dinner w besties 2. 21:00-22:00 night movies
process
data processing support to show dates on top e g display this month tuesday mar goes to school lunch w og mates wednesday mar early dinner w besties night movies
1
812,988
30,441,299,745
IssuesEvent
2023-07-15 04:58:54
MDAnalysis/mdanalysis
https://api.github.com/repos/MDAnalysis/mdanalysis
closed
[docs] core/topologyattrs page format is broken
help wanted Component-Docs Priority-Critical
## Expected behavior ## All docs pages are well-formatted. ## Actual behavior ## Doc page [11.3.3. Topology attribute objects — [MDAnalysis.core.topologyattrs](https://docs.mdanalysis.org/stable/documentation_pages/core/topologyattrs.html) is not formatted correctly (see screen shot, sorry, long) - footer with authors on the side - classes and functions are partially hideen by side menu and at end instead of integrated into page ![image](https://github.com/MDAnalysis/mdanalysis/assets/237980/3b5c4d77-e4af-4eef-9cfd-2dfd1b062e13) ## Code to reproduce the behavior ## Load https://docs.mdanalysis.org/stable/documentation_pages/core/topologyattrs.html#MDAnalysis.core.topologyattrs.Masses.radius_of_gyration in desktop web browser ## Current version of MDAnalysis ## - docs version 2.5.0
1.0
[docs] core/topologyattrs page format is broken - ## Expected behavior ## All docs pages are well-formatted. ## Actual behavior ## Doc page [11.3.3. Topology attribute objects — [MDAnalysis.core.topologyattrs](https://docs.mdanalysis.org/stable/documentation_pages/core/topologyattrs.html) is not formatted correctly (see screen shot, sorry, long) - footer with authors on the side - classes and functions are partially hideen by side menu and at end instead of integrated into page ![image](https://github.com/MDAnalysis/mdanalysis/assets/237980/3b5c4d77-e4af-4eef-9cfd-2dfd1b062e13) ## Code to reproduce the behavior ## Load https://docs.mdanalysis.org/stable/documentation_pages/core/topologyattrs.html#MDAnalysis.core.topologyattrs.Masses.radius_of_gyration in desktop web browser ## Current version of MDAnalysis ## - docs version 2.5.0
non_process
core topologyattrs page format is broken expected behavior all docs pages are well formatted actual behavior doc page is not formatted correctly see screen shot sorry long footer with authors on the side classes and functions are partially hideen by side menu and at end instead of integrated into page code to reproduce the behavior load in desktop web browser current version of mdanalysis docs version
0
18,271
24,350,499,475
IssuesEvent
2022-10-02 21:58:33
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
'Delete holes' bug when 'area' argument is used
Processing Bug
### What is the bug or the crash? I discovered that 'Delete holes' does not work reliable when you use the 'area' argument i.e. when you want to delete all holes smaller than a specified area. Sometimes holes larger than the specified area are incorrectly removed. Therefore I have created a small example in an attached geopackage file which shows the problem. ### Steps to reproduce the issue Open the attached geopackage which is uploaded below within the zip file. It contains one layer named "delete_holes" defined in the Swedish CRS "SWEREF 99 TM" i.e. the unit are meters. There is only one polygon in the layer and the outer ring is a square (20x20 meters) and all four inner rings are also squares and with these sizes: 4 (2x2) , 9 (3x3) , 16 (4x4) , 36 (6x6) When 'Delete holes' is used with the area argument 12 then only the two smaller holes (with areas 4 and 9) should be removed, but as you can see in a screenshot below, the largest hole with area 36 is also removed. [delete_holes.zip](https://github.com/qgis/QGIS/files/9262143/delete_holes.zip) ![delete_holes_dialog](https://user-images.githubusercontent.com/1504507/182919790-d0cb0586-b587-4c21-8df5-b5c5469b6d8b.gif) ![delete_holes_incorrect_result](https://user-images.githubusercontent.com/1504507/182919835-88860ec4-22bb-4246-98a8-2fac3f695804.gif) ### Versions QGIS 3.22 but the problem is also in the current master and I will soon create a pull request. As you can see in an attached screenshot I am using the 'master' branch code as of github commit '4861ee8a2f' ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [ ] I tried with a new QGIS profile ### Additional context Regarding the QGIS version, I discovered the problem when using Windows 10 and QGIS 3.22 but the problem is still there in the 'master' branch. 
I know that since I decided to try and solve the problem myself, so I have setup the development environment for my Linux Ubuntu, and when debugging it **I have found the problem and will create a pull request**. So, you do not need to try figure out what the problem is since I intend to very soon (at least within 24 hours and probably sooner) create a pull request, when I have finished a unit test case, referring to this github issue.
1.0
'Delete holes' bug when 'area' argument is used - ### What is the bug or the crash? I discovered that 'Delete holes' does not work reliable when you use the 'area' argument i.e. when you want to delete all holes smaller than a specified area. Sometimes holes larger than the specified area are incorrectly removed. Therefore I have created a small example in an attached geopackage file which shows the problem. ### Steps to reproduce the issue Open the attached geopackage which is uploaded below within the zip file. It contains one layer named "delete_holes" defined in the Swedish CRS "SWEREF 99 TM" i.e. the unit are meters. There is only one polygon in the layer and the outer ring is a square (20x20 meters) and all four inner rings are also squares and with these sizes: 4 (2x2) , 9 (3x3) , 16 (4x4) , 36 (6x6) When 'Delete holes' is used with the area argument 12 then only the two smaller holes (with areas 4 and 9) should be removed, but as you can see in a screenshot below, the largest hole with area 36 is also removed. [delete_holes.zip](https://github.com/qgis/QGIS/files/9262143/delete_holes.zip) ![delete_holes_dialog](https://user-images.githubusercontent.com/1504507/182919790-d0cb0586-b587-4c21-8df5-b5c5469b6d8b.gif) ![delete_holes_incorrect_result](https://user-images.githubusercontent.com/1504507/182919835-88860ec4-22bb-4246-98a8-2fac3f695804.gif) ### Versions QGIS 3.22 but the problem is also in the current master and I will soon create a pull request. As you can see in an attached screenshot I am using the 'master' branch code as of github commit '4861ee8a2f' ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [ ] I tried with a new QGIS profile ### Additional context Regarding the QGIS version, I discovered the problem when using Windows 10 and QGIS 3.22 but the problem is still there in the 'master' branch. 
I know that since I decided to try and solve the problem myself, so I have setup the development environment for my Linux Ubuntu, and when debugging it **I have found the problem and will create a pull request**. So, you do not need to try figure out what the problem is since I intend to very soon (at least within 24 hours and probably sooner) create a pull request, when I have finished a unit test case, referring to this github issue.
process
delete holes bug when area argument is used what is the bug or the crash i discovered that delete holes does not work reliable when you use the area argument i e when you want to delete all holes smaller than a specified area sometimes holes larger than the specified area are incorrectly removed therefore i have created a small example in an attached geopackage file which shows the problem steps to reproduce the issue open the attached geopackage which is uploaded below within the zip file it contains one layer named delete holes defined in the swedish crs sweref tm i e the unit are meters there is only one polygon in the layer and the outer ring is a square meters and all four inner rings are also squares and with these sizes when delete holes is used with the area argument then only the two smaller holes with areas and should be removed but as you can see in a screenshot below the largest hole with area is also removed versions qgis but the problem is also in the current master and i will soon create a pull request as you can see in an attached screenshot i am using the master branch code as of github commit supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context regarding the qgis version i discovered the problem when using windows and qgis but the problem is still there in the master branch i know that since i decided to try and solve the problem myself so i have setup the development environment for my linux ubuntu and when debugging it i have found the problem and will create a pull request so you do not need to try figure out what the problem is since i intend to very soon at least within hours and probably sooner create a pull request when i have finished a unit test case referring to this github issue
1
67,373
27,820,781,289
IssuesEvent
2023-03-19 07:45:44
aws/aws-sdk-go-v2
https://api.github.com/repos/aws/aws-sdk-go-v2
closed
rds data api- need way to get all exceptions from BatchExecuteStatement
feature-request service-api
### Describe the feature When using BatchExecuteStatement using rdsdata api, a single error from the batch is returned in the error object, with the message stating: `. Call getNextException to see other errors in the batch.` There seems to be no way to actually call getNextException or access the rest of the errors in the batch. This is a huge issue for me since it makes it impossible to know which statements of the batch has failed/succeeded. ### Use Case This is needed whenever there is a need to know which statements of a batch has failed or succeeded, which I would say is almost always. In my case I want to retry statements that failed later, but without knowing what has succeeded or failed I have to roll back and retry the entire batch which is very inefficient. ### Proposed Solution I would prefer not having to call a function to get the next exception if possible, I think the best solution would be to have a custom error type that can be casted to which contains an array of the sub errors in the batch. It would also be nice if the sub errors were typed (fk contraint error etc) and the parameters of the statement that failed (from the parameterSets input). However, the most important thing is a way to retrieve all the errors. ### Other Information _No response_ ### Acknowledgements - [ ] I may be able to implement this feature request - [ ] This feature might incur a breaking change ### AWS Go SDK version used V2 @latest ### Go version used 1.18
1.0
rds data api- need way to get all exceptions from BatchExecuteStatement - ### Describe the feature When using BatchExecuteStatement using rdsdata api, a single error from the batch is returned in the error object, with the message stating: `. Call getNextException to see other errors in the batch.` There seems to be no way to actually call getNextException or access the rest of the errors in the batch. This is a huge issue for me since it makes it impossible to know which statements of the batch has failed/succeeded. ### Use Case This is needed whenever there is a need to know which statements of a batch has failed or succeeded, which I would say is almost always. In my case I want to retry statements that failed later, but without knowing what has succeeded or failed I have to roll back and retry the entire batch which is very inefficient. ### Proposed Solution I would prefer not having to call a function to get the next exception if possible, I think the best solution would be to have a custom error type that can be casted to which contains an array of the sub errors in the batch. It would also be nice if the sub errors were typed (fk contraint error etc) and the parameters of the statement that failed (from the parameterSets input). However, the most important thing is a way to retrieve all the errors. ### Other Information _No response_ ### Acknowledgements - [ ] I may be able to implement this feature request - [ ] This feature might incur a breaking change ### AWS Go SDK version used V2 @latest ### Go version used 1.18
non_process
rds data api need way to get all exceptions from batchexecutestatement describe the feature when using batchexecutestatement using rdsdata api a single error from the batch is returned in the error object with the message stating call getnextexception to see other errors in the batch there seems to be no way to actually call getnextexception or access the rest of the errors in the batch this is a huge issue for me since it makes it impossible to know which statements of the batch has failed succeeded use case this is needed whenever there is a need to know which statements of a batch has failed or succeeded which i would say is almost always in my case i want to retry statements that failed later but without knowing what has succeeded or failed i have to roll back and retry the entire batch which is very inefficient proposed solution i would prefer not having to call a function to get the next exception if possible i think the best solution would be to have a custom error type that can be casted to which contains an array of the sub errors in the batch it would also be nice if the sub errors were typed fk contraint error etc and the parameters of the statement that failed from the parametersets input however the most important thing is a way to retrieve all the errors other information no response acknowledgements i may be able to implement this feature request this feature might incur a breaking change aws go sdk version used latest go version used
0
553,081
16,343,374,561
IssuesEvent
2021-05-13 02:36:43
volcano-sh/volcano
https://api.github.com/repos/volcano-sh/volcano
reopened
Can Volcano job support volumeClaimTemplates
area/controllers kind/feature lifecycle/stale priority/important-soon
<!-- This form is for bug reports and feature requests ONLY! If you're looking for a help then check our [Slack Channel](http://volcano-sh.slack.com/) or have a look at our [dev mailing](https://groups.google.com/forum/#!forum/volcano-sh) --> **Is this a BUG REPORT or FEATURE REQUEST?**: /kind feature **What happened**: old version: https://github.com/volcano-sh/volcano/blob/master/docs/design/job-api.md#job-inputoutput ... (hope to keep the docs up to date) new version: https://github.com/volcano-sh/volcano/pull/498 currently, it can only set the pvc volume, that means all vj pods share the same pvc, as for now i know some platforms don't support this way to use pvc (such as Huawei CCE/EVS) and we have this case, AI dev can select the number of instances, the resource request and the volume capacity of each instance obviously, it need volcano to support each instance (pod) bind a independent pvc (with a limited capacity size) maybe, volcano can ref statefulset **volumeClaimTemplate** impl https://kubernetes.io/docs/tutorials/stateful-application/basic-stateful-set/#creating-a-statefulset
1.0
Can Volcano job support volumeClaimTemplates - <!-- This form is for bug reports and feature requests ONLY! If you're looking for a help then check our [Slack Channel](http://volcano-sh.slack.com/) or have a look at our [dev mailing](https://groups.google.com/forum/#!forum/volcano-sh) --> **Is this a BUG REPORT or FEATURE REQUEST?**: /kind feature **What happened**: old version: https://github.com/volcano-sh/volcano/blob/master/docs/design/job-api.md#job-inputoutput ... (hope to keep the docs up to date) new version: https://github.com/volcano-sh/volcano/pull/498 currently, it can only set the pvc volume, that means all vj pods share the same pvc, as for now i know some platforms don't support this way to use pvc (such as Huawei CCE/EVS) and we have this case, AI dev can select the number of instances, the resource request and the volume capacity of each instance obviously, it need volcano to support each instance (pod) bind a independent pvc (with a limited capacity size) maybe, volcano can ref statefulset **volumeClaimTemplate** impl https://kubernetes.io/docs/tutorials/stateful-application/basic-stateful-set/#creating-a-statefulset
non_process
can volcano job support volumeclaimtemplates this form is for bug reports and feature requests only if you re looking for a help then check our or have a look at our is this a bug report or feature request kind feature what happened old version hope to keep the docs up to date new version currently it can only set the pvc volume that means all vj pods share the same pvc as for now i know some platforms don t support this way to use pvc such as huawei cce evs and we have this case ai dev can select the number of instances the resource request and the volume capacity of each instance obviously it need volcano to support each instance pod bind a independent pvc with a limited capacity size maybe volcano can ref statefulset volumeclaimtemplate impl
0
590,173
17,772,692,585
IssuesEvent
2021-08-30 15:19:46
GoogleCloudPlatform/golang-samples
https://api.github.com/repos/GoogleCloudPlatform/golang-samples
closed
vision/product_search: TestGetSimilarProductsURIWithFilter failed
type: bug priority: p2 api: vision samples flakybot: issue
Note: #2189 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 88b2a1089882340cbb1a55b67718645c5ee9af99 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/5217657a-08aa-4983-b787-650d10e2e338), [Sponge](http://sponge2/5217657a-08aa-4983-b787-650d10e2e338) status: failed <details><summary>Test output</summary><br><pre>get_similar_products_uri_with_filter_test.go:39: getSimilarProductsURI: ProductSearch: rpc error: code = NotFound desc = No matching products found. Please verify that the ProductSet exists, has images and has been indexed.</pre></details>
1.0
vision/product_search: TestGetSimilarProductsURIWithFilter failed - Note: #2189 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 88b2a1089882340cbb1a55b67718645c5ee9af99 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/5217657a-08aa-4983-b787-650d10e2e338), [Sponge](http://sponge2/5217657a-08aa-4983-b787-650d10e2e338) status: failed <details><summary>Test output</summary><br><pre>get_similar_products_uri_with_filter_test.go:39: getSimilarProductsURI: ProductSearch: rpc error: code = NotFound desc = No matching products found. Please verify that the ProductSet exists, has images and has been indexed.</pre></details>
non_process
vision product search testgetsimilarproductsuriwithfilter failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output get similar products uri with filter test go getsimilarproductsuri productsearch rpc error code notfound desc no matching products found please verify that the productset exists has images and has been indexed
0
3,465
6,545,833,444
IssuesEvent
2017-09-04 07:41:12
threefoldfoundation/app_backend
https://api.github.com/repos/threefoldfoundation/app_backend
closed
Page with signed ITO agreements
process_duplicate type_feature
Only visible for payment-admins (IYO sub org, next to admins). Similar as node orders, grouped by status, but with a `Mark as paid` button. Also show how much money they need to transfer (`currency_rate * token_count + " " + currency`)
1.0
Page with signed ITO agreements - Only visible for payment-admins (IYO sub org, next to admins). Similar as node orders, grouped by status, but with a `Mark as paid` button. Also show how much money they need to transfer (`currency_rate * token_count + " " + currency`)
process
page with signed ito agreements only visible for payment admins iyo sub org next to admins similar as node orders grouped by status but with a mark as paid button also show how much money they need to transfer currency rate token count currency
1
31,575
11,954,307,292
IssuesEvent
2020-04-03 23:11:35
authelia/authelia
https://api.github.com/repos/authelia/authelia
closed
Add an option to disable "keep me logged in".
P3 Security
From a security point of view, an `allow_keep_me_logged_in` boolean parameter in Authelia configuration file to toggle this feature on or off would be useful.
True
Add an option to disable "keep me logged in". - From a security point of view, an `allow_keep_me_logged_in` boolean parameter in Authelia configuration file to toggle this feature on or off would be useful.
non_process
add an option to disable keep me logged in from a security point of view an allow keep me logged in boolean parameter in authelia configuration file to toggle this feature on or off would be useful
0
493,508
14,233,858,757
IssuesEvent
2020-11-18 12:47:25
GSG-G9/IS-Autocomplete
https://api.github.com/repos/GSG-G9/IS-Autocomplete
opened
HEROKU deploy
priority-1
the project should be deployed on Heroku, and making sure that it is viewed properly
1.0
HEROKU deploy - the project should be deployed on Heroku, and making sure that it is viewed properly
non_process
heroku deploy the project should be deployed on heroku and making sure that it is viewed properly
0
161,232
13,808,247,551
IssuesEvent
2020-10-12 01:44:43
z3t0/Arduino-IRremote
https://api.github.com/repos/z3t0/Arduino-IRremote
closed
Tiny85: Configure different send pin?
Documentation
**Board:** Adafruit Trinket 5V **Library Version:** 2.1.0 **Protocol:** Any The Trinket uses a Tiny85 and the remote library is configured to use pin 1. How can I configure it to use a different output pin? Pin 1 is connected to the builtin LED which is pulsed on startup.
1.0
Tiny85: Configure different send pin? - **Board:** Adafruit Trinket 5V **Library Version:** 2.1.0 **Protocol:** Any The Trinket uses a Tiny85 and the remote library is configured to use pin 1. How can I configure it to use a different output pin? Pin 1 is connected to the builtin LED which is pulsed on startup.
non_process
configure different send pin board adafruit trinket library version protocol any the trinket uses a and the remote library is configured to use pin how can i configure it to use a different output pin pin is connected to the builtin led which is pulsed on startup
0
15,878
20,068,377,959
IssuesEvent
2022-02-04 01:17:50
nion-software/nionswift
https://api.github.com/repos/nion-software/nionswift
opened
Rectangle masks does not draw consistently when partially or entirely out of bounds
type - bug level - easy f - processing f - filtering/masking
- Ellipses seem to work - Rotated rectangles mostly work, but may throw exceptions - Rectangles mostly fail when moved off left/top edges
1.0
Rectangle masks does not draw consistently when partially or entirely out of bounds - - Ellipses seem to work - Rotated rectangles mostly work, but may throw exceptions - Rectangles mostly fail when moved off left/top edges
process
rectangle masks does not draw consistently when partially or entirely out of bounds ellipses seem to work rotated rectangles mostly work but may throw exceptions rectangles mostly fail when moved off left top edges
1
10,729
3,420,704,747
IssuesEvent
2015-12-08 15:52:47
WP-API/WP-API
https://api.github.com/repos/WP-API/WP-API
closed
"Philosophy Lesson" in "Extending" should live in Philosophy
Documentation Enhancement
It's a more human-readable version of why you should use WP-API, and why not.
1.0
"Philosophy Lesson" in "Extending" should live in Philosophy - It's a more human-readable version of why you should use WP-API, and why not.
non_process
philosophy lesson in extending should live in philosophy it s a more human readable version of why you should use wp api and why not
0
16,253
20,813,208,625
IssuesEvent
2022-03-18 06:59:29
equinor/MAD-VSM-WEB
https://api.github.com/repos/equinor/MAD-VSM-WEB
closed
Verify what code is currently running
back-end front-end process improvement
### User-story As a tester, I need to verify what features are currently running in the browser, So that I can do a good test and test the right thing. ### Background This is important to get in because of browser caching. How can someone verify that they are testing the correct build? We need to be confident that what we are pushing to some environment is what is tested before releasing. ### Suggested solution Show git commit in the dropdown. Make it so that clicking on it takes you to the commit in GitHub, where you can read more about it.
1.0
Verify what code is currently running - ### User-story As a tester, I need to verify what features are currently running in the browser, So that I can do a good test and test the right thing. ### Background This is important to get in because of browser caching. How can someone verify that they are testing the correct build? We need to be confident that what we are pushing to some environment is what is tested before releasing. ### Suggested solution Show git commit in the dropdown. Make it so that clicking on it takes you to the commit in GitHub, where you can read more about it.
process
verify what code is currently running user story as a tester i need to verify what features are currently running in the browser so that i can do a good test and test the right thing background this is important to get in because of browser caching how can someone verify that they are testing the correct build we need to be confident that what we are pushing to some environment is what is tested before releasing suggested solution show git commit in the dropdown make it so that clicking on it takes you to the commit in github where you can read more about it
1
256,265
22,042,416,501
IssuesEvent
2022-05-29 15:01:41
SAA-SDT/eac-cpf-schema
https://api.github.com/repos/SAA-SDT/eac-cpf-schema
closed
<relationType>
Element Tested by Schema Team Comments period
## Relation Type - add new optional, repeatable text element `<relationType>` as child element of `<relation>` - add **optional attributes** `@audience` `@conventationDeclarationReference` `@id` `@languageOfElement` `@localType` `@localTypeDeclarationReference` `@maintenanceEventReference` `@scriptOfElement` `@sourceReference` `@valueURI` `@vocabularySource` `@vocabularySourceURI` ## Creator of issue 1. Silke Jagodzinski 2. TS-EAS: EAC-CPF subgroup 3. silkejagodzinski@gmail.com ## Related issues / documents [Paper on Relation](https://github.com/SAA-SDT/TS-EAS-subteam-notes/tree/master/eaccpf-subteam/working-documents/topics/relations) ## EAD3 Reconciliation new EAC element ## Context new EAC element Type of relation, replaces omitted attributes `@cpfRelationType`, `@functionRelationType`, `@resourceRelationType` with their fixed values. The type of a relation can be descriped with text or defined via URI. CPF Relation Types: "identity" or "hierarchical" or "hierarchical-parent" or "hierarchical-child" or "temporal" or "temporal-earlier" or "temporal-later" or "family" or "associative" Function Relation Types: "controls" or "owns" or "performs" Resource Relation Types: "creatorOf" or "subjectOf" or "other" ## Solution documentation: _Summary_, _Description and Usage_ and _Attribute usage_ needed **May contain**: [text] **May occur within**: `<relation>` **Attributes**: `@audience` - optional (values limited to: external, internal) `@conventationDeclarationReference` - optional `@id` - optional `@languageOfElement` - optional `@localType` - optional `@localTypeDeclarationReference` - optional `@maintenanceEventReference` - optional `@scriptOfElement` - optional `@sourceReference` - optional `@valueURI` - optional `@vocabularySource` - optional `@vocabularySourceURI` - optional **Availability**: optional, repeatable ## Example encoding ``` <relation> <targetEntity targetType="person"> <part>persons name</part> </targetEntity> <relationType audience="external" 
conventionDeclarationReference="conventiondeclaration1" id="relationtype1" languageOfElement="en" localType="localrelationtye" localTypeDeclarationReference="localTypeDeclaration1" maintenanceEventReference="maintenancevent1" scriptOfElement="lat" sourceReference="source1" voabularySource="GND" vocabularySourceURI="https://d-nb.info/standards/elementset/gnd#" valueURI="https://d-nb.info/standards/elementset/gnd#Family">Family</relationType> </relation> ```
1.0
<relationType> - ## Relation Type - add new optional, repeatable text element `<relationType>` as child element of `<relation>` - add **optional attributes** `@audience` `@conventationDeclarationReference` `@id` `@languageOfElement` `@localType` `@localTypeDeclarationReference` `@maintenanceEventReference` `@scriptOfElement` `@sourceReference` `@valueURI` `@vocabularySource` `@vocabularySourceURI` ## Creator of issue 1. Silke Jagodzinski 2. TS-EAS: EAC-CPF subgroup 3. silkejagodzinski@gmail.com ## Related issues / documents [Paper on Relation](https://github.com/SAA-SDT/TS-EAS-subteam-notes/tree/master/eaccpf-subteam/working-documents/topics/relations) ## EAD3 Reconciliation new EAC element ## Context new EAC element Type of relation, replaces omitted attributes `@cpfRelationType`, `@functionRelationType`, `@resourceRelationType` with their fixed values. The type of a relation can be descriped with text or defined via URI. CPF Relation Types: "identity" or "hierarchical" or "hierarchical-parent" or "hierarchical-child" or "temporal" or "temporal-earlier" or "temporal-later" or "family" or "associative" Function Relation Types: "controls" or "owns" or "performs" Resource Relation Types: "creatorOf" or "subjectOf" or "other" ## Solution documentation: _Summary_, _Description and Usage_ and _Attribute usage_ needed **May contain**: [text] **May occur within**: `<relation>` **Attributes**: `@audience` - optional (values limited to: external, internal) `@conventationDeclarationReference` - optional `@id` - optional `@languageOfElement` - optional `@localType` - optional `@localTypeDeclarationReference` - optional `@maintenanceEventReference` - optional `@scriptOfElement` - optional `@sourceReference` - optional `@valueURI` - optional `@vocabularySource` - optional `@vocabularySourceURI` - optional **Availability**: optional, repeatable ## Example encoding ``` <relation> <targetEntity targetType="person"> <part>persons name</part> </targetEntity> <relationType 
audience="external" conventionDeclarationReference="conventiondeclaration1" id="relationtype1" languageOfElement="en" localType="localrelationtye" localTypeDeclarationReference="localTypeDeclaration1" maintenanceEventReference="maintenancevent1" scriptOfElement="lat" sourceReference="source1" voabularySource="GND" vocabularySourceURI="https://d-nb.info/standards/elementset/gnd#" valueURI="https://d-nb.info/standards/elementset/gnd#Family">Family</relationType> </relation> ```
non_process
relation type add new optional repeatable text element as child element of add optional attributes audience conventationdeclarationreference id languageofelement localtype localtypedeclarationreference maintenanceeventreference scriptofelement sourcereference valueuri vocabularysource vocabularysourceuri creator of issue silke jagodzinski ts eas eac cpf subgroup silkejagodzinski gmail com related issues documents reconciliation new eac element context new eac element type of relation replaces omitted attributes cpfrelationtype functionrelationtype resourcerelationtype with their fixed values the type of a relation can be descriped with text or defined via uri cpf relation types identity or hierarchical or hierarchical parent or hierarchical child or temporal or temporal earlier or temporal later or family or associative function relation types controls or owns or performs resource relation types creatorof or subjectof or other solution documentation summary description and usage and attribute usage needed may contain may occur within attributes audience optional values limited to external internal conventationdeclarationreference optional id optional languageofelement optional localtype optional localtypedeclarationreference optional maintenanceeventreference optional scriptofelement optional sourcereference optional valueuri optional vocabularysource optional vocabularysourceuri optional availability optional repeatable example encoding persons name relationtype audience external conventiondeclarationreference id languageofelement en localtype localrelationtye localtypedeclarationreference maintenanceeventreference scriptofelement lat sourcereference voabularysource gnd vocabularysourceuri valueuri
0
5,833
8,665,884,428
IssuesEvent
2018-11-29 01:21:41
cityofaustin/techstack
https://api.github.com/repos/cityofaustin/techstack
closed
Draft process page for How to Start a Farmers Market Business
Content Type: Process Page Department: Public Health Site Content Size: S Team: Content
Current: chrome-extension://oemmndcbldboiebfnladdacbdfmadadm/http://austintexas.gov/sites/default/files/files/Health/Environmental/Food/How_to_Start_a_Farmers_Market_Booth_Business_2018.pdf Drive:
1.0
Draft process page for How to Start a Farmers Market Business - Current: chrome-extension://oemmndcbldboiebfnladdacbdfmadadm/http://austintexas.gov/sites/default/files/files/Health/Environmental/Food/How_to_Start_a_Farmers_Market_Booth_Business_2018.pdf Drive:
process
draft process page for how to start a farmers market business current chrome extension oemmndcbldboiebfnladdacbdfmadadm drive
1
279,172
30,702,459,793
IssuesEvent
2023-07-27 01:32:00
nidhi7598/linux-3.0.35_CVE-2018-13405
https://api.github.com/repos/nidhi7598/linux-3.0.35_CVE-2018-13405
closed
CVE-2020-29368 (High) detected in linux-stable-rtv3.8.6 - autoclosed
Mend: dependency security vulnerability
## CVE-2020-29368 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-3.0.35_CVE-2018-13405/commit/662fbf6e1ed61fd353add2f52e2dd27e990364c7">662fbf6e1ed61fd353add2f52e2dd27e990364c7</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/mm/huge_memory.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in __split_huge_pmd in mm/huge_memory.c in the Linux kernel before 5.7.5. The copy-on-write implementation can grant unintended write access because of a race condition in a THP mapcount check, aka CID-c444eb564fb1. 
<p>Publish Date: 2020-11-28 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-29368>CVE-2020-29368</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29368">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29368</a></p> <p>Release Date: 2020-11-28</p> <p>Fix Resolution: v5.8-rc1,v5.7.5,v5.4.48</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-29368 (High) detected in linux-stable-rtv3.8.6 - autoclosed - ## CVE-2020-29368 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-3.0.35_CVE-2018-13405/commit/662fbf6e1ed61fd353add2f52e2dd27e990364c7">662fbf6e1ed61fd353add2f52e2dd27e990364c7</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/mm/huge_memory.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in __split_huge_pmd in mm/huge_memory.c in the Linux kernel before 5.7.5. The copy-on-write implementation can grant unintended write access because of a race condition in a THP mapcount check, aka CID-c444eb564fb1. 
<p>Publish Date: 2020-11-28 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-29368>CVE-2020-29368</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29368">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-29368</a></p> <p>Release Date: 2020-11-28</p> <p>Fix Resolution: v5.8-rc1,v5.7.5,v5.4.48</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in linux stable autoclosed cve high severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files mm huge memory c vulnerability details an issue was discovered in split huge pmd in mm huge memory c in the linux kernel before the copy on write implementation can grant unintended write access because of a race condition in a thp mapcount check aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
210,804
7,194,830,636
IssuesEvent
2018-02-04 10:27:15
elor/tuvero
https://api.github.com/repos/elor/tuvero
closed
KO-Baum: Korrigierte Ergebnisse ändern noch nicht die nachfolgenden Spiele
backend priority 5: feature
Geänderte Ergebnisse im KO-Baum werden zwar gespeichert, aber wirken sich noch nicht auf die nachfolgend ermöglichten Spiele aus. Der Nutzer würde erwarten, dass er ein gerade eben beendetes Spiel noch korrigieren und ein laufendes Spiel rückgängig machen kann. Eine elegante Lösung wäre, alle nachfolgenden Spiele als ungespielt zu verwerfen und neu ausspielen zu lassen.
1.0
KO-Baum: Korrigierte Ergebnisse ändern noch nicht die nachfolgenden Spiele - Geänderte Ergebnisse im KO-Baum werden zwar gespeichert, aber wirken sich noch nicht auf die nachfolgend ermöglichten Spiele aus. Der Nutzer würde erwarten, dass er ein gerade eben beendetes Spiel noch korrigieren und ein laufendes Spiel rückgängig machen kann. Eine elegante Lösung wäre, alle nachfolgenden Spiele als ungespielt zu verwerfen und neu ausspielen zu lassen.
non_process
ko baum korrigierte ergebnisse ändern noch nicht die nachfolgenden spiele geänderte ergebnisse im ko baum werden zwar gespeichert aber wirken sich noch nicht auf die nachfolgend ermöglichten spiele aus der nutzer würde erwarten dass er ein gerade eben beendetes spiel noch korrigieren und ein laufendes spiel rückgängig machen kann eine elegante lösung wäre alle nachfolgenden spiele als ungespielt zu verwerfen und neu ausspielen zu lassen
0
55,658
14,619,844,349
IssuesEvent
2020-12-22 18:35:42
department-of-veterans-affairs/va.gov-team
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
opened
508-defect-2 [KEYBOARD]: Focus MUST be returned to the Edit button when Update page is pressed or clicked
508-defect-2 508-issue-focus-mgmt 508/Accessibility HLR vsa
# [508-defect-2](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/accessibility/guidance/defect-severity-rubric.md#508-defect-2) <!-- Enter an issue title using the format [ERROR TYPE]: Brief description of the problem --- [SCREENREADER]: Edit buttons need aria-label for context [KEYBOARD]: Add another user link will not receive keyboard focus [AXE-CORE]: Heading levels should increase by one [COGNITION]: Error messages should be more specific [COLOR]: Blue button on blue background does not have sufficient contrast ratio --- --> <!-- It's okay to delete the instructions above, but leave the link to the 508 defect severity level for your issue. --> ## Feedback framework - **❗️ Must** for if the feedback must be applied - **⚠️ Should** if the feedback is best practice - **✔️ Consider** for suggestions/enhancements ## Definition of done 1. Review and acknowledge feedback. 1. Fix and/or document decisions made. 1. Accessibility specialist will close ticket after reviewing documented decisions / validating fix. <hr/> ## Point of Contact <!-- If this issue is being opened by a VFS team member, please add a point of contact. Usually this is the same person who enters the issue ticket. --> **VFS Point of Contact:** _Josh._ ## User Story or Problem Statement As a keyboard user, I expect a consistent experience for focus when using the edit and update page buttons. ## Details The Update page button for issues eligible for review does not return focus to the Edit button on the review page. Users need a consistent experience so they won't have to re-orient or discover where their focus is when it is not returned to Edit. Video attached below. ## Acceptance Criteria - [ ] Focus returns to the edit button upon clicking the update page button ## Environment * Operating System: any * Browser: any * Screenreading device: any * Server destination: staging ## Steps to Recreate 1. 
Enter `https://staging.va.gov/decision-reviews/higher-level-review/request-higher-level-review-form-20-0996/review-and-submit` in browser. 2. Complete test case 3 and reach the review page. 3. Open the issues eligible for review accordion. 3. Click "Edit". 4. Click "update page" button. 5. Confirm focus does not return to "Edit." ## Solution Return focus back to the edit button. ## WCAG or Vendor Guidance (optional) * [Keyboard: Understanding SC 2.1.1](https://www.w3.org/TR/UNDERSTANDING-WCAG20/keyboard-operation-keyboard-operable.html) ## Screenshots or Trace Logs https://user-images.githubusercontent.com/14154792/102921498-75db8c00-445a-11eb-8b84-06c7c602610c.mov
1.0
508-defect-2 [KEYBOARD]: Focus MUST be returned to the Edit button when Update page is pressed or clicked - # [508-defect-2](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/accessibility/guidance/defect-severity-rubric.md#508-defect-2) <!-- Enter an issue title using the format [ERROR TYPE]: Brief description of the problem --- [SCREENREADER]: Edit buttons need aria-label for context [KEYBOARD]: Add another user link will not receive keyboard focus [AXE-CORE]: Heading levels should increase by one [COGNITION]: Error messages should be more specific [COLOR]: Blue button on blue background does not have sufficient contrast ratio --- --> <!-- It's okay to delete the instructions above, but leave the link to the 508 defect severity level for your issue. --> ## Feedback framework - **❗️ Must** for if the feedback must be applied - **⚠️ Should** if the feedback is best practice - **✔️ Consider** for suggestions/enhancements ## Definition of done 1. Review and acknowledge feedback. 1. Fix and/or document decisions made. 1. Accessibility specialist will close ticket after reviewing documented decisions / validating fix. <hr/> ## Point of Contact <!-- If this issue is being opened by a VFS team member, please add a point of contact. Usually this is the same person who enters the issue ticket. --> **VFS Point of Contact:** _Josh._ ## User Story or Problem Statement As a keyboard user, I expect a consistent experience for focus when using the edit and update page buttons. ## Details The Update page button for issues eligible for review does not return focus to the Edit button on the review page. Users need a consistent experience so they won't have to re-orient or discover where their focus is when it is not returned to Edit. Video attached below. 
## Acceptance Criteria - [ ] Focus returns to the edit button upon clicking the update page button ## Environment * Operating System: any * Browser: any * Screenreading device: any * Server destination: staging ## Steps to Recreate 1. Enter `https://staging.va.gov/decision-reviews/higher-level-review/request-higher-level-review-form-20-0996/review-and-submit` in browser. 2. Complete test case 3 and reach the review page. 3. Open the issues eligible for review accordion. 3. Click "Edit". 4. Click "update page" button. 5. Confirm focus does not return to "Edit." ## Solution Return focus back to the edit button. ## WCAG or Vendor Guidance (optional) * [Keyboard: Understanding SC 2.1.1](https://www.w3.org/TR/UNDERSTANDING-WCAG20/keyboard-operation-keyboard-operable.html) ## Screenshots or Trace Logs https://user-images.githubusercontent.com/14154792/102921498-75db8c00-445a-11eb-8b84-06c7c602610c.mov
non_process
defect focus must be returned to the edit button when update page is pressed or clicked enter an issue title using the format brief description of the problem edit buttons need aria label for context add another user link will not receive keyboard focus heading levels should increase by one error messages should be more specific blue button on blue background does not have sufficient contrast ratio feedback framework ❗️ must for if the feedback must be applied ⚠️ should if the feedback is best practice ✔️ consider for suggestions enhancements definition of done review and acknowledge feedback fix and or document decisions made accessibility specialist will close ticket after reviewing documented decisions validating fix point of contact vfs point of contact josh user story or problem statement as a keyboard user i expect a consistent experience for focus when using the edit and update page buttons details the update page button for issues eligible for review does not return focus to the edit button on the review page users need a consistent experience so they won t have to re orient or discover where their focus is when it is not returned to edit video attached below acceptance criteria focus returns to the edit button upon clicking the update page button environment operating system any browser any screenreading device any server destination staging steps to recreate enter in browser complete test case and reach the review page open the issues eligible for review accordion click edit click update page button confirm focus does not return to edit solution return focus back to the edit button wcag or vendor guidance optional screenshots or trace logs
0
496,096
14,332,232,484
IssuesEvent
2020-11-27 01:45:53
dotnetcore/Natasha
https://api.github.com/repos/dotnetcore/Natasha
closed
不支持识别 static 、new static
low-priority
parent class: ```csharp public class T1 { public static void A() { } public virtual void B() { } public virtual void C() { } public virtual void D() { } public void E() { } } ``` child class: ```csharp public class T2 : T1 { public new static void A() { } public override void B() { } public new virtual void C() { } public sealed override void D() { } public new void E() { } } ``` How do I decide whether a method is static or new static? You get information by reflection, exactly the same thing ↓ ![image](https://user-images.githubusercontent.com/2189761/100304505-e0271b00-2fd9-11eb-9229-a26d4e4c7da8.png) See IL Code,same code: ``` .method public hidebysig static void A () cil managed ```
1.0
不支持识别 static 、new static - parent class: ```csharp public class T1 { public static void A() { } public virtual void B() { } public virtual void C() { } public virtual void D() { } public void E() { } } ``` child class: ```csharp public class T2 : T1 { public new static void A() { } public override void B() { } public new virtual void C() { } public sealed override void D() { } public new void E() { } } ``` How do I decide whether a method is static or new static? You get information by reflection, exactly the same thing ↓ ![image](https://user-images.githubusercontent.com/2189761/100304505-e0271b00-2fd9-11eb-9229-a26d4e4c7da8.png) See IL Code,same code: ``` .method public hidebysig static void A () cil managed ```
non_process
不支持识别 static 、new static parent class csharp public class public static void a public virtual void b public virtual void c public virtual void d public void e child class csharp public class public new static void a public override void b public new virtual void c public sealed override void d public new void e how do i decide whether a method is static or new static you get information by reflection exactly the same thing ↓ see il code same code method public hidebysig static void a cil managed
0
310,352
26,711,620,452
IssuesEvent
2023-01-28 01:14:18
stockmann-lab/shim_amp_hardware_linear
https://api.github.com/repos/stockmann-lab/shim_amp_hardware_linear
opened
Amp ctrl (D3)/power (D1) testing of design changes
testing needed
Finish testing changes before next prototyping round (ctrl board D4, power board D2) - [ ] New offset calibration circuit - Test the offset cal on all channels (inc. forcing the offset-cal on while OPA548 is enabled, to see where current flows through the collector resistors) - On "shutdown logic test" board, add 2x diodes to each channel: anodes on pins 8, 11 of DIP; cathodes all tied together - On offset-cal board, connect TP1 to gnd and run this common-diode-cathode signal to the R1/Q1 junction) - [ ] New ADC on EVB - Settling time: accuracy when switching channels, or staying on same channel - Repeat with multiple candidate op-amps - [ ] Vref stiffness measurement - float the picoscope and use a 1x probe to look at Vref_buffered - try with bare coax and measure @ testpoint, INA828 on ch.8 - see if that gets rid of all the noise from prev. measurements - using Chroma bench supply try for a 5A pulse, to see the actual capability - [ ] Separate ground for current-sense amp - Lift gnd pin of INA828 on ch.8, run copper tape from digital section, connect to gnd w/1uF @ far end and use that as gnd for the INA828 - Before and after mod, look at: CS @ ADC change (DC, transient) and Iout change (DC) for ch.8 when ch.1-7 are stepped to 1A, step/frequency response of ch.8 - [ ] Alternate current-sense amps - INA818 (lower-power) and INA821 (higher-BW) - test transfer func next to INA828 [INA821 is only one avail as of 2023-01]
1.0
Amp ctrl (D3)/power (D1) testing of design changes - Finish testing changes before next prototyping round (ctrl board D4, power board D2) - [ ] New offset calibration circuit - Test the offset cal on all channels (inc. forcing the offset-cal on while OPA548 is enabled, to see where current flows through the collector resistors) - On "shutdown logic test" board, add 2x diodes to each channel: anodes on pins 8, 11 of DIP; cathodes all tied together - On offset-cal board, connect TP1 to gnd and run this common-diode-cathode signal to the R1/Q1 junction) - [ ] New ADC on EVB - Settling time: accuracy when switching channels, or staying on same channel - Repeat with multiple candidate op-amps - [ ] Vref stiffness measurement - float the picoscope and use a 1x probe to look at Vref_buffered - try with bare coax and measure @ testpoint, INA828 on ch.8 - see if that gets rid of all the noise from prev. measurements - using Chroma bench supply try for a 5A pulse, to see the actual capability - [ ] Separate ground for current-sense amp - Lift gnd pin of INA828 on ch.8, run copper tape from digital section, connect to gnd w/1uF @ far end and use that as gnd for the INA828 - Before and after mod, look at: CS @ ADC change (DC, transient) and Iout change (DC) for ch.8 when ch.1-7 are stepped to 1A, step/frequency response of ch.8 - [ ] Alternate current-sense amps - INA818 (lower-power) and INA821 (higher-BW) - test transfer func next to INA828 [INA821 is only one avail as of 2023-01]
non_process
amp ctrl power testing of design changes finish testing changes before next prototyping round ctrl board power board new offset calibration circuit test the offset cal on all channels inc forcing the offset cal on while is enabled to see where current flows through the collector resistors on shutdown logic test board add diodes to each channel anodes on pins of dip cathodes all tied together on offset cal board connect to gnd and run this common diode cathode signal to the junction new adc on evb settling time accuracy when switching channels or staying on same channel repeat with multiple candidate op amps vref stiffness measurement float the picoscope and use a probe to look at vref buffered try with bare coax and measure testpoint on ch see if that gets rid of all the noise from prev measurements using chroma bench supply try for a pulse to see the actual capability separate ground for current sense amp lift gnd pin of on ch run copper tape from digital section connect to gnd w far end and use that as gnd for the before and after mod look at cs adc change dc transient and iout change dc for ch when ch are stepped to step frequency response of ch alternate current sense amps lower power and higher bw test transfer func next to
0
147,605
19,522,851,883
IssuesEvent
2021-12-29 22:31:46
swagger-api/swagger-codegen
https://api.github.com/repos/swagger-api/swagger-codegen
opened
CVE-2021-20190 (High) detected in multiple libraries
security vulnerability
## CVE-2021-20190 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.4.5.jar</b>, <b>jackson-databind-2.8.9.jar</b>, <b>jackson-databind-2.8.8.jar</b>, <b>jackson-databind-2.6.4.jar</b>, <b>jackson-databind-2.7.8.jar</b></p></summary> <p> <details><summary><b>jackson-databind-2.4.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /samples/client/petstore/scala/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.4.5/c69c0cb613128c69d84a6a0304ddb9fce82e8242/jackson-databind-2.4.5.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.4.5/c69c0cb613128c69d84a6a0304ddb9fce82e8242/jackson-databind-2.4.5.jar</p> <p> Dependency Hierarchy: - swagger-core-1.5.8.jar (Root Library) - :x: **jackson-databind-2.4.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.9.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.8.9.jar</p> <p> Dependency Hierarchy: - play-guice_2.12-2.6.3.jar (Root Library) - play_2.12-2.6.3.jar - :x: **jackson-databind-2.8.9.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a 
href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.8.8.jar</p> <p> Dependency Hierarchy: - finch-circe_2.11-0.15.1.jar (Root Library) - circe-jackson28_2.11-0.8.0.jar - :x: **jackson-databind-2.8.8.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.6.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /samples/client/petstore/java/jersey1/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.4/f2abadd10891512268b16a1a1a6f81890f3e2976/jackson-databind-2.6.4.jar,/aches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.4/f2abadd10891512268b16a1a1a6f81890f3e2976/jackson-databind-2.6.4.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.6.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.7.8.jar</p> <p> Dependency Hierarchy: - lagom-scaladsl-api_2.11-1.3.8.jar (Root Library) - lagom-api_2.11-1.3.8.jar - play_2.11-2.5.13.jar - :x: **jackson-databind-2.7.8.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/swagger-api/swagger-codegen/commit/4b7a8d7d7384aa6a27d6309c35ade0916edae7ed">4b7a8d7d7384aa6a27d6309c35ade0916edae7ed</a></p> <p>Found in base branch: 
<b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in jackson-databind before 2.9.10.7. FasterXML mishandles the interaction between serialization gadgets and typing. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability. <p>Publish Date: 2021-01-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-20190>CVE-2021-20190</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2854">https://github.com/FasterXML/jackson-databind/issues/2854</a></p> <p>Release Date: 2021-01-19</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind-2.9.10.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.4.5","packageFilePaths":["/samples/client/petstore/scala/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.swagger:swagger-core:1.5.8;com.fasterxml.jackson.core:jackson-databind:2.4.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.9","packageFilePaths":[null],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-guice_2.12:2.6.3;com.typesafe.play:play_2.12:2.6.3;com.fasterxml.jackson.core:jackson-databind:2.8.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.8","packageFilePaths":[null],"isTransitiveDependency":true,"dependencyTree":"com.github.finagle:finch-circe_2.11:0.15.1;io.circe:circe-jackson28_2.11:0.8.0;com.fasterxml.jackson.core:jackson-databind:2.8.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageNam
e":"jackson-databind","packageVersion":"2.6.4","packageFilePaths":["/samples/client/petstore/java/jersey1/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.6.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.8","packageFilePaths":[null],"isTransitiveDependency":true,"dependencyTree":"com.lightbend.lagom:lagom-scaladsl-api_2.11:1.3.8;com.lightbend.lagom:lagom-api_2.11:1.3.8;com.typesafe.play:play_2.11:2.5.13;com.fasterxml.jackson.core:jackson-databind:2.7.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-20190","vulnerabilityDetails":"A flaw was found in jackson-databind before 2.9.10.7. FasterXML mishandles the interaction between serialization gadgets and typing. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-20190","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-20190 (High) detected in multiple libraries - ## CVE-2021-20190 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.4.5.jar</b>, <b>jackson-databind-2.8.9.jar</b>, <b>jackson-databind-2.8.8.jar</b>, <b>jackson-databind-2.6.4.jar</b>, <b>jackson-databind-2.7.8.jar</b></p></summary> <p> <details><summary><b>jackson-databind-2.4.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /samples/client/petstore/scala/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.4.5/c69c0cb613128c69d84a6a0304ddb9fce82e8242/jackson-databind-2.4.5.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.4.5/c69c0cb613128c69d84a6a0304ddb9fce82e8242/jackson-databind-2.4.5.jar</p> <p> Dependency Hierarchy: - swagger-core-1.5.8.jar (Root Library) - :x: **jackson-databind-2.4.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.9.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.8.9.jar</p> <p> Dependency Hierarchy: - play-guice_2.12-2.6.3.jar (Root Library) - play_2.12-2.6.3.jar - :x: **jackson-databind-2.8.9.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core 
streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.8.8.jar</p> <p> Dependency Hierarchy: - finch-circe_2.11-0.15.1.jar (Root Library) - circe-jackson28_2.11-0.8.0.jar - :x: **jackson-databind-2.8.8.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.6.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /samples/client/petstore/java/jersey1/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.4/f2abadd10891512268b16a1a1a6f81890f3e2976/jackson-databind-2.6.4.jar,/aches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.4/f2abadd10891512268b16a1a1a6f81890f3e2976/jackson-databind-2.6.4.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.6.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.7.8.jar</p> <p> Dependency Hierarchy: - lagom-scaladsl-api_2.11-1.3.8.jar (Root Library) - lagom-api_2.11-1.3.8.jar - play_2.11-2.5.13.jar - :x: **jackson-databind-2.7.8.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/swagger-api/swagger-codegen/commit/4b7a8d7d7384aa6a27d6309c35ade0916edae7ed">4b7a8d7d7384aa6a27d6309c35ade0916edae7ed</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in jackson-databind before 2.9.10.7. FasterXML mishandles the interaction between serialization gadgets and typing. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability. <p>Publish Date: 2021-01-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-20190>CVE-2021-20190</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2854">https://github.com/FasterXML/jackson-databind/issues/2854</a></p> <p>Release Date: 2021-01-19</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind-2.9.10.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.4.5","packageFilePaths":["/samples/client/petstore/scala/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.swagger:swagger-core:1.5.8;com.fasterxml.jackson.core:jackson-databind:2.4.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.9","packageFilePaths":[null],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-guice_2.12:2.6.3;com.typesafe.play:play_2.12:2.6.3;com.fasterxml.jackson.core:jackson-databind:2.8.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.8","packageFilePaths":[null],"isTransitiveDependency":true,"dependencyTree":"com.github.finagle:finch-circe_2.11:0.15.1;io.circe:circe-jackson28_2.11:0.8.0;com.fasterxml.jackson.core:jackson-databind:2.8.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageNam
e":"jackson-databind","packageVersion":"2.6.4","packageFilePaths":["/samples/client/petstore/java/jersey1/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.6.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.8","packageFilePaths":[null],"isTransitiveDependency":true,"dependencyTree":"com.lightbend.lagom:lagom-scaladsl-api_2.11:1.3.8;com.lightbend.lagom:lagom-api_2.11:1.3.8;com.typesafe.play:play_2.11:2.5.13;com.fasterxml.jackson.core:jackson-databind:2.7.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind-2.9.10.7","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-20190","vulnerabilityDetails":"A flaw was found in jackson-databind before 2.9.10.7. FasterXML mishandles the interaction between serialization gadgets and typing. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-20190","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file samples client petstore scala build gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy swagger core jar root library x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library home wss scanner cache com fasterxml jackson core jackson databind bundles jackson databind jar dependency hierarchy play guice jar root library play jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library home wss scanner cache com fasterxml jackson core jackson databind bundles jackson databind jar dependency hierarchy finch circe jar root library circe jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file samples client petstore java build gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar aches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home 
page a href path to vulnerable library home wss scanner cache com fasterxml jackson core jackson databind bundles jackson databind jar dependency hierarchy lagom scaladsl api jar root library lagom api jar play jar x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details a flaw was found in jackson databind before fasterxml mishandles the interaction between serialization gadgets and typing the highest threat from this vulnerability is to data confidentiality and integrity as well as system availability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree io swagger swagger core com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind isbinary false packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com typesafe play play guice com typesafe play play com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind isbinary false packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com github finagle finch circe io circe circe com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml 
jackson core jackson databind isbinary false packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind isbinary false packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com lightbend lagom lagom scaladsl api com lightbend lagom lagom api com typesafe play play com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails a flaw was found in jackson databind before fasterxml mishandles the interaction between serialization gadgets and typing the highest threat from this vulnerability is to data confidentiality and integrity as well as system availability vulnerabilityurl
0
6,753
9,880,372,549
IssuesEvent
2019-06-24 12:28:18
didi/mpx
https://api.github.com/repos/didi/mpx
closed
function: errors in the code are not thrown after adding async
processing
As shown in the figure, toPureObject is not imported, so an error should be thrown. But after adding async in front of onShow, the error disappears, and the code does not continue executing. Removing async makes the error get thrown again. ![image](https://user-images.githubusercontent.com/18588816/59972395-d1f0f600-95c0-11e9-8db2-030fda54beb0.png)
1.0
function: errors in the code are not thrown after adding async - As shown in the figure, toPureObject is not imported, so an error should be thrown. But after adding async in front of onShow, the error disappears, and the code does not continue executing. Removing async makes the error get thrown again. ![image](https://user-images.githubusercontent.com/18588816/59972395-d1f0f600-95c0-11e9-8db2-030fda54beb0.png)
process
function errors in the code are not thrown after adding async as shown in the figure topureobject is not imported so an error should be thrown but after adding async in front of onshow the error disappears and the code does not continue executing removing async makes the error get thrown again
1
89,510
15,830,754,911
IssuesEvent
2021-04-06 12:53:39
rsoreq/zenbot
https://api.github.com/repos/rsoreq/zenbot
reopened
CVE-2019-20149 (High) detected in kind-of-6.0.2.tgz
security vulnerability
## CVE-2019-20149 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p></summary> <p>Get the native type of a value.</p> <p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/snapdragon-node/node_modules/kind-of/package.json</p> <p> Dependency Hierarchy: - webpack-4.44.1.tgz (Root Library) - micromatch-3.1.10.tgz - to-regex-3.0.2.tgz - define-property-2.0.2.tgz - is-descriptor-1.0.2.tgz - :x: **kind-of-6.0.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/rsoreq/zenbot/commit/7a24c0d7b98ee76e6bac827974cff490a7694378">7a24c0d7b98ee76e6bac827974cff490a7694378</a></p> <p>Found in base branch: <b>unstable</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result. 
<p>Publish Date: 2019-12-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p> <p>Release Date: 2019-12-30</p> <p>Fix Resolution: 6.0.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"kind-of","packageVersion":"6.0.2","isTransitiveDependency":true,"dependencyTree":"webpack:4.44.1;micromatch:3.1.10;to-regex:3.0.2;define-property:2.0.2;is-descriptor:1.0.2;kind-of:6.0.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"6.0.3"}],"vulnerabilityIdentifier":"CVE-2019-20149","vulnerabilityDetails":"ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by \u0027constructor\u0027: {\u0027name\u0027:\u0027Symbol\u0027}. 
Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2019-20149 (High) detected in kind-of-6.0.2.tgz - ## CVE-2019-20149 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p></summary> <p>Get the native type of a value.</p> <p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/snapdragon-node/node_modules/kind-of/package.json</p> <p> Dependency Hierarchy: - webpack-4.44.1.tgz (Root Library) - micromatch-3.1.10.tgz - to-regex-3.0.2.tgz - define-property-2.0.2.tgz - is-descriptor-1.0.2.tgz - :x: **kind-of-6.0.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/rsoreq/zenbot/commit/7a24c0d7b98ee76e6bac827974cff490a7694378">7a24c0d7b98ee76e6bac827974cff490a7694378</a></p> <p>Found in base branch: <b>unstable</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result. 
<p>Publish Date: 2019-12-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p> <p>Release Date: 2019-12-30</p> <p>Fix Resolution: 6.0.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"kind-of","packageVersion":"6.0.2","isTransitiveDependency":true,"dependencyTree":"webpack:4.44.1;micromatch:3.1.10;to-regex:3.0.2;define-property:2.0.2;is-descriptor:1.0.2;kind-of:6.0.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"6.0.3"}],"vulnerabilityIdentifier":"CVE-2019-20149","vulnerabilityDetails":"ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by \u0027constructor\u0027: {\u0027name\u0027:\u0027Symbol\u0027}. 
Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in kind of tgz cve high severity vulnerability vulnerable library kind of tgz get the native type of a value library home page a href path to dependency file zenbot package json path to vulnerable library zenbot node modules snapdragon node node modules kind of package json dependency hierarchy webpack tgz root library micromatch tgz to regex tgz define property tgz is descriptor tgz x kind of tgz vulnerable library found in head commit a href found in base branch unstable vulnerability details ctorname in index js in kind of allows external user input to overwrite certain internal attributes via a conflicting name as demonstrated by constructor name symbol hence a crafted payload can overwrite this builtin attribute to manipulate the type detection result publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails ctorname in index js in kind of allows external user input to overwrite certain internal attributes via a conflicting name as demonstrated by hence a crafted payload can overwrite this builtin attribute to manipulate the type detection result vulnerabilityurl
0
8,965
12,069,922,403
IssuesEvent
2020-04-16 16:47:36
googleapis/google-cloud-go
https://api.github.com/repos/googleapis/google-cloud-go
opened
monitoring: re-enable generation after breaking change comes through
api: monitoring type: process
There is a breaking change coming to this library. We will need to: - stop generation - Mark v1 (of apiv3) as deprecated - re-enable generation under v2 after breaking change is merged - Update deprecation notice to let users know they should use v2
1.0
monitoring: re-enable generation after breaking change comes through - There is a breaking change coming to this library. We will need to: - stop generation - Mark v1 (of apiv3) as deprecated - re-enable generation under v2 after breaking change is merged - Update deprecation notice to let users know they should use v2
process
monitoring re enable generation after breaking change comes through there is a breaking change coming to this library we will need to stop generation mark of as deprecated re enable generation under after breaking change is merged update deprecation notice to let users know they should use
1
10,878
13,649,116,510
IssuesEvent
2020-09-26 12:53:53
timberio/vector
https://api.github.com/repos/timberio/vector
closed
New `format_timestamp` remap function
domain: mapping domain: processing transform: remap type: feature
A `format_timestamp` function takes a timestamp as argument and formats it according to [Rust's strftime format](https://docs.rs/chrono/0.3.1/chrono/format/strftime/index.html). Note, we explicitly opted not to use the `strftime` function name, since our remap syntax function naming rules use the `format_` prefix for formatting string (#3740). ## Example ``` .timestamp_string = format_timestamp(.timestamp, "%F") ``` Would result in: ```js { "timestamp_string": "2020-08-03" } ```
1.0
New `format_timestamp` remap function - A `format_timestamp` function takes a timestamp as argument and formats it according to [Rust's strftime format](https://docs.rs/chrono/0.3.1/chrono/format/strftime/index.html). Note, we explicitly opted not to use the `strftime` function name, since our remap syntax function naming rules use the `format_` prefix for formatting string (#3740). ## Example ``` .timestamp_string = format_timestamp(.timestamp, "%F") ``` Would result in: ```js { "timestamp_string": "2020-08-03" } ```
process
new format timestamp remap function a format timestamp function takes a timestamp as argument and formats it according to note we explicitly opted not to use the strftime function name since our remap syntax function naming rules use the format prefix for formatting string example timestamp string format timestamp timestamp f would result in js timestamp string
1
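For context on the record above: the `format_timestamp` remap function formats a timestamp with a strftime-style pattern. A minimal Python sketch of the same idea (an illustration only — this is not Vector's remap DSL, and `"%Y-%m-%d"` is used here as the portable equivalent of the `"%F"` shorthand from the record's example):

```python
from datetime import datetime, timezone

def format_timestamp(ts: datetime, fmt: str) -> str:
    # Thin wrapper over strftime, mirroring the record's
    # format_timestamp(.timestamp, "%F") example.
    return ts.strftime(fmt)

ts = datetime(2020, 8, 3, tzinfo=timezone.utc)
print(format_timestamp(ts, "%Y-%m-%d"))  # → 2020-08-03
```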
9,891
12,890,275,957
IssuesEvent
2020-07-13 15:45:05
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
Token for the current date seems missing
Pri2 devops-cicd-process/tech devops/prod doc-enhancement
Please also include a token for the current day. We already have a token for yyyymmdd. I don't know if its not supported or missing in documentation but we needed a token for just the day (dd) component of the date. [Enter feedback here] --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: a57f8545-bb15-3a71-1876-3a9ec1a59b93 * Version Independent ID: 28c87c8d-c28d-7493-0c7c-8c38b04fbcd7 * Content: [Run (build) number - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/run-number?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/run-number.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/run-number.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
Token for the current date seems missing - Please also include a token for the current day. We already have a token for yyyymmdd. I don't know if its not supported or missing in documentation but we needed a token for just the day (dd) component of the date. [Enter feedback here] --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: a57f8545-bb15-3a71-1876-3a9ec1a59b93 * Version Independent ID: 28c87c8d-c28d-7493-0c7c-8c38b04fbcd7 * Content: [Run (build) number - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/run-number?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/run-number.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/run-number.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
token for the current date seems missing please also include a token for the current day we already have a token for yyyymmdd i don t know if its not supported or missing in documentation but we needed a token for just the day dd component of the date document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
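The run-number record above asks for a day-of-month (`dd`) token alongside the existing `yyyymmdd` token. In strftime terms these correspond to `"%d"` and `"%Y%m%d"`; a short Python sketch of both (an illustration only, not Azure Pipelines token syntax):

```python
from datetime import date

d = date(2020, 7, 13)
yyyymmdd = d.strftime("%Y%m%d")  # full-date token: 20200713
dd = d.strftime("%d")            # day-only token:  13
print(yyyymmdd, dd)  # → 20200713 13
```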
1,694
4,345,741,284
IssuesEvent
2016-07-29 13:49:09
openvstorage/framework-alba-plugin
https://api.github.com/repos/openvstorage/framework-alba-plugin
closed
Speed up listing ALBA backends and presets during add vPool
priority_urgent process_wontfix type_bug
When you create a vPool and load the local backend it takes ages before the data is shown. We should make the vPool creation wizard much snappier. Some suggestions: * Load only the presets of the selected backend. * Don't always check if policies are in use * Only fetch the name of the preset instead of the preset and policies
1.0
Speed up listing ALBA backends and presets during add vPool - When you create a vPool and load the local backend it takes ages before the data is shown. We should make the vPool creation wizard much snappier. Some suggestions: * Load only the presets of the selected backend. * Don't always check if policies are in use * Only fetch the name of the preset instead of the preset and policies
process
speed up listing alba backends and presets during add vpool when you create a vpool and load the local backend it takes ages before the data is shown we should make the vpool creation wizard much snappier some suggestions load only the presets of the selected backend don t always check if policies are in use only fetch the name of the preset instead of the preset and policies
1
7,742
10,863,261,944
IssuesEvent
2019-11-14 14:49:49
openopps/openopps-platform
https://api.github.com/repos/openopps/openopps-platform
opened
Change EST to ET
Apply Process State Dept.
Who: All users What: Change EST to ET Why: To be consistent and avoid confusion Acceptance Criteria: Change instances of EST to be ET Reason: EST is only half of the year as well as EDT so ET should be used Changes should be made: - State: Next steps page - State: What's next page - State: Update application page - State: Search page in the info box that tells applicants when they can apply
1.0
Change EST to ET - Who: All users What: Change EST to ET Why: To be consistent and avoid confusion Acceptance Criteria: Change instances of EST to be ET Reason: EST is only half of the year as well as EDT so ET should be used Changes should be made: - State: Next steps page - State: What's next page - State: Update application page - State: Search page in the info box that tells applicants when they can apply
process
change est to et who all users what change est to et why to be consistent and avoid confusion acceptance criteria change instances of est to be et reason est is only half of the year as well as edt so et should be used changes should be made state next steps page state what s next page state update application page state search page in the info box that tells applicants when they can apply
1
118,945
17,602,834,027
IssuesEvent
2021-08-17 13:49:52
harrinry/nifi
https://api.github.com/repos/harrinry/nifi
opened
CVE-2020-9548 (High) detected in multiple libraries
security vulnerability
## CVE-2020-9548 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.7.jar</b>, <b>jackson-databind-2.9.6.jar</b>, <b>jackson-databind-2.9.5.jar</b>, <b>jackson-databind-2.3.1.jar</b>, <b>jackson-databind-2.5.4.jar</b>, <b>jackson-databind-2.9.4.jar</b>, <b>jackson-databind-2.7.8.jar</b></p></summary> <p> <details><summary><b>jackson-databind-2.9.7.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.7.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-media-bundle/nifi-media-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-enrich-bundle/nifi-enrich-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.9.5.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.3.1.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> 
<p>Path to vulnerable library: nifi-nar-bundles/nifi-kite-bundle/nifi-kite-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.3.1.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.3.1.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.5.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.5.4.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.5.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-standard-services/nifi-proxy-configuration-bundle/nifi-proxy-configuration-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.9.4.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.7.8.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.7.8.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/harrinry/nifi/commit/0484b1d281ed1df9c5682361286dee2ca6c634fe">0484b1d281ed1df9c5682361286dee2ca6c634fe</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPConfig (aka anteros-core). <p>Publish Date: 2020-03-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9548>CVE-2020-9548</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548</a></p> <p>Release Date: 2020-03-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.7","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.5","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.3.1","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.3.1","isMinimumFixVersionAvailable":true,"mini
mumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.4","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.5.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.8","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.7.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-9548","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPConfig (aka anteros-core).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9548","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-9548 (High) detected in multiple libraries - ## CVE-2020-9548 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.7.jar</b>, <b>jackson-databind-2.9.6.jar</b>, <b>jackson-databind-2.9.5.jar</b>, <b>jackson-databind-2.3.1.jar</b>, <b>jackson-databind-2.5.4.jar</b>, <b>jackson-databind-2.9.4.jar</b>, <b>jackson-databind-2.7.8.jar</b></p></summary> <p> <details><summary><b>jackson-databind-2.9.7.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.7.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-media-bundle/nifi-media-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-enrich-bundle/nifi-enrich-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.9.5.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.3.1.jar</b></p></summary> <p>General data-binding 
functionality for Jackson: works on core streaming API</p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-kite-bundle/nifi-kite-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.3.1.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.3.1.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.5.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.5.4.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.5.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-standard-services/nifi-proxy-configuration-bundle/nifi-proxy-configuration-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.9.4.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/target/classes/META-INF/bundled-dependencies/jackson-databind-2.7.8.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.7.8.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/harrinry/nifi/commit/0484b1d281ed1df9c5682361286dee2ca6c634fe">0484b1d281ed1df9c5682361286dee2ca6c634fe</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPConfig (aka anteros-core). <p>Publish Date: 2020-03-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9548>CVE-2020-9548</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548</a></p> <p>Release Date: 2020-03-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.7","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.5","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.3.1","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.3.1","isMinimumFixVersionAvailable":true,"mini
mumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.4","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.5.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.8","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.7.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-9548","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPConfig (aka anteros-core).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9548","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library nifi nar bundles nifi media bundle nifi media nar target classes meta inf bundled dependencies jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library nifi nar bundles nifi enrich bundle nifi enrich nar target classes meta inf bundled dependencies jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api path to vulnerable library nifi nar bundles nifi kite bundle nifi kite nar target classes meta inf bundled dependencies jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library nifi nar bundles nifi solr bundle nifi solr nar target classes meta inf bundled dependencies jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library nifi nar bundles nifi standard services nifi proxy configuration bundle nifi proxy configuration nar target classes meta inf bundled dependencies 
jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library nifi nar bundles nifi hadoop libraries bundle nifi hadoop libraries nar target classes meta inf bundled dependencies jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to br com anteros dbcp anterosdbcpconfig aka anteros core publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype 
java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to br com anteros dbcp anterosdbcpconfig aka anteros core vulnerabilityurl
0
234,470
7,721,504,575
IssuesEvent
2018-05-24 05:47:35
WordImpress/Give
https://api.github.com/repos/WordImpress/Give
closed
fix(donation): prevent WP users from being created if donation fails
3-reported high-priority urgent
## User Story As a `Give Admin`, I am experiencing a lot of spam user registrations through my Give donation forms. I want to prevent that from happening. ## Current Behavior I currently am using both the Akismet integration and also the functionality plugin [Stop Donor Spam](https://github.com/mathetos/Stop-Donor-Spam), but these spam users are still being created. ## Expected Behavior I expect all wordpress users generated through our Give forms to be actual donors, and for spam donors to be blocked. ## Possible Solution Check on the order of actions during donation. @kevinwhoffman and I have a hunch that the user generation action is happening BEFORE the donation has been validated. Ideally, user creation should only happen after the donation itself has been completely successful and passed all validation checks. ## Steps to Reproduce 1. We have a local site that has a lot of spam users that were created via Give forms. ## Related *Customer Reports* - https://secure.helpscout.net/conversation/556699222/16607/?folderId=672197 - Search for all HS tickets with the tag `donor spam` https://secure.helpscout.net/search/?query=tag:donor%20spam * Related Github Issues - https://github.com/WordImpress/Give/issues/2930 - https://github.com/WordImpress/Give/issues/2862 - https://github.com/WordImpress/Give/issues/2568 ## Tasks - [x] Reorder the actions (if that is the cause of this issue of course) - [x] Do test donations that should intentionally get rejected by Akismet and verify whether the WP user was created while the donation was rejected. - [x] Test the Related Github issues in combination with this fix to ensure all new spam prevention methods are still working as intended.
1.0
fix(donation): prevent WP users from being created if donation fails - ## User Story As a `Give Admin`, I am experiencing a lot of spam user registrations through my Give donation forms. I want to prevent that from happening. ## Current Behavior I currently am using both the Akismet integration and also the functionality plugin [Stop Donor Spam](https://github.com/mathetos/Stop-Donor-Spam), but these spam users are still being created. ## Expected Behavior I expect all wordpress users generated through our Give forms to be actual donors, and for spam donors to be blocked. ## Possible Solution Check on the order of actions during donation. @kevinwhoffman and I have a hunch that the user generation action is happening BEFORE the donation has been validated. Ideally, user creation should only happen after the donation itself has been completely successful and passed all validation checks. ## Steps to Reproduce 1. We have a local site that has a lot of spam users that were created via Give forms. ## Related *Customer Reports* - https://secure.helpscout.net/conversation/556699222/16607/?folderId=672197 - Search for all HS tickets with the tag `donor spam` https://secure.helpscout.net/search/?query=tag:donor%20spam * Related Github Issues - https://github.com/WordImpress/Give/issues/2930 - https://github.com/WordImpress/Give/issues/2862 - https://github.com/WordImpress/Give/issues/2568 ## Tasks - [x] Reorder the actions (if that is the cause of this issue of course) - [x] Do test donations that should intentionally get rejected by Akismet and verify whether the WP user was created while the donation was rejected. - [x] Test the Related Github issues in combination with this fix to ensure all new spam prevention methods are still working as intended.
non_process
fix donation prevent wp users from being created if donation fails user story as a give admin i am experiencing a lot of spam user registrations through my give donation forms i want to prevent that from happening current behavior i currently am using both the akismet integration and also the functionality plugin but these spam users are still being created expected behavior i expect all wordpress users generated through our give forms to be actual donors and for spam donors to be blocked possible solution check on the order of actions during donation kevinwhoffman and i have a hunch that the user generation action is happening before the donation has been validated ideally user creation should only happen after the donation itself has been completely successful and passed all validation checks steps to reproduce we have a local site that has a lot of spam users that were created via give forms related customer reports search for all hs tickets with the tag donor spam related github issues tasks reorder the actions if that is the cause of this issue of course do test donations that should intentionally get rejected by akismet and verify whether the wp user was created while the donation was rejected test the related github issues in combination with this fix to ensure all new spam prevention methods are still working as intended
0
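The ordering fix described in the record above — validate the donation first, create the WP user only on success — can be sketched as follows. This is an illustrative model only; the function and parameter names are hypothetical and not Give's actual API.

```python
def process_donation(donation, validate, create_user, record_payment):
    """Run validation first; only create the user once the donation is accepted.

    `validate`, `create_user`, and `record_payment` are injected callables so
    the ordering is explicit: a rejected donation never reaches user creation.
    """
    if not validate(donation):                 # e.g. Akismet / spam checks run here
        return {"status": "rejected", "user_created": False}
    user = create_user(donation["email"])      # user exists only for valid donations
    record_payment(user, donation["amount"])
    return {"status": "accepted", "user_created": True}
```

With this ordering, a spam submission that fails validation short-circuits before any account is registered, which is the behavior the tasks in the record verify.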
327,733
28,080,369,903
IssuesEvent
2023-03-30 05:39:39
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
roachtest: sqlsmith/setup=seed/setting=no-ddl failed
C-test-failure O-robot O-roachtest branch-release-23.1
roachtest.sqlsmith/setup=seed/setting=no-ddl [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/9329923?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/9329923?buildTab=artifacts#/sqlsmith/setup=seed/setting=no-ddl) on release-23.1 @ [aec78f33d45a8376a0ecec885688bae60dbfb85c](https://github.com/cockroachdb/cockroach/commits/aec78f33d45a8376a0ecec885688bae60dbfb85c): ``` test artifacts and logs in: /artifacts/sqlsmith/setup=seed/setting=no-ddl/run_1 (sqlsmith.go:244).func3: error: pq: internal error: regproc(): ResolveFunction unimplemented stmt: WITH "w'\\U00035ED1ith381" (col2573) AS (SELECT tab1110._bool AS col2573 FROM defaultdb.public.seed@[0] AS tab1110), "?wi%03th382" (col2574) AS ( SELECT 2990964997:::OID AS col2574 FROM defaultdb.public.seed@[0] AS "tab""1111", defaultdb.public.seed@seed__int8__float8__date_idx AS tab1112, "w'\\U00035ED1ith381" AS ".cte_ref102" ) SELECT regproc(e'z\\a).+A':::STRING::STRING)::REGPROC AS "cOl2575" FROM "?wi%03th382" AS "cte_r""ef103" ORDER BY "cte_r""ef103".col2574 DESC NULLS LAST, "cte_r""ef103".col2574, "cte_r""ef103".col2574 ASC NULLS FIRST LIMIT 23:::INT8; ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_encrypted=false</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/sql-queries <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=seed/setting=no-ddl.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
2.0
roachtest: sqlsmith/setup=seed/setting=no-ddl failed - roachtest.sqlsmith/setup=seed/setting=no-ddl [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/9329923?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/9329923?buildTab=artifacts#/sqlsmith/setup=seed/setting=no-ddl) on release-23.1 @ [aec78f33d45a8376a0ecec885688bae60dbfb85c](https://github.com/cockroachdb/cockroach/commits/aec78f33d45a8376a0ecec885688bae60dbfb85c): ``` test artifacts and logs in: /artifacts/sqlsmith/setup=seed/setting=no-ddl/run_1 (sqlsmith.go:244).func3: error: pq: internal error: regproc(): ResolveFunction unimplemented stmt: WITH "w'\\U00035ED1ith381" (col2573) AS (SELECT tab1110._bool AS col2573 FROM defaultdb.public.seed@[0] AS tab1110), "?wi%03th382" (col2574) AS ( SELECT 2990964997:::OID AS col2574 FROM defaultdb.public.seed@[0] AS "tab""1111", defaultdb.public.seed@seed__int8__float8__date_idx AS tab1112, "w'\\U00035ED1ith381" AS ".cte_ref102" ) SELECT regproc(e'z\\a).+A':::STRING::STRING)::REGPROC AS "cOl2575" FROM "?wi%03th382" AS "cte_r""ef103" ORDER BY "cte_r""ef103".col2574 DESC NULLS LAST, "cte_r""ef103".col2574, "cte_r""ef103".col2574 ASC NULLS FIRST LIMIT 23:::INT8; ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_encrypted=false</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/sql-queries <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=seed/setting=no-ddl.*&sort=title+created&display=lastcommented+project) | [Improve this 
report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
non_process
roachtest sqlsmith setup seed setting no ddl failed roachtest sqlsmith setup seed setting no ddl with on release test artifacts and logs in artifacts sqlsmith setup seed setting no ddl run sqlsmith go error pq internal error regproc resolvefunction unimplemented stmt with w as select bool as from defaultdb public seed as wi as select oid as from defaultdb public seed as tab defaultdb public seed seed date idx as w as cte select regproc e z a a string string regproc as from wi as cte r order by cte r desc nulls last cte r cte r asc nulls first limit parameters roachtest cloud gce roachtest cpu roachtest encrypted false roachtest ssd help see see cc cockroachdb sql queries
0
303,806
23,040,578,371
IssuesEvent
2022-07-23 04:31:46
wp-graphql/wp-graphql
https://api.github.com/repos/wp-graphql/wp-graphql
closed
Have docs site pull extension docs from their repo README files
Documentation :rocket: Actionable
The docs.wpgraphql.com site should pull the README from each extension as the content for the docs shown for that extension. This way, each extension can maintain their canonical docs in their repo and be reflected in the searchable docs site. Additional docs required for extensions should be maintained separately by said extension.
1.0
Have docs site pull extension docs from their repo README files - The docs.wpgraphql.com site should pull the README from each extension as the content for the docs shown for that extension. This way, each extension can maintain their canonical docs in their repo and be reflected in the searchable docs site. Additional docs required for extensions should be maintained separately by said extension.
non_process
have docs site pull extension docs from their repo readme files the docs wpgraphql com site should pull the readme from each extension as the content for the docs shown for that extension this way each extension can maintain their canonical docs in their repo and be reflected in the searchable docs site additional docs required for extensions should be maintained separately by said extension
0
307,805
23,217,324,426
IssuesEvent
2022-08-02 15:02:52
woowa-techcamp-2022/android-accountbook-12
https://api.github.com/repos/woowa-techcamp-2022/android-accountbook-12
closed
Organize the Components of the Setting screen (+ payment method and category creation screens) and implement detailed features
documentation💡 feature🔥 refactor🛠 진연주🍿
### 📌 Description Separate the Components of the Setting screen and the payment method and income/expense category creation screens, and implement additional detailed features ### 🛠 ToDo - [x] Separate the payment method input screen Component - [x] Separate the income/expense category input screen Component - [x] Separate the settings screen Component - [x] Check for duplicate names when entering a payment method - [x] Check for duplicate names when entering income/expense categories - [x] Notify via snackbar when a name is a duplicate - [x] Fix buttons being hidden by the soft keyboard when entering payment methods or income/expense categories
1.0
Organize the Components of the Setting screen (+ payment method and category creation screens) and implement detailed features - ### 📌 Description Separate the Components of the Setting screen and the payment method and income/expense category creation screens, and implement additional detailed features ### 🛠 ToDo - [x] Separate the payment method input screen Component - [x] Separate the income/expense category input screen Component - [x] Separate the settings screen Component - [x] Check for duplicate names when entering a payment method - [x] Check for duplicate names when entering income/expense categories - [x] Notify via snackbar when a name is a duplicate - [x] Fix buttons being hidden by the soft keyboard when entering payment methods or income/expense categories
non_process
organize the components of the setting screen payment method and category creation screens and implement detailed features 📌 description separate the components of the setting screen and the payment method and income expense category creation screens and implement additional detailed features 🛠 todo separate the payment method input screen component separate the income expense category input screen component separate the settings screen component check for duplicate names when entering a payment method check for duplicate names when entering income expense categories notify via snackbar when a name is a duplicate fix buttons being hidden by the soft keyboard when entering payment methods or income expense categories
0
34,726
30,322,347,820
IssuesEvent
2023-07-10 20:17:38
ProjectPythia/projectpythia.github.io
https://api.github.com/repos/ProjectPythia/projectpythia.github.io
closed
Issue with dataset
infrastructure
Do the CESM2_sst_data.nc and CESM2_grid_variables.nc in https://github.com/ProjectPythia/pythia-datasets/tree/main/data need to be updated? The format for date and time seems to have an issue
1.0
Issue with dataset - Do the CESM2_sst_data.nc and CESM2_grid_variables.nc in https://github.com/ProjectPythia/pythia-datasets/tree/main/data need to be updated? The format for date and time seems to have an issue
non_process
issue with dataset do the sst data nc and grid variables nc in need to be updated the format for date and time seems to have an issue
0
350,894
25,005,369,491
IssuesEvent
2022-11-03 11:23:23
tech-acs/chimera-docs
https://api.github.com/repos/tech-acs/chimera-docs
opened
Fill in sections of 'Users' documentation
documentation enhancement
Introduction section and its sub-sections User interface section and all its sub-sections
1.0
Fill in sections of 'Users' documentation - Introduction section and its sub-sections User interface section and all its sub-sections
non_process
fill in sections of users documentation introduction section and its sub sections user interface section and all its sub sections
0
2,184
5,032,365,606
IssuesEvent
2016-12-16 10:57:57
DevExpress/testcafe-hammerhead
https://api.github.com/repos/DevExpress/testcafe-hammerhead
opened
Figure out how we may emulate meta[name="referrer"] tag behavior
AREA: client AREA: server SYSTEM: URL processing TYPE: bug
We have a problem with it on https://login.yahoo.com ![image](https://cloud.githubusercontent.com/assets/5373460/21259716/b3e30a7e-c393-11e6-9042-4d31fe831e63.png) This page contains a `meta[name="referrer"]` tag, because of which the referrer has the value `http://localhost:1401` `<meta name="referrer" content="origin-when-cross-origin">` [specification](https://www.w3.org/TR/referrer-policy/#referrer-policies), [workaround for \<meta name="referrer" content="origin"\>](https://github.com/DevExpress/testcafe-hammerhead/blob/a5f8127a1c571280b792d61424b6ab33da2724f9/src/processing/resources/page.js#L102)
1.0
Figure out how we may emulate meta[name="referrer"] tag behavior - We have a problem with it on https://login.yahoo.com ![image](https://cloud.githubusercontent.com/assets/5373460/21259716/b3e30a7e-c393-11e6-9042-4d31fe831e63.png) This page contains a `meta[name="referrer"]` tag, because of which the referrer has the value `http://localhost:1401` `<meta name="referrer" content="origin-when-cross-origin">` [specification](https://www.w3.org/TR/referrer-policy/#referrer-policies), [workaround for \<meta name="referrer" content="origin"\>](https://github.com/DevExpress/testcafe-hammerhead/blob/a5f8127a1c571280b792d61424b6ab33da2724f9/src/processing/resources/page.js#L102)
process
figure out how we may emulate meta tag behavior we have a problem with it on this page contains a meta tag because of which the referrer has the value
1
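The `origin-when-cross-origin` policy named in the record above is simple to model: a same-origin request gets the full document URL as its referrer, while a cross-origin request gets only the origin. The sketch below is a simplified illustration of the spec's behavior, not hammerhead's implementation; the function name and policy handling are hypothetical.

```python
from urllib.parse import urlsplit

def referrer_for(document_url, request_url, policy="origin-when-cross-origin"):
    """Compute the Referer value under a (simplified) referrer policy."""
    doc, req = urlsplit(document_url), urlsplit(request_url)
    origin = f"{doc.scheme}://{doc.netloc}"
    same_origin = (doc.scheme, doc.netloc) == (req.scheme, req.netloc)
    if policy == "no-referrer":
        return None                       # suppress the header entirely
    if policy == "origin":
        return origin + "/"               # origin-only, with trailing slash
    if policy == "origin-when-cross-origin":
        return document_url if same_origin else origin + "/"
    return document_url                   # "unsafe-url" fallback for this sketch
```

A proxy that rewrites URLs (as hammerhead does) would need to apply the page's declared policy against the *proxied* origin, which is why a `localhost` referrer leaks through in the report above.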
5,375
8,203,286,127
IssuesEvent
2018-09-02 19:33:24
EasyEngine/easyengine
https://api.github.com/repos/EasyEngine/easyengine
opened
Look into static analysis tools for EE
component/development-process
We can use something like - [PHPMetrics](http://www.phpmetrics.org/) or [SonarQube](https://github.com/SonarSource/sonar-php) and integrate it with our CI servers to continually monitor our code quality and other metrics like cyclometric complexity etc... so that we're aware which part of our codebase is more bloated and needs refactoring.
1.0
Look into static analysis tools for EE - We can use something like - [PHPMetrics](http://www.phpmetrics.org/) or [SonarQube](https://github.com/SonarSource/sonar-php) and integrate it with our CI servers to continually monitor our code quality and other metrics like cyclometric complexity etc... so that we're aware which part of our codebase is more bloated and needs refactoring.
process
look into static analysis tools for ee we can use something like or and integrate it with our ci servers to continually monitor our code quality and other metrics like cyclometric complexity etc so that we re aware which part of our codebase is more bloated and needs refactoring
1
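As a rough illustration of the kind of metric such tools report, cyclomatic complexity can be approximated as one plus the number of decision points. The sketch below uses Python's `ast` module (Python rather than PHP, purely for illustration); the set of counted node types is a simplification of what PHPMetrics or SonarQube actually measure.

```python
import ast

# Node types treated as decision points in this simplified model.
_BRANCH_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                 ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source):
    """Rough cyclomatic complexity: 1 plus the number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, _BRANCH_NODES)
                   for node in ast.walk(tree))
```

Tracking a number like this per file in CI is what makes "which part of our codebase is more bloated" answerable with data instead of intuition.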
21,512
29,799,390,948
IssuesEvent
2023-06-16 06:51:13
open-telemetry/opentelemetry-collector-contrib
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
closed
Metrics in k8s processor needs refactor
bug help wanted good first issue processor/k8sattributes
**Describe the bug** Metrics exposed by the k8s processor lack a meaningful description and have an invalid format. **Additional context** As mentioned in comment https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/2199#discussion_r573754727 there is a need to provide some additional information about the meaning of those metrics. Their format also needs to change to fit the standard.
1.0
Metrics in k8s processor needs refactor - **Describe the bug** Metrics exposed by the k8s processor lack a meaningful description and have an invalid format. **Additional context** As mentioned in comment https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/2199#discussion_r573754727 there is a need to provide some additional information about the meaning of those metrics. Their format also needs to change to fit the standard.
process
metrics in processor needs refactor describe the bug metrics exposed by the processor lack a meaningful description and have an invalid format additional context as mentioned in comment there is a need to provide some additional information about the meaning of those metrics their format also needs to change to fit the standard
1
202,622
15,287,850,778
IssuesEvent
2021-02-23 16:12:41
pangeo-data/climpred
https://api.github.com/repos/pangeo-data/climpred
opened
resolve skipped tests
testing
currently there are 17 skipped/silent tests. Ideally, there should be none.
1.0
resolve skipped tests - currently there are 17 skipped/silent tests. Ideally, there should be none.
non_process
resolve skipped tests currently there are skipped silent tests ideally there should be none
0
9,065
12,138,658,265
IssuesEvent
2020-04-23 17:36:08
GoogleCloudPlatform/python-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
closed
remove gcp-devrel-py-tools from kms/api-client/requirements-test.txt
priority: p2 remove-gcp-devrel-py-tools type: process
remove gcp-devrel-py-tools from kms/api-client/requirements-test.txt
1.0
remove gcp-devrel-py-tools from kms/api-client/requirements-test.txt - remove gcp-devrel-py-tools from kms/api-client/requirements-test.txt
process
remove gcp devrel py tools from kms api client requirements test txt remove gcp devrel py tools from kms api client requirements test txt
1
14,944
8,711,287,270
IssuesEvent
2018-12-06 18:46:40
szeged/webrender
https://api.github.com/repos/szeged/webrender
closed
(Re)use the Descriptor sets more efficiently
dx12 enhancement in progress performance
Currently we are allocating a lot of descriptor sets in a pool, and DX12 does not like this very much. But without it we can run out of them easily. We should be more conservative with the allocation and reuse more of the sets.
True
(Re)use the Descriptor sets more efficiently - Currently we are allocating a lot of descriptor sets in a pool, and DX12 does not like this very much. But without it we can run out of them easily. We should be more conservative with the allocation and reuse more of the sets.
non_process
re use the descriptor sets more efficiently currently we are allocating a lot of descriptor sets in a pool and does not like this very much but without it we can run out of them easily we should be more conservative with the allocation and reuse more of the sets
0
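The allocate-conservatively-and-reuse strategy argued for in the record above can be modeled with a free-list pool: released sets go back on a list and are handed out again before any new allocation happens. The class below is an illustrative sketch with no relation to the actual DX12/gfx types.

```python
class DescriptorSetPool:
    """Hand out sets from a free list before allocating new ones."""

    def __init__(self):
        self._free = []
        self.allocated = 0   # total fresh allocations ever made

    def acquire(self):
        if self._free:                # reuse beats fresh allocation
            return self._free.pop()
        self.allocated += 1
        return {"id": self.allocated}

    def release(self, descriptor_set):
        self._free.append(descriptor_set)  # return to pool for reuse
```

Keeping `allocated` bounded is exactly the property that prevents exhausting the backend's descriptor budget.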
22,289
30,842,316,315
IssuesEvent
2023-08-02 11:32:45
NationalSecurityAgency/ghidra
https://api.github.com/repos/NationalSecurityAgency/ghidra
closed
Decompiler Ignores Possibility of NaN in Floating Point Comparison
Feature: Processor/x86 Status: Internal
**Describe the bug** The decompiler incorrectly decompiles a floating point comparison of the form: ``` ucomisd %xmm0,%xmm0 setp %al ret ``` to assume that the result is always false, discounting the possibility of xmm0 containing NaN. **To Reproduce** Steps to reproduce the behavior: 1. Open `is_nan.o` in CodeBrowser: [is_nan.zip](https://github.com/NationalSecurityAgency/ghidra/files/9543315/is_nan.zip). 2. Go to `is_nan` function. 3. Open decompiler. 4. Observe that it decompiles to: ```c undefined is_nan(void) { return 0; } ``` **Expected behavior** Should not assume that `x != x` simplifies to false in the case of floating point comparisons, and should decompile to something more like: ```c undefined is_nan(double x) { return x != x; } ``` **Attachments** Example binary: [is_nan.zip](https://github.com/NationalSecurityAgency/ghidra/files/9543315/is_nan.zip). **Environment (please complete the following information):** - OS: Arch Linux - Java Version: 18.0.2 - Ghidra Version: 10.1.5 - Ghidra Origin: official release **Additional context** This bug also affects arm64 and perhaps other architectures as well. Proof that `x != x` can return 1: https://godbolt.org/z/EzsfhEand.
1.0
Decompiler Ignores Possibility of NaN in Floating Point Comparison - **Describe the bug** The decompiler incorrectly decompiles a floating point comparison of the form: ``` ucomisd %xmm0,%xmm0 setp %al ret ``` to assume that the result is always false, discounting the possibility of xmm0 containing NaN. **To Reproduce** Steps to reproduce the behavior: 1. Open `is_nan.o` in CodeBrowser: [is_nan.zip](https://github.com/NationalSecurityAgency/ghidra/files/9543315/is_nan.zip). 2. Go to `is_nan` function. 3. Open decompiler. 4. Observe that it decompiles to: ```c undefined is_nan(void) { return 0; } ``` **Expected behavior** Should not assume that `x != x` simplifies to false in the case of floating point comparisons, and should decompile to something more like: ```c undefined is_nan(double x) { return x != x; } ``` **Attachments** Example binary: [is_nan.zip](https://github.com/NationalSecurityAgency/ghidra/files/9543315/is_nan.zip). **Environment (please complete the following information):** - OS: Arch Linux - Java Version: 18.0.2 - Ghidra Version: 10.1.5 - Ghidra Origin: official release **Additional context** This bug also affects arm64 and perhaps other architectures as well. Proof that `x != x` can return 1: https://godbolt.org/z/EzsfhEand.
process
decompiler ignores possibility of nan in floating point comparison describe the bug the decompiler incorrectly decompiles a floating point comparison of the form ucomisd setp al ret to assume that the result is always false discounting the possibility of containing nan to reproduce steps to reproduce the behavior open is nan o in codebrowser go to is nan function open decompiler observe that it decompiles to c undefined is nan void return expected behavior should not assume that x x simplifies to false in the case of floating point comparisons and should decompile to something more like c undefined is nan double x return x x attachments example binary environment please complete the following information os arch linux java version ghidra version ghidra origin official release additional context this bug also affects and perhaps other architectures as well proof that x x can return
1
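The IEEE 754 property this report hinges on — a NaN compares unequal to itself, so `x != x` cannot be constant-folded to false — is easy to demonstrate. A sketch in Python (the report itself concerns x86 machine code and the decompiler's handling of it):

```python
def is_nan(x):
    """IEEE 754: NaN is the only floating-point value not equal to itself."""
    return x != x
```

This is the semantics the `ucomisd`/`setp` sequence in the report implements: the parity flag is set exactly when an operand is NaN, so folding the comparison to a constant loses the NaN case.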
42,089
17,027,035,110
IssuesEvent
2021-07-03 18:50:32
aws/aws-cdk
https://api.github.com/repos/aws/aws-cdk
closed
(ecs-service-extensions): The ability to create the taskRole
@aws-cdk-containers/ecs-service-extensions effort/small feature-request feature/pattern p2
We need the ability to create the taskRole of a taskDefinition ahead of time and not rely on the auto-generated taskRole because we deploy our CDK applications in 2 stacks (infrastructure and service). We create the infrastructure before the service so when I create things like S3 buckets with resource policies in the infrastructure stack, I want to be able to build a bucket policy with taskRole. If we could create the taskRole ourself and pass it into the infrastructure and the service stacks then everything would work. Expose the task definition props through the service props: https://github.com/aws/aws-cdk/blob/master/packages/%40aws-cdk-containers/ecs-service-extensions/lib/service.ts#L132 * [X] :wave: I may be able to implement this feature request * [ ] :warning: This feature might incur a breaking change --- This is a :rocket: Feature Request
1.0
(ecs-service-extensions): The ability to create the taskRole - We need the ability to create the taskRole of a taskDefinition ahead of time and not rely on the auto-generated taskRole because we deploy our CDK applications in 2 stacks (infrastructure and service). We create the infrastructure before the service so when I create things like S3 buckets with resource policies in the infrastructure stack, I want to be able to build a bucket policy with taskRole. If we could create the taskRole ourself and pass it into the infrastructure and the service stacks then everything would work. Expose the task definition props through the service props: https://github.com/aws/aws-cdk/blob/master/packages/%40aws-cdk-containers/ecs-service-extensions/lib/service.ts#L132 * [X] :wave: I may be able to implement this feature request * [ ] :warning: This feature might incur a breaking change --- This is a :rocket: Feature Request
non_process
ecs service extensions the ability to create the taskrole we need the ability to create the taskrole of a taskdefinition ahead of time and not rely on the auto generated taskrole because we deploy our cdk applications in stacks infrastructure and service we create the infrastructure before the service so when i create things like buckets with resource policies in the infrastructure stack i want to be able to build a bucket policy with taskrole if we could create the taskrole ourself and pass it into the infrastructure and the service stacks then everything would work expose the task definition props through the service props wave i may be able to implement this feature request warning this feature might incur a breaking change this is a rocket feature request
0
55,875
8,033,397,819
IssuesEvent
2018-07-29 05:11:18
BitcoinCashKit/BitcoinCashKit
https://api.github.com/repos/BitcoinCashKit/BitcoinCashKit
opened
Add About section to README
documentation
# Describe what you've considered? Add `About` section to describe who maintains BitcoinCashKit # Code sample / Spec The section will be like this. But the website should be in English. ``` ## About BitcoinCashKit is maintained by [Yenom Inc](https://yenom.tech). ```
1.0
Add About section to README - # Describe what you've considered? Add `About` section to describe who maintains BitcoinCashKit # Code sample / Spec The section will be like this. But the website should be in English. ``` ## About BitcoinCashKit is maintained by [Yenom Inc](https://yenom.tech). ```
non_process
add about section to readme describe what you ve considered add about section to describe who maintains bitcoincashkit code sample spec the section will be like this but the website should be in english about bitcoincashkit is maintained by
0
9,739
12,733,030,627
IssuesEvent
2020-06-25 11:31:53
prisma/prisma-client-js
https://api.github.com/repos/prisma/prisma-client-js
closed
Multiple quick `count` requests result in `Invalid prisma.web_hooks.count invocation`
bug/2-confirmed kind/bug process/next-milestone team/engines topic: broken query
## Bug description After introspecting our Discourse database (credentials on 1Password), and running Studio, Studio crashes. I ran the faulty query in a separate script and that crashes too, so I might have found a Prisma Client issue. ## How to reproduce 1. Introspect our Discourse DB (credentials on 1Password) 2. Generate client, create a TS script with content: <details> <summary>Expand</summary> ```typescript import { PrismaClient } from '@prisma/client' const prisma = new PrismaClient() const main = async () => { await Promise.all([ prisma.anonymous_users.count(), prisma.api_keys.count(), prisma.application_requests.count(), prisma.ar_internal_metadata.count(), prisma.backup_draft_posts.count(), prisma.backup_draft_topics.count(), prisma.backup_metadata.count(), prisma.badge_groupings.count(), prisma.badges.count(), prisma.badge_types.count(), prisma.bookmarks.count(), prisma.categories.count(), prisma.categories_web_hooks.count(), prisma.category_custom_fields.count(), prisma.category_featured_topics.count(), prisma.category_groups.count(), prisma.category_search_data.count(), prisma.category_tag_groups.count(), prisma.category_tags.count(), prisma.category_tag_stats.count(), prisma.category_users.count(), prisma.child_themes.count(), prisma.color_scheme_colors.count(), prisma.color_schemes.count(), prisma.custom_emojis.count(), prisma.developers.count(), prisma.directory_items.count(), prisma.drafts.count(), prisma.draft_sequences.count(), prisma.email_change_requests.count(), prisma.email_logs.count(), prisma.email_tokens.count(), prisma.embeddable_hosts.count(), prisma.github_user_infos.count(), prisma.given_daily_likes.count(), prisma.group_archived_messages.count(), prisma.group_custom_fields.count(), prisma.group_histories.count(), prisma.group_mentions.count(), prisma.group_requests.count(), prisma.groups.count(), prisma.groups_web_hooks.count(), prisma.group_users.count(), prisma.ignored_users.count(), prisma.incoming_domains.count(), 
prisma.incoming_emails.count(), prisma.incoming_links.count(), prisma.incoming_referers.count(), prisma.invited_groups.count(), prisma.invites.count(), prisma.javascript_caches.count(), prisma.message_bus.count(), prisma.muted_users.count(), prisma.notifications.count(), prisma.oauth2_user_infos.count(), prisma.onceoff_logs.count(), prisma.optimized_images.count(), prisma.permalinks.count(), prisma.plugin_store_rows.count(), prisma.poll_options.count(), prisma.polls.count(), prisma.poll_votes.count(), prisma.post_actions.count(), prisma.post_action_types.count(), prisma.post_custom_fields.count(), prisma.post_details.count(), prisma.post_replies.count(), prisma.post_reply_keys.count(), prisma.post_revisions.count(), prisma.posts.count(), prisma.post_search_data.count(), prisma.post_stats.count(), prisma.post_timings.count(), prisma.post_uploads.count(), prisma.published_pages.count(), prisma.push_subscriptions.count(), prisma.quoted_posts.count(), prisma.remote_themes.count(), prisma.reviewable_claimed_topics.count(), prisma.reviewable_histories.count(), prisma.reviewables.count(), prisma.reviewable_scores.count(), prisma.scheduler_stats.count(), prisma.schema_migration_details.count(), prisma.schema_migrations.count(), prisma.screened_emails.count(), prisma.screened_ip_addresses.count(), prisma.screened_urls.count(), prisma.search_logs.count(), prisma.shared_drafts.count(), prisma.single_sign_on_records.count(), prisma.site_settings.count(), prisma.skipped_email_logs.count(), prisma.stylesheet_cache.count(), prisma.tag_group_memberships.count(), prisma.tag_group_permissions.count(), prisma.tag_groups.count(), prisma.tags.count(), prisma.tag_search_data.count(), prisma.tags_web_hooks.count(), prisma.tag_users.count(), prisma.theme_fields.count(), prisma.theme_modifier_sets.count(), prisma.themes.count(), prisma.theme_settings.count(), prisma.theme_translation_overrides.count(), prisma.topic_allowed_groups.count(), prisma.topic_allowed_users.count(), 
prisma.topic_custom_fields.count(), prisma.topic_embeds.count(), prisma.topic_groups.count(), prisma.topic_invites.count(), prisma.topic_link_clicks.count(), prisma.topic_links.count(), prisma.topics.count(), prisma.topic_search_data.count(), prisma.topic_tags.count(), prisma.topic_timers.count(), prisma.topic_users.count(), prisma.topic_views.count(), prisma.top_topics.count(), prisma.translation_overrides.count(), prisma.unsubscribe_keys.count(), prisma.uploads.count(), prisma.user_actions.count(), prisma.user_api_keys.count(), prisma.user_archived_messages.count(), prisma.user_associated_accounts.count(), prisma.user_auth_token_logs.count(), prisma.user_auth_tokens.count(), prisma.user_avatars.count(), prisma.user_badges.count(), prisma.user_custom_fields.count(), prisma.user_emails.count(), prisma.user_exports.count(), prisma.user_field_options.count(), prisma.user_fields.count(), prisma.user_histories.count(), prisma.user_open_ids.count(), prisma.user_options.count(), prisma.user_profiles.count(), prisma.user_profile_views.count(), prisma.users.count(), prisma.user_search_data.count(), prisma.user_second_factors.count(), prisma.user_security_keys.count(), prisma.user_stats.count(), prisma.user_uploads.count(), prisma.user_visits.count(), prisma.user_warnings.count(), prisma.watched_words.count(), prisma.web_crawler_requests.count(), prisma.web_hook_events.count(), prisma.web_hook_event_types.count(), prisma.web_hook_event_types_hooks.count(), prisma.web_hooks.count(), ]) } main() .catch(async (e) => { console.log(e) }) .finally(async () => { await prisma.disconnect() }) ``` </details> 3. Run script using `ts-node index.ts`. 4. 
You should see this error: ``` PrismaClientUnknownRequestError: Invalid `prisma.web_hooks.count()` invocation in /Users/siddhant/Code/Tests/experiments/index.ts:163:22 159 prisma.web_crawler_requests.count(), 160 prisma.web_hook_events.count(), 161 prisma.web_hook_event_types.count(), 162 prisma.web_hook_event_types_hooks.count(), → 163 prisma.web_hooks.count() PANIC: Unable to resolve a primary identifier for model poll_votes. at PrismaClientFetcher.message (/Users/siddhant/Code/Tests/experiments/node_modules/@prisma/client/src/runtime/getPrismaClient.ts:642:46) at processTicksAndRejections (internal/process/task_queues.js:97:5) Error in Prisma Client: PANIC: Unable to resolve a primary identifier for model poll_votes. in libs/prisma-models/src/model.rs:94:30 This is a non-recoverable error which probably happens when the Prisma Query Engine has a panic. Please create an issue in https://github.com/prisma/prisma-client-js describing the last Prisma Client query you called. ``` If you try running the reported failing query individually (`prisma.web_hooks.count()`), then everything's fine, so I'm assuming this only occurs when there are a bunch of queries run at the same time (I might be wrong though) Note that this exact query is what is sent out from Studio, which causes it to crash. Running this with DEBUG=* spits out a bunch of errors (they're a bit too long to paste in here, but you should be able to see them yourself if this script crashes for you too) Since these credentials are sensitive, I cannot create a reproduction repo, but I can send over an archive on Slack if needed. ## Expected behavior No crash. 
## Prisma information Same as above ## Environment & setup <!-- In which environment does the problem occur --> - OS: macOS - Database: Postgres - Prisma version: ``` @prisma/cli : 2.0.0-alpha.1142 Current platform : darwin Query Engine : query-engine 7120ecb512fc5ebe187a8cfc7b5d650bb51f7271 Migration Engine : migration-engine-cli 7120ecb512fc5ebe187a8cfc7b5d650bb51f7271 Introspection Engine : introspection-core 7120ecb512fc5ebe187a8cfc7b5d650bb51f7271 ``` - Node.js version: v13.10.1
1.0
Multiple quick `count` requests result in `Invalid prisma.web_hooks.count invocation` - ## Bug description After introspecting our Discourse database (credentials on 1Password), and running Studio, Studio crashes. I ran the faulty query in a separate script and that crashes too, so I might have found a Prisma Client issue. ## How to reproduce 1. Introspect our Discourse DB (credentials on 1Password) 2. Generate client, create a TS script with content: <details> <summary>Expand</summary> ```typescript import { PrismaClient } from '@prisma/client' const prisma = new PrismaClient() const main = async () => { await Promise.all([ prisma.anonymous_users.count(), prisma.api_keys.count(), prisma.application_requests.count(), prisma.ar_internal_metadata.count(), prisma.backup_draft_posts.count(), prisma.backup_draft_topics.count(), prisma.backup_metadata.count(), prisma.badge_groupings.count(), prisma.badges.count(), prisma.badge_types.count(), prisma.bookmarks.count(), prisma.categories.count(), prisma.categories_web_hooks.count(), prisma.category_custom_fields.count(), prisma.category_featured_topics.count(), prisma.category_groups.count(), prisma.category_search_data.count(), prisma.category_tag_groups.count(), prisma.category_tags.count(), prisma.category_tag_stats.count(), prisma.category_users.count(), prisma.child_themes.count(), prisma.color_scheme_colors.count(), prisma.color_schemes.count(), prisma.custom_emojis.count(), prisma.developers.count(), prisma.directory_items.count(), prisma.drafts.count(), prisma.draft_sequences.count(), prisma.email_change_requests.count(), prisma.email_logs.count(), prisma.email_tokens.count(), prisma.embeddable_hosts.count(), prisma.github_user_infos.count(), prisma.given_daily_likes.count(), prisma.group_archived_messages.count(), prisma.group_custom_fields.count(), prisma.group_histories.count(), prisma.group_mentions.count(), prisma.group_requests.count(), prisma.groups.count(), prisma.groups_web_hooks.count(), 
prisma.group_users.count(), prisma.ignored_users.count(), prisma.incoming_domains.count(), prisma.incoming_emails.count(), prisma.incoming_links.count(), prisma.incoming_referers.count(), prisma.invited_groups.count(), prisma.invites.count(), prisma.javascript_caches.count(), prisma.message_bus.count(), prisma.muted_users.count(), prisma.notifications.count(), prisma.oauth2_user_infos.count(), prisma.onceoff_logs.count(), prisma.optimized_images.count(), prisma.permalinks.count(), prisma.plugin_store_rows.count(), prisma.poll_options.count(), prisma.polls.count(), prisma.poll_votes.count(), prisma.post_actions.count(), prisma.post_action_types.count(), prisma.post_custom_fields.count(), prisma.post_details.count(), prisma.post_replies.count(), prisma.post_reply_keys.count(), prisma.post_revisions.count(), prisma.posts.count(), prisma.post_search_data.count(), prisma.post_stats.count(), prisma.post_timings.count(), prisma.post_uploads.count(), prisma.published_pages.count(), prisma.push_subscriptions.count(), prisma.quoted_posts.count(), prisma.remote_themes.count(), prisma.reviewable_claimed_topics.count(), prisma.reviewable_histories.count(), prisma.reviewables.count(), prisma.reviewable_scores.count(), prisma.scheduler_stats.count(), prisma.schema_migration_details.count(), prisma.schema_migrations.count(), prisma.screened_emails.count(), prisma.screened_ip_addresses.count(), prisma.screened_urls.count(), prisma.search_logs.count(), prisma.shared_drafts.count(), prisma.single_sign_on_records.count(), prisma.site_settings.count(), prisma.skipped_email_logs.count(), prisma.stylesheet_cache.count(), prisma.tag_group_memberships.count(), prisma.tag_group_permissions.count(), prisma.tag_groups.count(), prisma.tags.count(), prisma.tag_search_data.count(), prisma.tags_web_hooks.count(), prisma.tag_users.count(), prisma.theme_fields.count(), prisma.theme_modifier_sets.count(), prisma.themes.count(), prisma.theme_settings.count(), 
prisma.theme_translation_overrides.count(), prisma.topic_allowed_groups.count(), prisma.topic_allowed_users.count(), prisma.topic_custom_fields.count(), prisma.topic_embeds.count(), prisma.topic_groups.count(), prisma.topic_invites.count(), prisma.topic_link_clicks.count(), prisma.topic_links.count(), prisma.topics.count(), prisma.topic_search_data.count(), prisma.topic_tags.count(), prisma.topic_timers.count(), prisma.topic_users.count(), prisma.topic_views.count(), prisma.top_topics.count(), prisma.translation_overrides.count(), prisma.unsubscribe_keys.count(), prisma.uploads.count(), prisma.user_actions.count(), prisma.user_api_keys.count(), prisma.user_archived_messages.count(), prisma.user_associated_accounts.count(), prisma.user_auth_token_logs.count(), prisma.user_auth_tokens.count(), prisma.user_avatars.count(), prisma.user_badges.count(), prisma.user_custom_fields.count(), prisma.user_emails.count(), prisma.user_exports.count(), prisma.user_field_options.count(), prisma.user_fields.count(), prisma.user_histories.count(), prisma.user_open_ids.count(), prisma.user_options.count(), prisma.user_profiles.count(), prisma.user_profile_views.count(), prisma.users.count(), prisma.user_search_data.count(), prisma.user_second_factors.count(), prisma.user_security_keys.count(), prisma.user_stats.count(), prisma.user_uploads.count(), prisma.user_visits.count(), prisma.user_warnings.count(), prisma.watched_words.count(), prisma.web_crawler_requests.count(), prisma.web_hook_events.count(), prisma.web_hook_event_types.count(), prisma.web_hook_event_types_hooks.count(), prisma.web_hooks.count(), ]) } main() .catch(async (e) => { console.log(e) }) .finally(async () => { await prisma.disconnect() }) ``` </details> 3. Run script using `ts-node index.ts`. 4. 
You should see this error: ``` PrismaClientUnknownRequestError: Invalid `prisma.web_hooks.count()` invocation in /Users/siddhant/Code/Tests/experiments/index.ts:163:22 159 prisma.web_crawler_requests.count(), 160 prisma.web_hook_events.count(), 161 prisma.web_hook_event_types.count(), 162 prisma.web_hook_event_types_hooks.count(), → 163 prisma.web_hooks.count() PANIC: Unable to resolve a primary identifier for model poll_votes. at PrismaClientFetcher.message (/Users/siddhant/Code/Tests/experiments/node_modules/@prisma/client/src/runtime/getPrismaClient.ts:642:46) at processTicksAndRejections (internal/process/task_queues.js:97:5) Error in Prisma Client: PANIC: Unable to resolve a primary identifier for model poll_votes. in libs/prisma-models/src/model.rs:94:30 This is a non-recoverable error which probably happens when the Prisma Query Engine has a panic. Please create an issue in https://github.com/prisma/prisma-client-js describing the last Prisma Client query you called. ``` If you try running the reported failing query individually (`prisma.web_hooks.count()`), then everything's fine, so I'm assuming this only occurs when there are a bunch of queries run at the same time (I might be wrong though) Note that this exact query is what is sent out from Studio, which causes it to crash. Running this with DEBUG=* spits out a bunch of errors (they're a bit too long to paste in here, but you should be able to see them yourself if this script crashes for you too) Since these credentials are sensitive, I cannot create a reproduction repo, but I can send over an archive on Slack if needed. ## Expected behavior No crash. 
## Prisma information Same as above ## Environment & setup <!-- In which environment does the problem occur --> - OS: macOS - Database: Postgres - Prisma version: ``` @prisma/cli : 2.0.0-alpha.1142 Current platform : darwin Query Engine : query-engine 7120ecb512fc5ebe187a8cfc7b5d650bb51f7271 Migration Engine : migration-engine-cli 7120ecb512fc5ebe187a8cfc7b5d650bb51f7271 Introspection Engine : introspection-core 7120ecb512fc5ebe187a8cfc7b5d650bb51f7271 ``` - Node.js version: v13.10.1
process
multiple quick count requests result in invalid prisma web hooks count invocation bug description after introspecting our discourse database credentials on and running studio studio crashes i ran the faulty query in a separate script and that crashes too so i might have found a prisma client issue how to reproduce introspect our discourse db credentials on generate client create a ts script with content expand typescript import prismaclient from prisma client const prisma new prismaclient const main async await promise all prisma anonymous users count prisma api keys count prisma application requests count prisma ar internal metadata count prisma backup draft posts count prisma backup draft topics count prisma backup metadata count prisma badge groupings count prisma badges count prisma badge types count prisma bookmarks count prisma categories count prisma categories web hooks count prisma category custom fields count prisma category featured topics count prisma category groups count prisma category search data count prisma category tag groups count prisma category tags count prisma category tag stats count prisma category users count prisma child themes count prisma color scheme colors count prisma color schemes count prisma custom emojis count prisma developers count prisma directory items count prisma drafts count prisma draft sequences count prisma email change requests count prisma email logs count prisma email tokens count prisma embeddable hosts count prisma github user infos count prisma given daily likes count prisma group archived messages count prisma group custom fields count prisma group histories count prisma group mentions count prisma group requests count prisma groups count prisma groups web hooks count prisma group users count prisma ignored users count prisma incoming domains count prisma incoming emails count prisma incoming links count prisma incoming referers count prisma invited groups count prisma invites count prisma javascript caches 
count prisma message bus count prisma muted users count prisma notifications count prisma user infos count prisma onceoff logs count prisma optimized images count prisma permalinks count prisma plugin store rows count prisma poll options count prisma polls count prisma poll votes count prisma post actions count prisma post action types count prisma post custom fields count prisma post details count prisma post replies count prisma post reply keys count prisma post revisions count prisma posts count prisma post search data count prisma post stats count prisma post timings count prisma post uploads count prisma published pages count prisma push subscriptions count prisma quoted posts count prisma remote themes count prisma reviewable claimed topics count prisma reviewable histories count prisma reviewables count prisma reviewable scores count prisma scheduler stats count prisma schema migration details count prisma schema migrations count prisma screened emails count prisma screened ip addresses count prisma screened urls count prisma search logs count prisma shared drafts count prisma single sign on records count prisma site settings count prisma skipped email logs count prisma stylesheet cache count prisma tag group memberships count prisma tag group permissions count prisma tag groups count prisma tags count prisma tag search data count prisma tags web hooks count prisma tag users count prisma theme fields count prisma theme modifier sets count prisma themes count prisma theme settings count prisma theme translation overrides count prisma topic allowed groups count prisma topic allowed users count prisma topic custom fields count prisma topic embeds count prisma topic groups count prisma topic invites count prisma topic link clicks count prisma topic links count prisma topics count prisma topic search data count prisma topic tags count prisma topic timers count prisma topic users count prisma topic views count prisma top topics count prisma translation overrides 
count prisma unsubscribe keys count prisma uploads count prisma user actions count prisma user api keys count prisma user archived messages count prisma user associated accounts count prisma user auth token logs count prisma user auth tokens count prisma user avatars count prisma user badges count prisma user custom fields count prisma user emails count prisma user exports count prisma user field options count prisma user fields count prisma user histories count prisma user open ids count prisma user options count prisma user profiles count prisma user profile views count prisma users count prisma user search data count prisma user second factors count prisma user security keys count prisma user stats count prisma user uploads count prisma user visits count prisma user warnings count prisma watched words count prisma web crawler requests count prisma web hook events count prisma web hook event types count prisma web hook event types hooks count prisma web hooks count main catch async e console log e finally async await prisma disconnect run script using ts node index ts you should see this error prismaclientunknownrequesterror invalid prisma web hooks count invocation in users siddhant code tests experiments index ts prisma web crawler requests count prisma web hook events count prisma web hook event types count prisma web hook event types hooks count → prisma web hooks count panic unable to resolve a primary identifier for model poll votes at prismaclientfetcher message users siddhant code tests experiments node modules prisma client src runtime getprismaclient ts at processticksandrejections internal process task queues js error in prisma client panic unable to resolve a primary identifier for model poll votes in libs prisma models src model rs this is a non recoverable error which probably happens when the prisma query engine has a panic please create an issue in describing the last prisma client query you called if you try running the reported failing query 
individually prisma web hooks count then everything s fine so i m assuming this only occurs when there are a bunch of queries run at the same time i might be wrong though note that this exact query is what is sent out from studio which causes it to crash running this with debug spits out a bunch of errors they re a bit too long to paste in here but you should be able to see them yourself if this script crashes for you too since these credentials are sensitive i cannot create a reproduction repo but i can send over an archive on slack if needed expected behavior no crash prisma information same as above environment setup os macos database postgres prisma version prisma cli alpha current platform darwin query engine query engine migration engine migration engine cli introspection engine introspection core node js version
1
8,918
12,025,366,877
IssuesEvent
2020-04-12 08:50:12
Arch666Angel/mods
https://api.github.com/repos/Arch666Angel/mods
closed
crafting time for fish water
Angels Bio Processing Enhancement
check ratio for adv chem plant vs fish tank, seems like it's 1:2 (?) it should require less chem plants, so maybe increase amount/recipe a bit if this is the case
1.0
crafting time for fish water - check ratio for adv chem plant vs fish tank, seems like it's 1:2 (?) it should require less chem plants, so maybe increase amount/recipe a bit if this is the case
process
crafting time for fish water check ratio for adv chem plant vs fish tank seems like it s it should require less chem plants so maybe increase amount recipe a bit if this is the case
1
6,219
9,160,219,918
IssuesEvent
2019-03-01 06:30:07
ga4gh/dockstore
https://api.github.com/repos/ga4gh/dockstore
closed
Complete NIH insider training
process
See https://wiki.oicr.on.ca/display/DOC/Notes+on+NIH+Training for accompanying notes ┆Issue is synchronized with this [Jira Story](https://ucsc-cgl.atlassian.net/browse/DOCK-227) ┆Issue Number: DOCK-227
1.0
Complete NIH insider training - See https://wiki.oicr.on.ca/display/DOC/Notes+on+NIH+Training for accompanying notes ┆Issue is synchronized with this [Jira Story](https://ucsc-cgl.atlassian.net/browse/DOCK-227) ┆Issue Number: DOCK-227
process
complete nih insider training see for accompanying notes ┆issue is synchronized with this ┆issue number dock
1
98,363
16,373,809,966
IssuesEvent
2021-05-15 17:39:10
hugh-whitesource/NodeGoat-1
https://api.github.com/repos/hugh-whitesource/NodeGoat-1
opened
WS-2018-0084 (High) detected in sshpk-1.10.1.tgz
security vulnerability
## WS-2018-0084 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>sshpk-1.10.1.tgz</b></p></summary> <p>A library for finding and using SSH public keys</p> <p>Library home page: <a href="https://registry.npmjs.org/sshpk/-/sshpk-1.10.1.tgz">https://registry.npmjs.org/sshpk/-/sshpk-1.10.1.tgz</a></p> <p>Path to dependency file: NodeGoat-1/package.json</p> <p>Path to vulnerable library: NodeGoat-1/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/sshpk/package.json</p> <p> Dependency Hierarchy: - grunt-npm-install-0.3.1.tgz (Root Library) - npm-3.10.10.tgz - request-2.75.0.tgz - http-signature-1.1.1.tgz - :x: **sshpk-1.10.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/hugh-whitesource/NodeGoat-1/commit/1acb8446b41e455d2f087e892c9a9ce80609f601">1acb8446b41e455d2f087e892c9a9ce80609f601</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Versions of sshpk before 1.14.1 are vulnerable to regular expression denial of service when parsing crafted invalid public keys. 
<p>Publish Date: 2018-04-25 <p>URL: <a href=https://github.com/joyent/node-sshpk/blob/v1.13.1/lib/formats/ssh.js#L17>WS-2018-0084</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nodesecurity.io/advisories/606">https://nodesecurity.io/advisories/606</a></p> <p>Release Date: 2018-01-27</p> <p>Fix Resolution: 1.14.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"sshpk","packageVersion":"1.10.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-npm-install:0.3.1;npm:3.10.10;request:2.75.0;http-signature:1.1.1;sshpk:1.10.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.14.1"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2018-0084","vulnerabilityDetails":"Versions of sshpk before 1.14.1 are vulnerable to regular expression denial of service when parsing crafted invalid public keys.\n\n","vulnerabilityUrl":"https://github.com/joyent/node-sshpk/blob/v1.13.1/lib/formats/ssh.js#L17","cvss2Severity":"high","cvss2Score":"8.0","extraData":{}}</REMEDIATE> -->
True
WS-2018-0084 (High) detected in sshpk-1.10.1.tgz - ## WS-2018-0084 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>sshpk-1.10.1.tgz</b></p></summary> <p>A library for finding and using SSH public keys</p> <p>Library home page: <a href="https://registry.npmjs.org/sshpk/-/sshpk-1.10.1.tgz">https://registry.npmjs.org/sshpk/-/sshpk-1.10.1.tgz</a></p> <p>Path to dependency file: NodeGoat-1/package.json</p> <p>Path to vulnerable library: NodeGoat-1/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/sshpk/package.json</p> <p> Dependency Hierarchy: - grunt-npm-install-0.3.1.tgz (Root Library) - npm-3.10.10.tgz - request-2.75.0.tgz - http-signature-1.1.1.tgz - :x: **sshpk-1.10.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/hugh-whitesource/NodeGoat-1/commit/1acb8446b41e455d2f087e892c9a9ce80609f601">1acb8446b41e455d2f087e892c9a9ce80609f601</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Versions of sshpk before 1.14.1 are vulnerable to regular expression denial of service when parsing crafted invalid public keys. 
<p>Publish Date: 2018-04-25 <p>URL: <a href=https://github.com/joyent/node-sshpk/blob/v1.13.1/lib/formats/ssh.js#L17>WS-2018-0084</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nodesecurity.io/advisories/606">https://nodesecurity.io/advisories/606</a></p> <p>Release Date: 2018-01-27</p> <p>Fix Resolution: 1.14.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"sshpk","packageVersion":"1.10.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-npm-install:0.3.1;npm:3.10.10;request:2.75.0;http-signature:1.1.1;sshpk:1.10.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.14.1"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2018-0084","vulnerabilityDetails":"Versions of sshpk before 1.14.1 are vulnerable to regular expression denial of service when parsing crafted invalid public keys.\n\n","vulnerabilityUrl":"https://github.com/joyent/node-sshpk/blob/v1.13.1/lib/formats/ssh.js#L17","cvss2Severity":"high","cvss2Score":"8.0","extraData":{}}</REMEDIATE> -->
non_process
ws high detected in sshpk tgz ws high severity vulnerability vulnerable library sshpk tgz a library for finding and using ssh public keys library home page a href path to dependency file nodegoat package json path to vulnerable library nodegoat node modules npm node modules request node modules http signature node modules sshpk package json dependency hierarchy grunt npm install tgz root library npm tgz request tgz http signature tgz x sshpk tgz vulnerable library found in head commit a href found in base branch master vulnerability details versions of sshpk before are vulnerable to regular expression denial of service when parsing crafted invalid public keys publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt npm install npm request http signature sshpk isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier ws vulnerabilitydetails versions of sshpk before are vulnerable to regular expression denial of service when parsing crafted invalid public keys n n vulnerabilityurl
0
7,656
10,740,478,463
IssuesEvent
2019-10-29 18:15:58
brucemiller/LaTeXML
https://api.github.com/repos/brucemiller/LaTeXML
opened
Pluggable and configurable post-processing
enhancement postprocessing
This issue targets specifically the post-processing pipeline of latexml, as part of a larger configuration effort for latexml. We want to make it possible to write plugins with additional post-processors, and have them execute in the correct order. For this we need to provide several pieces of specification and documentation: * We need named, and ordered, post-processing stages * We need a customization language that allows to declare plugins as post-processors of a certain stage. Potentially ties with the `--profile` mechanism. * We need to refactor the existing post-processors as components of this pipeline * Likely we need to formalize and declare all pieces available during post-processing, and make them available to each stage. Some of them are: - source directory - target directory - document object - cache file - asset list? Maybe left implicit rather than maintained? In any case, we should have an issue filed as Bruce treats this as a short-term implementation goal.
1.0
Pluggable and configurable post-processing - This issue targets specifically the post-processing pipeline of latexml, as part of a larger configuration effort for latexml. We want to make it possible to write plugins with additional post-processors, and have them execute in the correct order. For this we need to provide several pieces of specification and documentation: * We need named, and ordered, post-processing stages * We need a customization language that allows to declare plugins as post-processors of a certain stage. Potentially ties with the `--profile` mechanism. * We need to refactor the existing post-processors as components of this pipeline * Likely we need to formalize and declare all pieces available during post-processing, and make them available to each stage. Some of them are: - source directory - target directory - document object - cache file - asset list? Maybe left implicit rather than maintained? In any case, we should have an issue filed as Bruce treats this as a short-term implementation goal.
process
pluggable and configurable post processing this issue targets specifically the post processing pipeline of latexml as part of a larger configuration effort for latexml we want to make it possible to write plugins with additional post processors and have them execute in the correct order for this we need to provide several pieces of specification and documentation we need named and ordered post processing stages we need a customization language that allows to declare plugins as post processors of a certain stage potentially ties with the profile mechanism we need to refactor the existing post processors as components of this pipeline likely we need to formalize and declare all pieces available during post processing and make them available to each stage some of them are source directory target directory document object cache file asset list maybe left implicit rather than maintained in any case we should have an issue filed as bruce treats this as a short term implementation goal
1
7,242
10,410,295,279
IssuesEvent
2019-09-13 10:59:39
googleapis/google-cloud-dotnet
https://api.github.com/repos/googleapis/google-cloud-dotnet
closed
Update MultiRegional and Regional to Standard
type: docs type: process
(I don't have assign privileges in this repository but this should be assigned to @dopiera ) Previously, we implemented the concept of "Location Type" to move the concept of bucket location away from storage class. It is no longer considered correct to use "MultiRegional" or "Regional" as a storage class, instead those should be location types. As such, we need to update every instance of the "MultiRegional" and "Regional" storage in samples, library documentation, and tests to use "Standard" instead--Standard is a correct storage class. Note that "MultiRegional" and "Regional" should only be replaced with "Standard" in cases where they are used a storage class. If they're used as a location type, that's fine and doesn't need to be replaced. https://github.com/googleapis/google-cloud-dotnet/blob/d2aba1aa0fbcb7b9cb8cc4f603ce038ceaeed81e/apis/Google.Cloud.Storage.V1/Google.Cloud.Storage.V1.IntegrationTests/StorageClassTest.cs#L41 https://github.com/GoogleCloudPlatform/dotnet-docs-samples/search?q=regional&unscoped_q=regional
1.0
Update MultiRegional and Regional to Standard - (I don't have assign privileges in this repository but this should be assigned to @dopiera ) Previously, we implemented the concept of "Location Type" to move the concept of bucket location away from storage class. It is no longer considered correct to use "MultiRegional" or "Regional" as a storage class, instead those should be location types. As such, we need to update every instance of the "MultiRegional" and "Regional" storage in samples, library documentation, and tests to use "Standard" instead--Standard is a correct storage class. Note that "MultiRegional" and "Regional" should only be replaced with "Standard" in cases where they are used a storage class. If they're used as a location type, that's fine and doesn't need to be replaced. https://github.com/googleapis/google-cloud-dotnet/blob/d2aba1aa0fbcb7b9cb8cc4f603ce038ceaeed81e/apis/Google.Cloud.Storage.V1/Google.Cloud.Storage.V1.IntegrationTests/StorageClassTest.cs#L41 https://github.com/GoogleCloudPlatform/dotnet-docs-samples/search?q=regional&unscoped_q=regional
process
update multiregional and regional to standard i don t have assign privileges in this repository but this should be assigned to dopiera previously we implemented the concept of location type to move the concept of bucket location away from storage class it is no longer considered correct to use multiregional or regional as a storage class instead those should be location types as such we need to update every instance of the multiregional and regional storage in samples library documentation and tests to use standard instead standard is a correct storage class note that multiregional and regional should only be replaced with standard in cases where they are used a storage class if they re used as a location type that s fine and doesn t need to be replaced
1
82,292
7,836,568,963
IssuesEvent
2018-06-17 21:07:32
stuartmccoll/gitlab-changelog-generator
https://api.github.com/repos/stuartmccoll/gitlab-changelog-generator
opened
Add additional unit tests for generator.py
good first issue tests
The `generator.py` module doesn't have full unit test coverage; finish testing this module.
1.0
Add additional unit tests for generator.py - The `generator.py` module doesn't have full unit test coverage; finish testing this module.
non_process
add additional unit tests for generator py the generator py module doesn t have full unit test coverage finish testing this module
0
6,646
9,661,337,710
IssuesEvent
2019-05-20 17:45:54
isawnyu/isaw.web
https://api.github.com/repos/isawnyu/isaw.web
closed
fix footer to correct bad page heading hierarchy: 1
WCAG A deploy enhancement high priority requirement
We've done a lot of work to clean up skipped heading levels and related problems in our templates, but the footer still needs some attention. Specifically, there are three bits of text that are currently wrapped in ```<h3>``` tags (the "support" link and the "gallery hours" and "library hours" headings). These three components need to be wrapped in ```<h2>``` instead. That's easy for us to do via the "ISAW Settings" control panel -> "footer HTML" editor, but some CSS modifications are required in order to preserve current look and feel. There are currently two relevant CSS selectors: - ```.footer-row #portal-footer h3``` - ```.footer-row #portal-footer h3.support a``` Why not duplicate each of these for h2 to read as follows and see if that takes care of the need: - ```.footer-row #portal-footer h2``` - ```.footer-row #portal-footer h2.support a``` We don't anticipate using ```<h3>``` in the footer going forward.
1.0
fix footer to correct bad page heading hierarchy: 1 - We've done a lot of work to clean up skipped heading levels and related problems in our templates, but the footer still needs some attention. Specifically, there are three bits of text that are currently wrapped in ```<h3>``` tags (the "support" link and the "gallery hours" and "library hours" headings). These three components need to be wrapped in ```<h2>``` instead. That's easy for us to do via the "ISAW Settings" control panel -> "footer HTML" editor, but some CSS modifications are required in order to preserve current look and feel. There are currently two relevant CSS selectors: - ```.footer-row #portal-footer h3``` - ```.footer-row #portal-footer h3.support a``` Why not duplicate each of these for h2 to read as follows and see if that takes care of the need: - ```.footer-row #portal-footer h2``` - ```.footer-row #portal-footer h2.support a``` We don't anticipate using ```<h3>``` in the footer going forward.
non_process
fix footer to correct bad page heading hierarchy we ve done a lot of work to clean up skipped heading levels and related problems in our templates but the footer still needs some attention specifically there are three bits of text that are currently wrapped in tags the support link and the gallery hours and library hours headings these three components need to be wrapped in instead that s easy for us to do via the isaw settings control panel footer html editor but some css modifications are required in order to preserve current look and feel there are currently two relevant css selectors footer row portal footer footer row portal footer support a why not duplicate each of these for to read as follows and see if that takes care of the need footer row portal footer footer row portal footer support a we don t anticipate using in the footer going forward
0
87,191
15,756,293,260
IssuesEvent
2021-03-31 03:15:48
scriptex/webpack-mpa-ts
https://api.github.com/repos/scriptex/webpack-mpa-ts
closed
CVE-2021-23358 (High) detected in underscore-1.4.4.js, underscore-1.4.4.tgz
security vulnerability
## CVE-2021-23358 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>underscore-1.4.4.js</b>, <b>underscore-1.4.4.tgz</b></p></summary> <p> <details><summary><b>underscore-1.4.4.js</b></p></summary> <p>JavaScript's functional programming helper library.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/underscore.js/1.4.4/underscore.js">https://cdnjs.cloudflare.com/ajax/libs/underscore.js/1.4.4/underscore.js</a></p> <p>Path to dependency file: webpack-mpa-ts/node_modules/underscore/index.html</p> <p>Path to vulnerable library: webpack-mpa-ts/node_modules/underscore/underscore.js</p> <p> Dependency Hierarchy: - :x: **underscore-1.4.4.js** (Vulnerable Library) </details> <details><summary><b>underscore-1.4.4.tgz</b></p></summary> <p>JavaScript's functional programming helper library.</p> <p>Library home page: <a href="https://registry.npmjs.org/underscore/-/underscore-1.4.4.tgz">https://registry.npmjs.org/underscore/-/underscore-1.4.4.tgz</a></p> <p>Path to dependency file: webpack-mpa-ts/package.json</p> <p>Path to vulnerable library: webpack-mpa-ts/node_modules/underscore/package.json</p> <p> Dependency Hierarchy: - webpack-spritesmith-1.1.0.tgz (Root Library) - spritesheet-templates-10.5.0.tgz - :x: **underscore-1.4.4.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/scriptex/webpack-mpa-ts/commit/e93762befc68c8a9c15e63ba72692ae14eba1799">e93762befc68c8a9c15e63ba72692ae14eba1799</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package underscore from 1.13.0-0 and before 1.13.0-2, from 1.3.2 and before 1.12.1 are vulnerable to Arbitrary Code Execution via the template function, particularly 
when a variable property is passed as an argument as it is not sanitized. <p>Publish Date: 2021-03-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23358>CVE-2021-23358</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23358</a></p> <p>Release Date: 2021-03-29</p> <p>Fix Resolution: underscore - 1.12.1,1.13.0-2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-23358 (High) detected in underscore-1.4.4.js, underscore-1.4.4.tgz - ## CVE-2021-23358 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>underscore-1.4.4.js</b>, <b>underscore-1.4.4.tgz</b></p></summary> <p> <details><summary><b>underscore-1.4.4.js</b></p></summary> <p>JavaScript's functional programming helper library.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/underscore.js/1.4.4/underscore.js">https://cdnjs.cloudflare.com/ajax/libs/underscore.js/1.4.4/underscore.js</a></p> <p>Path to dependency file: webpack-mpa-ts/node_modules/underscore/index.html</p> <p>Path to vulnerable library: webpack-mpa-ts/node_modules/underscore/underscore.js</p> <p> Dependency Hierarchy: - :x: **underscore-1.4.4.js** (Vulnerable Library) </details> <details><summary><b>underscore-1.4.4.tgz</b></p></summary> <p>JavaScript's functional programming helper library.</p> <p>Library home page: <a href="https://registry.npmjs.org/underscore/-/underscore-1.4.4.tgz">https://registry.npmjs.org/underscore/-/underscore-1.4.4.tgz</a></p> <p>Path to dependency file: webpack-mpa-ts/package.json</p> <p>Path to vulnerable library: webpack-mpa-ts/node_modules/underscore/package.json</p> <p> Dependency Hierarchy: - webpack-spritesmith-1.1.0.tgz (Root Library) - spritesheet-templates-10.5.0.tgz - :x: **underscore-1.4.4.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/scriptex/webpack-mpa-ts/commit/e93762befc68c8a9c15e63ba72692ae14eba1799">e93762befc68c8a9c15e63ba72692ae14eba1799</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package underscore from 1.13.0-0 and before 1.13.0-2, from 1.3.2 and before 1.12.1 are 
vulnerable to Arbitrary Code Execution via the template function, particularly when a variable property is passed as an argument as it is not sanitized. <p>Publish Date: 2021-03-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23358>CVE-2021-23358</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23358</a></p> <p>Release Date: 2021-03-29</p> <p>Fix Resolution: underscore - 1.12.1,1.13.0-2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in underscore js underscore tgz cve high severity vulnerability vulnerable libraries underscore js underscore tgz underscore js javascript s functional programming helper library library home page a href path to dependency file webpack mpa ts node modules underscore index html path to vulnerable library webpack mpa ts node modules underscore underscore js dependency hierarchy x underscore js vulnerable library underscore tgz javascript s functional programming helper library library home page a href path to dependency file webpack mpa ts package json path to vulnerable library webpack mpa ts node modules underscore package json dependency hierarchy webpack spritesmith tgz root library spritesheet templates tgz x underscore tgz vulnerable library found in head commit a href found in base branch master vulnerability details the package underscore from and before from and before are vulnerable to arbitrary code execution via the template function particularly when a variable property is passed as an argument as it is not sanitized publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution underscore step up your open source security game with whitesource
0
3,353
6,486,747,113
IssuesEvent
2017-08-19 22:51:14
Great-Hill-Corporation/quickBlocks
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
closed
Need a way to upgrade binary file formats
monitors-cacheMan status-inprocess type-enhancement
I need to be able to upgrade binary file formats at will. It's built into the Serialize code already, but I don't know how to use it. I think an easy place to start this process is using the cacheMan code and the account caches. Consolidate with #226
1.0
Need a way to upgrade binary file formats - I need to be able to upgrade binary file formats at will. It's built into the Serialize code already, but I don't know how to use it. I think an easy place to start this process is using the cacheMan code and the account caches. Consolidate with #226
process
need a way to upgrade binary file formats i need to be able to upgrade binary file formats at will it s built into the serialize code already but i don t know how to use it i think an easy place to start this process is using the cacheman code and the account caches consolidate with
1
257,289
27,562,959,227
IssuesEvent
2023-03-08 00:05:51
opensearch-project/oui
https://api.github.com/repos/opensearch-project/oui
closed
yo-4.3.0.tgz: 1 vulnerabilities (highest severity is: 7.5) - autoclosed
Mend: dependency security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>yo-4.3.0.tgz</b></p></summary> <p></p> <p> <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/oui/commit/bbca6f5de4b03c3c57a1333400bba2178b3aa99e">bbca6f5de4b03c3c57a1333400bba2178b3aa99e</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (yo version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-25881](https://www.mend.io/vulnerability-database/CVE-2022-25881) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | http-cache-semantics-4.1.0.tgz | Transitive | N/A* | &#10060; | <p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p> ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-25881</summary> ### Vulnerable Library - <b>http-cache-semantics-4.1.0.tgz</b></p> <p>Parses Cache-Control and other headers. 
Helps building correct HTTP caches and proxies</p> <p>Library home page: <a href="https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz">https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz</a></p> <p> Dependency Hierarchy: - yo-4.3.0.tgz (Root Library) - got-11.8.5.tgz - cacheable-request-7.0.2.tgz - :x: **http-cache-semantics-4.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/oui/commit/bbca6f5de4b03c3c57a1333400bba2178b3aa99e">bbca6f5de4b03c3c57a1333400bba2178b3aa99e</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> This affects versions of the package http-cache-semantics before 4.1.1. The issue can be exploited via malicious request header values sent to a server, when that server reads the cache policy from the request using this library. <p>Publish Date: 2023-01-31 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25881>CVE-2022-25881</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-25881">https://www.cve.org/CVERecord?id=CVE-2022-25881</a></p> <p>Release Date: 2023-01-31</p> <p>Fix Resolution: http-cache-semantics - 4.1.1</p> </p> <p></p> </details>
True
yo-4.3.0.tgz: 1 vulnerabilities (highest severity is: 7.5) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>yo-4.3.0.tgz</b></p></summary> <p></p> <p> <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/oui/commit/bbca6f5de4b03c3c57a1333400bba2178b3aa99e">bbca6f5de4b03c3c57a1333400bba2178b3aa99e</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (yo version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-25881](https://www.mend.io/vulnerability-database/CVE-2022-25881) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | http-cache-semantics-4.1.0.tgz | Transitive | N/A* | &#10060; | <p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p> ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-25881</summary> ### Vulnerable Library - <b>http-cache-semantics-4.1.0.tgz</b></p> <p>Parses Cache-Control and other headers. 
Helps building correct HTTP caches and proxies</p> <p>Library home page: <a href="https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz">https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz</a></p> <p> Dependency Hierarchy: - yo-4.3.0.tgz (Root Library) - got-11.8.5.tgz - cacheable-request-7.0.2.tgz - :x: **http-cache-semantics-4.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/oui/commit/bbca6f5de4b03c3c57a1333400bba2178b3aa99e">bbca6f5de4b03c3c57a1333400bba2178b3aa99e</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> This affects versions of the package http-cache-semantics before 4.1.1. The issue can be exploited via malicious request header values sent to a server, when that server reads the cache policy from the request using this library. <p>Publish Date: 2023-01-31 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25881>CVE-2022-25881</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-25881">https://www.cve.org/CVERecord?id=CVE-2022-25881</a></p> <p>Release Date: 2023-01-31</p> <p>Fix Resolution: http-cache-semantics - 4.1.1</p> </p> <p></p> </details>
non_process
yo tgz vulnerabilities highest severity is autoclosed vulnerable library yo tgz found in head commit a href vulnerabilities cve severity cvss dependency type fixed in yo version remediation available high http cache semantics tgz transitive n a for some transitive vulnerabilities there is no version of direct dependency with a fix check the section details below to see if there is a version of transitive dependency where vulnerability is fixed details cve vulnerable library http cache semantics tgz parses cache control and other headers helps building correct http caches and proxies library home page a href dependency hierarchy yo tgz root library got tgz cacheable request tgz x http cache semantics tgz vulnerable library found in head commit a href found in base branch main vulnerability details this affects versions of the package http cache semantics before the issue can be exploited via malicious request header values sent to a server when that server reads the cache policy from the request using this library publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution http cache semantics
0
501,654
14,530,643,259
IssuesEvent
2020-12-14 19:34:03
ChainSafe/forest
https://api.github.com/repos/ChainSafe/forest
closed
Test Block Processing from Storage Miner: Perform state transitions for block received from Storage Miner
Priority: 3 - Medium Storage Miner
When a block is received over RPC (from Storage Miner), the syncer must be notified so that the system can sync towards it. Notes - We don't have a mechanism to notify the syncer as of now (we don't have access to the syncer from RPC). What is the best way to do this? A channel or a network event? This should be all there is to do. - A block from RPC or pubsub should be handled the same essentially (although there are peers to deal with on pubsub and track their heads, but that doesn't affect this issue)
1.0
Test Block Processing from Storage Miner: Perform state transitions for block received from Storage Miner - When a block is received over RPC (from Storage Miner), the syncer must be notified so that the system can sync towards it. Notes - We don't have a mechanism to notify the syncer as of now (we don't have access to the syncer from RPC). What is the best way to do this? A channel or a network event? This should be all there is to do. - A block from RPC or pubsub should be handled the same essentially (although there are peers to deal with on pubsub and track their heads, but that doesn't affect this issue)
non_process
test block processing from storage miner perform state transitions for block received from storage miner when a block is received over rpc from storage miner the syncer must be notified so that the system can sync towards it notes we don t have a mechanism to notify the syncer as of now we don t have access to the syncer from rpc what is the best way to do this a channel or a network event this should be all there is to do a block from rpc or pubsub should be handled the same essentially although there are peers to deal with on pubsub and track their heads but that doesn t affect this issue
0
6,537
9,634,579,785
IssuesEvent
2019-05-15 21:39:09
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Very helpful
automation/svc cxp kudos process-automation/subsvc triaged
Thank you for documenting this. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 5622fd71-2727-a211-ed28-7be30f116e1a * Version Independent ID: 148cd7ae-766e-b2ea-200f-5c4b5306a310 * Content: [Troubleshooting Starting and Stopping VMs with Azure Automation](https://docs.microsoft.com/en-us/azure/automation/troubleshoot/start-stop-vm#feedback) * Content Source: [articles/automation/troubleshoot/start-stop-vm.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/troubleshoot/start-stop-vm.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @georgewallace * Microsoft Alias: **gwallace**
1.0
Very helpful - Thank you for documenting this. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 5622fd71-2727-a211-ed28-7be30f116e1a * Version Independent ID: 148cd7ae-766e-b2ea-200f-5c4b5306a310 * Content: [Troubleshooting Starting and Stopping VMs with Azure Automation](https://docs.microsoft.com/en-us/azure/automation/troubleshoot/start-stop-vm#feedback) * Content Source: [articles/automation/troubleshoot/start-stop-vm.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/troubleshoot/start-stop-vm.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @georgewallace * Microsoft Alias: **gwallace**
process
very helpful thank you for documenting this document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login georgewallace microsoft alias gwallace
1
502,472
14,546,835,206
IssuesEvent
2020-12-15 21:53:38
GoogleChrome/lighthouse
https://api.github.com/repos/GoogleChrome/lighthouse
closed
Remove h2 'push' suggestion from audit description
needs-priority
https://github.com/GoogleChrome/lighthouse/blob/378a31f8117d20c852562514612c80ea12892c54/lighthouse-core%2Faudits%2Fdobetterweb%2Fuses-http2.js#L31 The general consensus is that h2 push was a mistake. At best, it is very hard to get right, and easy to accidentally result in worse performance. the web.dev article doesn't even mention push.
1.0
Remove h2 'push' suggestion from audit description - https://github.com/GoogleChrome/lighthouse/blob/378a31f8117d20c852562514612c80ea12892c54/lighthouse-core%2Faudits%2Fdobetterweb%2Fuses-http2.js#L31 The general consensus is that h2 push was a mistake. At best, it is very hard to get right, and easy to accidentally result in worse performance. the web.dev article doesn't even mention push.
non_process
remove push suggestion from audit description the general consensus is that push was a mistake at best it is very hard to get right and easy to accidentally result in worse performance the web dev article doesn t even mention push
0
9,766
12,749,657,855
IssuesEvent
2020-06-26 23:43:45
unicode-org/icu4x
https://api.github.com/repos/unicode-org/icu4x
closed
Code review model
C-process T-docs
I remember we talked about this on several occasions, but I don't think we ever clearly defined the approach to reviewing we want to take. For a bit I've been experimenting with one, that I'd like to suggest for us: > Every PR requires at least one review to be merged. > > Merging should be performed by the PR author. If the author wants to grant another team member rights to merge, they can state so in the PR comment. > > Review can be of either architectural or syntactical value, and often is of both. > The PR author can specify, when requesting review, what kind of review they are asking for, from each reviewer or even which area they'd like the reviewer to focus on (useful when distributing reviews). > > ## Reviewers role > > The reviewer is responsible for accepting only once they feel the current PR with their comments *not applied* is ready to be merged. In other words, the `accepted` can be set with pending review comments, if those comments don't affect whether the patch is ready to be merged (for example, they're stylistic suggestions). > > The reviewer is also responsible for indicating the nature of their review comments - specifically, between three types: "blocking", "suggestion", "optional". `Blocking` is when the reviewer considers the change to be unmergable and requires a new revision. `Suggestion` is for when the reviewer considers the change to be suboptimal, but usable, and wants to defer the decision to the PR author, while stating their opinion. `Suggestion` is for when the reviewer considers multiple options to be mostly comparable or tradeoffs, and wants to defer to the PR author for the final decision after bringing up a new option. > > In particular, reviewer should try to minimize their impact on the PR, and use the `blocking` conservatively. Their role is not to "improve the code" or "teach" the PR author, but rather help the PR author avoid making mistakes and lowering the quality of the code. 
In result most review comments should be in form of suggestions and optionals, and the reviewer should request another revision for review only if they feel the need to see it before they can evaluate if the PR is mergable. > > Lastly, the reviewers role is to evaluate the stakeholders group and ensure that the review coverage is complete - if they review only portion of the PR, or if they see the need for additional stakeholders to be involved, they should add additional reviewers, or CC them into the issue, depending on what kind of involvement they expect (inform vs verify). > > ## Selection of reviewers > > When the PR author creates a new PR, they should consider three sources of reviewers and CCed stakeholders: > > * Owners and peers of the component they work with > * People involved in the preceeding conversation > * Recognized experts in the domain the PR operates in > > The goal of the PR author is to find the subset of stakeholders that represent those three groups well. Depending on the scope and priority of the PR, the reviewers group size can be adjusted, with small PRs being sufficent for review by just one stakeholder, and larger PRs, or first-of-a-kind using a larger pool of reviewers. > > ## PR author and reviewers workflow > > When the PR author submits the PR for review, and once they selected the initial group, each member of that group can chose to either review or resign from reviewing. If they resign, their only duty is to ensure that there is a sufficient coverage of the review, and nominate their replacement if not. > > After a round of review, if there are blocking issues or a reviewer kept themselves in the review queue without accepting a review, the PR author must update the PR and re-request review from all pending reviewers. > > PRs should generally not be merged with pending reviewers. 
> > If the PR author decides to make any substantial changes that go beyond of what the reviewers already approved, they can re-request an already accepted review after updating the PR. This is a braindump based on my experience working with Mozilla Bugzilla, Github, ECMA402 and other projects. I expect the text itself to require a rewrite, but first, I'd like to collect feedback on the content. I tried to fit it into RACI (informed vs consulted) and plug `review required`, `input required` vs `sign-off required`, but Github kind of flattens this so it leaves the comments and the PR author to state what kind of input is requested and the only distinction we can draw is between CC (inform) and review request (put on some form of a blocking path). Let's discuss this!
1.0
Code review model - I remember we talked about this on several occasions, but I don't think we ever clearly defined the approach to reviewing we want to take. For a while I've been experimenting with one that I'd like to suggest for us:

> Every PR requires at least one review to be merged.
>
> Merging should be performed by the PR author. If the author wants to grant another team member rights to merge, they can state so in a PR comment.
>
> Review can be of either architectural or syntactical value, and often is of both. The PR author can specify, when requesting review, what kind of review they are asking for from each reviewer, or even which area they'd like the reviewer to focus on (useful when distributing reviews).
>
> ## Reviewer's role
>
> The reviewer is responsible for accepting only once they feel the current PR with their comments *not applied* is ready to be merged. In other words, `accepted` can be set with pending review comments, if those comments don't affect whether the patch is ready to be merged (for example, they're stylistic suggestions).
>
> The reviewer is also responsible for indicating the nature of their review comments - specifically, one of three types: "blocking", "suggestion", and "optional". `Blocking` is when the reviewer considers the change to be unmergeable and requires a new revision. `Suggestion` is for when the reviewer considers the change to be suboptimal but usable, and wants to defer the decision to the PR author while stating their opinion. `Optional` is for when the reviewer considers multiple options to be mostly comparable or tradeoffs, and wants to defer to the PR author for the final decision after bringing up a new option.
>
> In particular, the reviewer should try to minimize their impact on the PR, and use `blocking` conservatively. Their role is not to "improve the code" or "teach" the PR author, but rather to help the PR author avoid making mistakes and lowering the quality of the code. As a result, most review comments should be in the form of suggestions and optionals, and the reviewer should request another revision for review only if they feel the need to see it before they can evaluate whether the PR is mergeable.
>
> Lastly, the reviewer's role is to evaluate the stakeholders group and ensure that the review coverage is complete - if they review only a portion of the PR, or if they see the need for additional stakeholders to be involved, they should add additional reviewers, or CC them into the issue, depending on what kind of involvement they expect (inform vs. verify).
>
> ## Selection of reviewers
>
> When the PR author creates a new PR, they should consider three sources of reviewers and CCed stakeholders:
>
> * Owners and peers of the component they work with
> * People involved in the preceding conversation
> * Recognized experts in the domain the PR operates in
>
> The goal of the PR author is to find the subset of stakeholders that represents those three groups well. Depending on the scope and priority of the PR, the size of the reviewer group can be adjusted, with small PRs being sufficiently covered by review from just one stakeholder, and larger PRs, or first-of-a-kind ones, using a larger pool of reviewers.
>
> ## PR author and reviewers workflow
>
> When the PR author submits the PR for review, once they have selected the initial group, each member of that group can choose to either review or resign from reviewing. If they resign, their only duty is to ensure that there is sufficient coverage of the review, and to nominate their replacement if not.
>
> After a round of review, if there are blocking issues or a reviewer kept themselves in the review queue without accepting a review, the PR author must update the PR and re-request review from all pending reviewers.
>
> PRs should generally not be merged with pending reviewers.
>
> If the PR author decides to make any substantial changes that go beyond what the reviewers already approved, they can re-request an already accepted review after updating the PR.

This is a braindump based on my experience working with Mozilla Bugzilla, GitHub, ECMA402 and other projects. I expect the text itself to require a rewrite, but first I'd like to collect feedback on the content. I tried to fit it into RACI (informed vs. consulted) and plug `review required`, `input required` vs `sign-off required`, but GitHub kind of flattens this, so it leaves it to the comments and the PR author to state what kind of input is requested, and the only distinction we can draw is between CC (inform) and review request (put on some form of a blocking path). Let's discuss this!
process
code review model i remember we talked about this on several occasions but i don t think we ever clearly defined the approach to reviewing we want to take for a bit i ve been experimenting with one that i d like to suggest for us every pr requires at least one review to be merged merging should be performed by the pr author if the author wants to grant another team member rights to merge they can state so in the pr comment review can be of either architectural or syntactical value and often is of both the pr author can specify when requesting review what kind of review they are asking for from each reviewer or even which area they d like the reviewer to focus on useful when distributing reviews reviewers role the reviewer is responsible for accepting only once they feel the current pr with their comments not applied is ready to be merged in other words the accepted can be set with pending review comments if those comments don t affect whether the patch is ready to be merged for example they re stylistic suggestions the reviewer is also responsible for indicating the nature of their review comments specifically between three types blocking suggestion optional blocking is when the reviewer considers the change to be unmergable and requires a new revision suggestion is for when the reviewer considers the change to be suboptimal but usable and wants to defer the decision to the pr author while stating their opinion suggestion is for when the reviewer considers multiple options to be mostly comparable or tradeoffs and wants to defer to the pr author for the final decision after bringing up a new option in particular reviewer should try to minimize their impact on the pr and use the blocking conservatively their role is not to improve the code or teach the pr author but rather help the pr author avoid making mistakes and lowering the quality of the code in result most review comments should be in form of suggestions and optionals and the reviewer should request another 
revision for review only if they feel the need to see it before they can evaluate if the pr is mergable lastly the reviewers role is to evaluate the stakeholders group and ensure that the review coverage is complete if they review only portion of the pr or if they see the need for additional stakeholders to be involved they should add additional reviewers or cc them into the issue depending on what kind of involvement they expect inform vs verify selection of reviewers when the pr author creates a new pr they should consider three sources of reviewers and cced stakeholders owners and peers of the component they work with people involved in the preceeding conversation recognized experts in the domain the pr operates in the goal of the pr author is to find the subset of stakeholders that represent those three groups well depending on the scope and priority of the pr the reviewers group size can be adjusted with small prs being sufficent for review by just one stakeholder and larger prs or first of a kind using a larger pool of reviewers pr author and reviewers workflow when the pr author submits the pr for review and once they selected the initial group each member of that group can chose to either review or resign from reviewing if they resign their only duty is to ensure that there is a sufficient coverage of the review and nominate their replacement if not after a round of review if there are blocking issues or a reviewer kept themselves in the review queue without accepting a review the pr author must update the pr and re request review from all pending reviewers prs should generally not be merged with pending reviewers if the pr author decides to make any substantial changes that go beyond of what the reviewers already approved they can re request an already accepted review after updating the pr this is a braindump based on my experience working with mozilla bugzilla github and other projects i expect the text itself to require a rewrite but first i d like to 
collect feedback on the content i tried to fit it into raci informed vs consulted and plug review required input required vs sign off required but github kind of flattens this so it leaves the comments and the pr author to state what kind of input is requested and the only distinction we can draw is between cc inform and review request put on some form of a blocking path let s discuss this
1
16,734
21,899,351,537
IssuesEvent
2022-05-20 11:55:12
camunda/zeebe
https://api.github.com/repos/camunda/zeebe
closed
NPE: Cannot invoke `ExecutableActivity.getFlowScope()` because `innerActivity` is null
kind/bug scope/broker team/process-automation
**Describe the bug**

While deploying a BPMN resource, transforming a multi-instance activity fails with an NPE.

Error group: https://console.cloud.google.com/errors/detail/CLKvqKL_ldriaQ;service=zeebe;filter=%5B%5D;time=P7D?project=camunda-cloud-240911

Logs: [Log Explorer](https://console.cloud.google.com/logs/query;query=logName:%22stdout%22%0Aresource.type%3D%22k8s_container%22%0Aresource.labels.pod_name%3D%22zeebe-1%22%0Aresource.labels.project_id%3D%22camunda-cloud-240911%22%0Aresource.labels.location%3D%22europe-west1%22%0Aresource.labels.container_name%3D%22zeebe%22%0Aresource.labels.namespace_name%3D%229dc9676c-3817-4cec-a790-288ac65c1ae6-zeebe%22%0Aresource.labels.cluster_name%3D%22production-worker-1%22;timeRange=2022-05-17T14:43:21.800Z%2F2022-05-17T15:43:21.800Z;cursorTimestamp=2022-05-17T15:43:21.411794728Z?project=camunda-cloud-240911)

Archived Logs: https://drive.google.com/file/d/1VarHjCc-XQZ6bZz7PUE4MmdDy0pQ5Xt9/view?usp=sharing

**To Reproduce**

**Expected behavior**

<!-- A clear and concise description of what you expected to happen. -->

**Log/Stacktrace**

<!-- If possible add the full stacktrace or Zeebe log which contains the issue.
--> <details><summary>Full Stacktrace</summary> <p> ``` java.lang.NullPointerException: Cannot invoke "io.camunda.zeebe.engine.processing.deployment.model.element.ExecutableActivity.getFlowScope()" because "innerActivity" is null at io.camunda.zeebe.engine.processing.deployment.model.transformer.MultiInstanceActivityTransformer.transform ( [io/camunda.zeebe.engine.processing.deployment.model.transformer/MultiInstanceActivityTransformer.java:53](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment.model.transformer%2FMultiInstanceActivityTransformer.java&line=53&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.deployment.model.transformer.MultiInstanceActivityTransformer.transform ( [io/camunda.zeebe.engine.processing.deployment.model.transformer/MultiInstanceActivityTransformer.java:29](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment.model.transformer%2FMultiInstanceActivityTransformer.java&line=29&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.deployment.model.transformation.TransformationVisitor.visit ( [io/camunda.zeebe.engine.processing.deployment.model.transformation/TransformationVisitor.java:42](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment.model.transformation%2FTransformationVisitor.java&line=42&project=camunda-cloud-240911) ) at io.camunda.zeebe.model.bpmn.traversal.TypeHierarchyVisitor.visit ( [io/camunda.zeebe.model.bpmn.traversal/TypeHierarchyVisitor.java:39](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.model.bpmn.traversal%2FTypeHierarchyVisitor.java&line=39&project=camunda-cloud-240911) ) at io.camunda.zeebe.model.bpmn.traversal.ModelWalker.walk ( 
[io/camunda.zeebe.model.bpmn.traversal/ModelWalker.java:61](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.model.bpmn.traversal%2FModelWalker.java&line=61&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.deployment.model.transformation.BpmnTransformer.transformDefinitions ( [io/camunda.zeebe.engine.processing.deployment.model.transformation/BpmnTransformer.java:117](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment.model.transformation%2FBpmnTransformer.java&line=117&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.deployment.transform.BpmnResourceTransformer.transformResource ( [io/camunda.zeebe.engine.processing.deployment.transform/BpmnResourceTransformer.java:69](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment.transform%2FBpmnResourceTransformer.java&line=69&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.deployment.transform.DeploymentTransformer.transformResource ( [io/camunda.zeebe.engine.processing.deployment.transform/DeploymentTransformer.java:120](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment.transform%2FDeploymentTransformer.java&line=120&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.deployment.transform.DeploymentTransformer.transform ( [io/camunda.zeebe.engine.processing.deployment.transform/DeploymentTransformer.java:97](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment.transform%2FDeploymentTransformer.java&line=97&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.deployment.DeploymentCreateProcessor.processRecord ( 
[io/camunda.zeebe.engine.processing.deployment/DeploymentCreateProcessor.java:100](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.deployment%2FDeploymentCreateProcessor.java&line=100&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.streamprocessor.TypedRecordProcessor.processRecord ( [io/camunda.zeebe.engine.processing.streamprocessor/TypedRecordProcessor.java:54](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.streamprocessor%2FTypedRecordProcessor.java&line=54&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.lambda$processInTransaction$3 ( [io/camunda.zeebe.engine.processing.streamprocessor/ProcessingStateMachine.java:300](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.streamprocessor%2FProcessingStateMachine.java&line=300&project=camunda-cloud-240911) ) at io.camunda.zeebe.db.impl.rocksdb.transaction.ZeebeTransaction.run ( [io/camunda.zeebe.db.impl.rocksdb.transaction/ZeebeTransaction.java:84](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.db.impl.rocksdb.transaction%2FZeebeTransaction.java&line=84&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.processInTransaction ( [io/camunda.zeebe.engine.processing.streamprocessor/ProcessingStateMachine.java:290](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.streamprocessor%2FProcessingStateMachine.java&line=290&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.processCommand ( 
[io/camunda.zeebe.engine.processing.streamprocessor/ProcessingStateMachine.java:253](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.streamprocessor%2FProcessingStateMachine.java&line=253&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.tryToReadNextRecord ( [io/camunda.zeebe.engine.processing.streamprocessor/ProcessingStateMachine.java:213](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.streamprocessor%2FProcessingStateMachine.java&line=213&project=camunda-cloud-240911) ) at io.camunda.zeebe.engine.processing.streamprocessor.ProcessingStateMachine.readNextRecord ( [io/camunda.zeebe.engine.processing.streamprocessor/ProcessingStateMachine.java:189](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.engine.processing.streamprocessor%2FProcessingStateMachine.java&line=189&project=camunda-cloud-240911) ) at io.camunda.zeebe.util.sched.ActorJob.invoke ( [io/camunda.zeebe.util.sched/ActorJob.java:79](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.util.sched%2FActorJob.java&line=79&project=camunda-cloud-240911) ) at io.camunda.zeebe.util.sched.ActorJob.execute ( [io/camunda.zeebe.util.sched/ActorJob.java:44](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.util.sched%2FActorJob.java&line=44&project=camunda-cloud-240911) ) at io.camunda.zeebe.util.sched.ActorTask.execute ( [io/camunda.zeebe.util.sched/ActorTask.java:122](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.util.sched%2FActorTask.java&line=122&project=camunda-cloud-240911) ) at io.camunda.zeebe.util.sched.ActorThread.executeCurrentTask ( 
[io/camunda.zeebe.util.sched/ActorThread.java:97](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.util.sched%2FActorThread.java&line=97&project=camunda-cloud-240911) ) at io.camunda.zeebe.util.sched.ActorThread.doWork ( [io/camunda.zeebe.util.sched/ActorThread.java:80](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.util.sched%2FActorThread.java&line=80&project=camunda-cloud-240911) ) at io.camunda.zeebe.util.sched.ActorThread.run ( [io/camunda.zeebe.util.sched/ActorThread.java:189](https://console.cloud.google.com/debug?referrer=fromlog&file=io%2Fcamunda.zeebe.util.sched%2FActorThread.java&line=189&project=camunda-cloud-240911) ) ``` </p> </details> **Environment:** - Zeebe Version: 8.0.1
1.0
process
npe cannot invoke executableactivity getflowscope because inneractivity is null describe the bug while deploying a bpmn resource transforming a multi instance activity fails with a npe error group logs archived logs to reproduce expected behavior log stacktrace full stacktrace java lang nullpointerexception cannot invoke io camunda zeebe engine processing deployment model element executableactivity getflowscope because inneractivity is null at io camunda zeebe engine processing deployment model transformer multiinstanceactivitytransformer transform at io camunda zeebe engine processing deployment model transformer multiinstanceactivitytransformer transform at io camunda zeebe engine processing deployment model transformation transformationvisitor visit at io camunda zeebe model bpmn traversal typehierarchyvisitor visit at io camunda zeebe model bpmn traversal modelwalker walk at io camunda zeebe engine processing deployment model transformation bpmntransformer transformdefinitions at io camunda zeebe engine processing deployment transform bpmnresourcetransformer transformresource at io camunda zeebe engine processing deployment transform deploymenttransformer transformresource at io camunda zeebe engine processing deployment transform deploymenttransformer transform at io camunda zeebe engine processing deployment deploymentcreateprocessor processrecord at io camunda zeebe engine processing streamprocessor typedrecordprocessor processrecord at io camunda zeebe engine processing streamprocessor processingstatemachine lambda processintransaction at io camunda zeebe db impl rocksdb transaction zeebetransaction run at io camunda zeebe engine processing streamprocessor processingstatemachine processintransaction at io camunda zeebe engine processing streamprocessor processingstatemachine processcommand at io camunda zeebe engine processing streamprocessor processingstatemachine trytoreadnextrecord at io camunda zeebe engine processing streamprocessor 
processingstatemachine readnextrecord at io camunda zeebe util sched actorjob invoke at io camunda zeebe util sched actorjob execute at io camunda zeebe util sched actortask execute at io camunda zeebe util sched actorthread executecurrenttask at io camunda zeebe util sched actorthread dowork at io camunda zeebe util sched actorthread run environment zeebe version
1
20,775
6,927,635,156
IssuesEvent
2017-11-30 23:49:15
angular/angular
https://api.github.com/repos/angular/angular
closed
published code & npm bundles include abstract methods
comp: build & ci freq3: high severity2: inconvenient
**I'm submitting a bug report**

**Current behavior**

Published code & UMD bundles from npm, such as `@angular/platform-browser/bundles/platform-browser.umd.js` from `@angular/platform-browser`, include extra licenses and empty abstract methods.

**Expected behavior**

Ideally our .umd bundles would not include these abstract no-op methods or the duplicate license information. This would likely result in a bundle size reduction and a performance increase for all Angular apps using these bundles.

**Minimal reproduction of the problem with instructions**

```
npm install @angular/platform-browser
grep ") { }" node_modules/@angular/platform-browser/bundles/platform-browser.umd.js
```

**Angular version:** 2.4.1 & master
1.0
non_process
published code npm bundles include abstract methods i m submitting a bug report current behavior published code umd bundles from npm such as angular platform browser bundles platform browser umd js from angular platform browser include extra licenses and empty abstract methods expected behavior ideally our umd bundles would not include these abstract no op methods or the duplicate license information this would probably result in bundle size reduction and performance increase for all angular apps using these bundles minimal reproduction of the problem with instructions npm install angular platform browser grep node modules angular platform browser bundles platform browser umd js angular version master
0
21,452
29,489,355,567
IssuesEvent
2023-06-02 12:20:59
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
[metabase-lib] Remove `Question.card()` and the `getCard` selector
.metabase-lib .Team/QueryProcessor :hammer_and_wrench:
This method returns a `Card`, a vanilla map containing all the details of a question, including its MBQL or native query. We don't want components to be playing with this raw data; rather, they should be using the interface of `Question`.

The only exception where a renamed `card()` method needs to persist is the Redux world. Since Redux state is supposed to be serializable, the fundamental reducer for the current question should really store the inner `Card`, and the `getQuestion` selector should wrap it back up in a `Question` again. The `getCard` selector should be deleted.

## Steps

Not all of these may form their own PRs, depending on how complex they prove to be.

1. Replace any `._card` usage, anywhere, with `.card()`.
1. Make `getQuestion` fundamental; make `getCard` call `.card()` on it.
   - Update all the reducers that maintain the card to maintain the question instead.
1. Change all the users of `getCard` to be based on `getQuestion`.
   - That'll involve porting a lot of React components that thread through `card` for `question`.
1. Make `Urls.*` accept `Question` in place of `Card` (there's a load of these).
1. Remove any other uses of `card()` I can find.
1. Delete `getCard` and `card()`.
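The core pattern here - keep only the serializable `Card` map in the store, and have the fundamental selector wrap it into a `Question` at the boundary - can be sketched generically. The real code is TypeScript/Redux; the following is a minimal, hypothetical Python analogy in which the names `Question`, `get_question`, and `get_card` mirror the issue but the bodies are invented for illustration:

```python
class Question:
    """Rich wrapper around the serializable card map."""

    def __init__(self, card):
        self._card = card

    def card(self):
        # The only sanctioned way to get at the raw map; returns a copy
        # so callers cannot mutate the stored state.
        return dict(self._card)

    def display_name(self):
        return self._card.get("name", "Untitled")


# The store keeps only plain, serializable data.
state = {"qb": {"card": {"name": "Orders", "query": "SELECT 1"}}}


def get_question(state):
    """Fundamental selector: wraps the stored card in a Question."""
    return Question(state["qb"]["card"])


def get_card(state):
    """Derived legacy selector, slated for deletion in the final step."""
    return get_question(state).card()
```

The point of the sketch is the direction of derivation: components depend on `get_question` and the `Question` interface, while `get_card` survives only as a thin shim until its callers are ported.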
1.0
process
remove question card and the getcard selector this method returns a card a vanilla map containing all the details of a question including its mbql or native query we don t want components to be playing with this raw data rather they should be using the interface of question the only exception where a renamed card method needs to persist is the redux world since redux state is supposed to be serializable the fundamental reducer for the current question should really store the inner card and the getquestion selector should wrap it back up in a question again the getcard selector should be deleted steps not all of these may form their own prs depending on how complex they prove to be replace any card usage anywhere with card make getquestion fundamental make getcard call card on it update all the reducers that maintain the card to maintain the question instead change all the users of getcard to be based on getquestion that ll involve porting a lot of react components that thread through card for question make urls accept question in place of card there s a load of these remove any other uses of card i can find delete getcard and card
1
8,622
11,776,784,434
IssuesEvent
2020-03-16 13:49:54
elastic/beats
https://api.github.com/repos/elastic/beats
opened
Add de-dot processor that converts dotted field names to nested objects
:Processors Filebeat enhancement
**Background** This is a requirement that came up in the https://github.com/elastic/ecs-logging initiative. In summary, we're trying to make logging simpler by logging ECS-compliant JSON to a file that Filebeat can just forward to Elasticsearch. However, due to readability, performance, and other technical reasons, it's not always possible for the loggers to produce a correctly nested JSON structure. Some fields, like `log.logger` may be represented via a field name containing a dot (`"log.logger": "INFO"`) while others may be nested (`"foo": { "bar": "baz"}`). More context here: https://github.com/elastic/ecs-logging-java/issues/51 **The problem** When processing log data, for example with an Elasticsearch Ingest pipeline, we need all fields to be nested. Otherwise, the user doesn't know whether to access a field via `doc["foo.bar"]` or via `doc["foo"]["bar"]`. We don't want users to have knowledge about which fields are nested vs dotted as this is an implementation detail that can vary with different `ecs-logging` implementations and may even change for the same implementation. Also, ECS defines that fields should be always nested. **Describe the enhancement:** We'd like to have a Filebeat processor that expands all dotted field names to nested objects. This would decouple the representation in the log file from how the documents are supposed to look once they hit the ingest node processing pipeline. **Concerns** - Performance: This might be a performance hit but I suspect other processors, like grok, to be much more processing intensive. - If the JSON is already fully nested, we could short-circuit the processing - We could require that once we're in a nested context, dots are no longer replaced with dotting. 
- Allowed: `"foo.bar": {"baz": "qux"}` - Disallowed: `"foo": {"bar.baz": "qux"}` - Given this restriction, the de-dotting can be done very efficiently by sorting the keys alphabetically and processing the JSON similar to how a SAX-parser works (@urso's idea). **Open Questions** What to do when conflicts occur. - Incompatible mappings: `"foo.bar": "baz"`, `"foo": "bar"` (foo is both an object and a string). - Duplicate keys: `"foo.bar": "baz"`, `"foo.bar": "qux"`
1.0
Add de-dot processor that converts dotted field names to nested objects - **Background** This is a requirement that came up in the https://github.com/elastic/ecs-logging initiative. In summary, we're trying to make logging simpler by logging ECS-compliant JSON to a file that Filebeat can just forward to Elasticsearch. However, due to readability, performance, and other technical reasons, it's not always possible for the loggers to produce a correctly nested JSON structure. Some fields, like `log.logger` may be represented via a field name containing a dot (`"log.logger": "INFO"`) while others may be nested (`"foo": { "bar": "baz"}`). More context here: https://github.com/elastic/ecs-logging-java/issues/51 **The problem** When processing log data, for example with an Elasticsearch Ingest pipeline, we need all fields to be nested. Otherwise, the user doesn't know whether to access a field via `doc["foo.bar"]` or via `doc["foo"]["bar"]`. We don't want users to have knowledge about which fields are nested vs dotted as this is an implementation detail that can vary with different `ecs-logging` implementations and may even change for the same implementation. Also, ECS defines that fields should be always nested. **Describe the enhancement:** We'd like to have a Filebeat processor that expands all dotted field names to nested objects. This would decouple the representation in the log file from how the documents are supposed to look once they hit the ingest node processing pipeline. **Concerns** - Performance: This might be a performance hit but I suspect other processors, like grok, to be much more processing intensive. - If the JSON is already fully nested, we could short-circuit the processing - We could require that once we're in a nested context, dots are no longer replaced with dotting. 
- Allowed: `"foo.bar": {"baz": "qux"}` - Disallowed: `"foo": {"bar.baz": "qux"}` - Given this restriction, the de-dotting can be done very efficiently by sorting the keys alphabetically and processing the JSON similar to how a SAX-parser works (@urso's idea). **Open Questions** What to do when conflicts occur. - Incompatible mappings: `"foo.bar": "baz"`, `"foo": "bar"` (foo is both an object and a string). - Duplicate keys: `"foo.bar": "baz"`, `"foo.bar": "qux"`
process
add de dot processor that converts dotted field names to nested objects background this is a requirement that came up in the initiative in summary we re trying to make logging simpler by logging ecs compliant json to a file that filebeat can just forward to elasticsearch however due to readability performance and other technical reasons it s not always possible for the loggers to produce a correctly nested json structure some fields like log logger may be represented via a field name containing a dot log logger info while others may be nested foo bar baz more context here the problem when processing log data for example with an elasticsearch ingest pipeline we need all fields to be nested otherwise the user doesn t know whether to access a field via doc or via doc we don t want users to have knowledge about which fields are nested vs dotted as this is an implementation detail that can vary with different ecs logging implementations and may even change for the same implementation also ecs defines that fields should be always nested describe the enhancement we d like to have a filebeat processor that expands all dotted field names to nested objects this would decouple the representation in the log file from how the documents are supposed to look once they hit the ingest node processing pipeline concerns performance this might be a performance hit but i suspect other processors like grok to be much more processing intensive if the json is already fully nested we could short circuit the processing we could require that once we re in a nested context dots are no longer replaced with dotting allowed foo bar baz qux disallowed foo bar baz qux given this restriction the de dotting can be done very efficiently by sorting the keys alphabetically and processing the json similar to how a sax parser works urso s idea open questions what to do when conflicts occur incompatible mappings foo bar baz foo bar foo is both an object and a string duplicate keys foo bar baz foo bar qux
1
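The expansion requested in the record above is straightforward to prototype. The sketch below is a hypothetical Python illustration (the real Beats processor would be written in Go); it expands dotted keys into nested objects, merges overlapping objects, and raises on the incompatible-mapping conflict listed under the open questions. The duplicate-key case cannot arise here, since a parsed JSON object in Python keeps only one value per key.

```python
def dedot(event: dict) -> dict:
    """Expand dotted field names into nested objects.

    {"log.logger": "INFO"} -> {"log": {"logger": "INFO"}}
    """
    out: dict = {}
    for key, value in event.items():
        node = out
        parts = key.split(".")
        for part in parts[:-1]:
            child = node.setdefault(part, {})
            if not isinstance(child, dict):
                # e.g. {"foo": "bar", "foo.bar": "baz"}
                raise ValueError(f"incompatible mapping at {part!r}")
            node = child
        leaf = parts[-1]
        new = dedot(value) if isinstance(value, dict) else value
        old = node.get(leaf)
        if isinstance(old, dict) and isinstance(new, dict):
            old.update(new)  # merge overlapping objects
        elif leaf in node and isinstance(old, dict) != isinstance(new, dict):
            # object on one side, scalar on the other
            raise ValueError(f"incompatible mapping at {key!r}")
        else:
            node[leaf] = new
    return out
```

Sorting keys first, as suggested in the record, would let a streaming implementation avoid revisiting parent objects; this dict-based sketch simply walks each path.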
11,267
14,059,193,013
IssuesEvent
2020-11-03 02:21:24
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
A series of date time related bugs
priority/high severity/Major sig/coprocessor type/bug
## Bug Report Known bugs that need to be fixed one day in the future: 1. Does not support `date` where 00:00:00 is an invalid time. 2. Always reports errors for invalid time while MySQL does not. 3. Incorrect date time processing for some functions when year < 1970 or date time is invalid. 4. Zero month or zero day is not supported.
1.0
A series of date time related bugs - ## Bug Report Known bugs that need to be fixed one day in the future: 1. Does not support `date` where 00:00:00 is an invalid time. 2. Always reports errors for invalid time while MySQL does not. 3. Incorrect date time processing for some functions when year < 1970 or date time is invalid. 4. Zero month or zero day is not supported.
process
a series of date time related bugs bug report known bugs that need to be fixed one day in future does not support date that is an invalid time always reports errors for invalid time while mysql does not incorrect date time processing for some functions when year or date time is invalid zero month or zero day is not supported
1
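Bugs 1 and 4 in the record above come down to the gap between MySQL's date rules and what standard date libraries accept: depending on `sql_mode`, MySQL will store zero months and days (e.g. `2020-00-00`), while strict parsers reject them outright, so a MySQL-compatible engine has to special-case those values rather than delegate to the platform's date parser. A small Python illustration of the strict behaviour (the helper name is hypothetical):

```python
from datetime import datetime

def parses_as_strict_date(s: str) -> bool:
    """True if s is a valid date under strict (non-MySQL) parsing rules."""
    try:
        datetime.strptime(s, "%Y-%m-%d")
        return True
    except ValueError:
        return False

# MySQL can accept all three of these (sql_mode permitting);
# a strict parser only accepts the first.
print(parses_as_strict_date("2020-01-02"))  # True
print(parses_as_strict_date("2020-00-00"))  # False: zero month and day
print(parses_as_strict_date("2020-01-00"))  # False: zero day
```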
332,493
29,479,668,184
IssuesEvent
2023-06-02 03:42:36
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
opened
DISABLED test_cpu_transposed_broadcast3 (__main__.SweepInputsCpuTest)
triaged module: flaky-tests skipped module: inductor
Platforms: rocm This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_cpu_transposed_broadcast3&suite=SweepInputsCpuTest) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/undefined). Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 failures and 2 successes. **Debugging instructions (after clicking on the recent samples link):** DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs. To find relevant log snippets: 1. Click on the workflow logs linked above 2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work. 3. Grep for `test_cpu_transposed_broadcast3` 4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs. Test file path: `inductor/test_torchinductor.py` or `inductor/test_torchinductor.py`
1.0
DISABLED test_cpu_transposed_broadcast3 (__main__.SweepInputsCpuTest) - Platforms: rocm This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_cpu_transposed_broadcast3&suite=SweepInputsCpuTest) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/undefined). Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 failures and 2 successes. **Debugging instructions (after clicking on the recent samples link):** DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs. To find relevant log snippets: 1. Click on the workflow logs linked above 2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work. 3. Grep for `test_cpu_transposed_broadcast3` 4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs. Test file path: `inductor/test_torchinductor.py` or `inductor/test_torchinductor.py`
non_process
disabled test cpu transposed main sweepinputscputest platforms rocm this test was disabled because it is failing in ci see and the most recent trunk over the past hours it has been determined flaky in workflow s with failures and successes debugging instructions after clicking on the recent samples link do not assume things are okay if the ci is green we now shield flaky tests from developers so ci will thus be green but it will be harder to parse the logs to find relevant log snippets click on the workflow logs linked above click on the test step of the job so that it is expanded otherwise the grepping will not work grep for test cpu transposed there should be several instances run as flaky tests are rerun in ci from which you can study the logs test file path inductor test torchinductor py or inductor test torchinductor py
0
14,719
17,928,403,166
IssuesEvent
2021-09-10 05:10:16
googleapis/release-please
https://api.github.com/repos/googleapis/release-please
closed
Dependency Dashboard
type: process
This issue provides visibility into Renovate updates and their statuses. [Learn more](https://docs.renovatebot.com/key-concepts/dashboard/) ## Awaiting Schedule These updates are awaiting their schedule. Click on a checkbox to get an update now. - [ ] <!-- unschedule-branch=renovate/actions-setup-node-2.x -->chore(deps): update actions/setup-node action to v2 ## Ignored or Blocked These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below. - [ ] <!-- recreate-branch=renovate/gts-3.x -->[chore(deps): update dependency gts to v3](../pull/580) - [ ] <!-- recreate-branch=renovate/mocha-9.x -->[chore(deps): update dependency mocha to v9](../pull/927) (`mocha`, `@types/mocha`) - [ ] <!-- recreate-branch=renovate/detect-indent-7.x -->[fix(deps): update dependency detect-indent to v7](../pull/1023) - [ ] <!-- recreate-branch=renovate/figures-4.x -->[fix(deps): update dependency figures to v4](../pull/950) - [ ] <!-- recreate-branch=renovate/type-fest-2.x -->[fix(deps): update dependency type-fest to v2](../pull/1006) - [ ] <!-- recreate-branch=renovate/unist-util-visit-4.x -->[fix(deps): update dependency unist-util-visit to v4](../pull/990) - [ ] <!-- recreate-branch=renovate/unist-util-visit-parents-5.x -->[fix(deps): update dependency unist-util-visit-parents to v5](../pull/991) - [ ] <!-- recreate-branch=renovate/yargs-17.x -->[fix(deps): update dependency yargs to v17](../pull/895) (`yargs`, `@types/yargs`) --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
1.0
Dependency Dashboard - This issue provides visibility into Renovate updates and their statuses. [Learn more](https://docs.renovatebot.com/key-concepts/dashboard/) ## Awaiting Schedule These updates are awaiting their schedule. Click on a checkbox to get an update now. - [ ] <!-- unschedule-branch=renovate/actions-setup-node-2.x -->chore(deps): update actions/setup-node action to v2 ## Ignored or Blocked These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below. - [ ] <!-- recreate-branch=renovate/gts-3.x -->[chore(deps): update dependency gts to v3](../pull/580) - [ ] <!-- recreate-branch=renovate/mocha-9.x -->[chore(deps): update dependency mocha to v9](../pull/927) (`mocha`, `@types/mocha`) - [ ] <!-- recreate-branch=renovate/detect-indent-7.x -->[fix(deps): update dependency detect-indent to v7](../pull/1023) - [ ] <!-- recreate-branch=renovate/figures-4.x -->[fix(deps): update dependency figures to v4](../pull/950) - [ ] <!-- recreate-branch=renovate/type-fest-2.x -->[fix(deps): update dependency type-fest to v2](../pull/1006) - [ ] <!-- recreate-branch=renovate/unist-util-visit-4.x -->[fix(deps): update dependency unist-util-visit to v4](../pull/990) - [ ] <!-- recreate-branch=renovate/unist-util-visit-parents-5.x -->[fix(deps): update dependency unist-util-visit-parents to v5](../pull/991) - [ ] <!-- recreate-branch=renovate/yargs-17.x -->[fix(deps): update dependency yargs to v17](../pull/895) (`yargs`, `@types/yargs`) --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
process
dependency dashboard this issue provides visibility into renovate updates and their statuses awaiting schedule these updates are awaiting their schedule click on a checkbox to get an update now chore deps update actions setup node action to ignored or blocked these are blocked by an existing closed pr and will not be recreated unless you click a checkbox below pull pull mocha types mocha pull pull pull pull pull pull yargs types yargs check this box to trigger a request for renovate to run again on this repository
1
21,742
4,744,262,596
IssuesEvent
2016-10-21 00:04:44
facebook/nuclide
https://api.github.com/repos/facebook/nuclide
closed
Document setting changes for HHVM debugger
documentation
17d2c86946dc10189a250b30513e30c88167a0ff added a setting to choose where nuclide looks for the HHVM binary on your remote server, so we should have a brief mention in the docs.
1.0
Document setting changes for HHVM debugger - 17d2c86946dc10189a250b30513e30c88167a0ff added a setting to choose where nuclide looks for the HHVM binary on your remote server, so we should have a brief mention in the docs.
non_process
document setting changes for hhvm debugger added a setting to choose where nuclide looks for the hhvm binary on your remote server so we should have a brief mention in the docs
0
18,979
24,966,129,258
IssuesEvent
2022-11-01 19:32:58
darktable-org/darktable
https://api.github.com/repos/darktable-org/darktable
closed
Exported jpg is partially burned out
duplicate understood: unclear reproduce: peculiar scope: image processing
**Describe the bug/issue** This is what I see in Darktable ![Screenshot_2022-10-30_06-38-35](https://user-images.githubusercontent.com/32796771/198864287-49b38e56-1272-4782-9096-2e75ef874b44.png) This is what the exported jpg file looks like ![Screenshot_2022-10-30_06-42-50](https://user-images.githubusercontent.com/32796771/198864361-6b9133f0-eb63-4542-aa2a-bc3d07219b26.png) **To Reproduce** First I used 'Color balance rgb' ![Screenshot_2022-10-30_06-46-30](https://user-images.githubusercontent.com/32796771/198864489-8934561a-8c3a-4815-9bbf-34f2a1119208.png) Then I used the 'Contrast equalizer' ![Screenshot_2022-10-30_06-47-09](https://user-images.githubusercontent.com/32796771/198864505-2ae8fd3f-9c62-4528-91b4-33c400b0cecd.png) I didn't use masks. **Platform** * darktable version :4.0.1 * Linux - Distro : Ubuntu 20.04.5 LTS * Graphics card : none * Desktop : xfce
1.0
Exported jpg is partially burned out - **Describe the bug/issue** This is what I see in Darktable ![Screenshot_2022-10-30_06-38-35](https://user-images.githubusercontent.com/32796771/198864287-49b38e56-1272-4782-9096-2e75ef874b44.png) This is what the exported jpg file looks like ![Screenshot_2022-10-30_06-42-50](https://user-images.githubusercontent.com/32796771/198864361-6b9133f0-eb63-4542-aa2a-bc3d07219b26.png) **To Reproduce** First I used 'Color balance rgb' ![Screenshot_2022-10-30_06-46-30](https://user-images.githubusercontent.com/32796771/198864489-8934561a-8c3a-4815-9bbf-34f2a1119208.png) Then I used the 'Contrast equalizer' ![Screenshot_2022-10-30_06-47-09](https://user-images.githubusercontent.com/32796771/198864505-2ae8fd3f-9c62-4528-91b4-33c400b0cecd.png) I didn't use masks. **Platform** * darktable version :4.0.1 * Linux - Distro : Ubuntu 20.04.5 LTS * Graphics card : none * Desktop : xfce
process
exported jpg is partially burned out describe the bug issue this is what i see in darktable this is what the exported jpg file looks like to reproduce first i used color balance rgb then i used the contrast equalizer i didn t use masks platform darktable version linux distro ubuntu lts graphics card none desktop xfce
1
13,500
16,041,204,802
IssuesEvent
2021-04-22 08:07:32
rjsears/chia_plot_manager
https://api.github.com/repos/rjsears/chia_plot_manager
closed
Add network traffic monitoring to "Plot in Progress" checks
In Process TODO
Currently, we write a `status file` both on the sending and receiving end to determine whether we are moving a plot since we only want to move one at a time (at least for now). I want to add in physical monitoring of the network link between the NAS and the Plotter (with a selectable minimum bw rate) as an additional check to make sure we are really sending a plot. Additionally, if we see the checkfile in place but we are below the network bw threshold, we can force a `reset` and try to send the plot again.
1.0
Add network traffic monitoring to "Plot in Progress" checks - Currently, we write a `status file` both on the sending and receiving end to determine whether we are moving a plot since we only want to move one at a time (at least for now). I want to add in physical monitoring of the network link between the NAS and the Plotter (with a selectable minimum bw rate) as an additional check to make sure we are really sending a plot. Additionally, if we see the checkfile in place but we are below the network bw threshold, we can force a `reset` and try to send the plot again.
process
add network traffic monitoring to plot in progress checks currently we write a status file both on the sending and receiving end to determine whether we are moving a plot since we only want to move one at a time at least for now i want to add in physical monitoring of the network link between the nas and the plotter with a selectable minimum bw rate as an additional check to make sure we are really sending a plot additionally if we see the checkfile in place but we are below the network bw threshold we can force a reset and try to send the plot again
1
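The bandwidth check proposed in the record above separates cleanly into sampling an interface byte counter (e.g. from `/proc/net/dev` on Linux) and a threshold decision. The decision half is sketched below; the function name and the 1 Mbit/s default are hypothetical illustrations, not part of chia_plot_manager:

```python
def transfer_active(bytes_start: int, bytes_end: int,
                    interval_secs: float, min_mbps: float = 1.0) -> bool:
    """Decide whether a plot transfer is really in progress.

    Compares the receive rate observed between two samples of an
    interface's byte counter against a minimum-bandwidth threshold.
    """
    if interval_secs <= 0:
        raise ValueError("interval must be positive")
    mbps = (bytes_end - bytes_start) * 8 / 1_000_000 / interval_secs
    return mbps >= min_mbps

# If the status file says a plot is moving but transfer_active() keeps
# returning False, the manager would force a reset and resend the plot.
```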
665,971
22,337,466,935
IssuesEvent
2022-06-14 20:01:35
Gilded-Games/The-Aether
https://api.github.com/repos/Gilded-Games/The-Aether
closed
Feature: Art Refresh
priority/high status/in-progress type/feature feat/art
- [x] Items - [x] Blocks - [x] Entities - [x] GUI Elements - [x] Armor/Accessories - [ ] Effects - [x] Vignettes? - [x] Programmer Art versions of any new content that only has the new art (- bconlon). - [x] Final Polishing
1.0
Feature: Art Refresh - - [x] Items - [x] Blocks - [x] Entities - [x] GUI Elements - [x] Armor/Accessories - [ ] Effects - [x] Vignettes? - [x] Programmer Art versions of any new content that only has the new art (- bconlon). - [x] Final Polishing
non_process
feature art refresh items blocks entities gui elements armor accessories effects vignettes programmer art versions of any new content that only has the new art bconlon final polishing
0
5,672
8,556,469,596
IssuesEvent
2018-11-08 13:16:53
easy-software-ufal/annotations_repos
https://api.github.com/repos/easy-software-ufal/annotations_repos
opened
Microsoft/aspnet-api-versioning Wrong property navigation action is selected in ODATA controllers
C# no operator wrong processing
Issue: `https://github.com/Microsoft/aspnet-api-versioning/issues/336` PR: `https://github.com/Microsoft/aspnet-api-versioning/commit/8c7dfa0f63e7e5856e0ccfe2b16f4397c4f566f3`
1.0
Microsoft/aspnet-api-versioning Wrong property navigation action is selected in ODATA controllers - Issue: `https://github.com/Microsoft/aspnet-api-versioning/issues/336` PR: `https://github.com/Microsoft/aspnet-api-versioning/commit/8c7dfa0f63e7e5856e0ccfe2b16f4397c4f566f3`
process
microsoft aspnet api versioning wrong property navigation action is selected in odata controllers issue pr
1
182,155
6,667,546,016
IssuesEvent
2017-10-03 13:00:31
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
www.alexa.com - design is broken
browser-chrome priority-normal status-duplicate
<!-- @browser: Chrome 60.0.3112 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36 --> <!-- @reported_with: web --> **URL**: https://www.alexa.com/topsites/category/Top/Recreation/Audio **Browser / Version**: Chrome 60.0.3112 **Operating System**: Windows 10 **Tested Another Browser**: Unknown **Problem type**: Design is broken **Description**: Bottom Drawer Marketing Resource is not working fine **Steps to Reproduce**: _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
www.alexa.com - design is broken - <!-- @browser: Chrome 60.0.3112 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36 --> <!-- @reported_with: web --> **URL**: https://www.alexa.com/topsites/category/Top/Recreation/Audio **Browser / Version**: Chrome 60.0.3112 **Operating System**: Windows 10 **Tested Another Browser**: Unknown **Problem type**: Design is broken **Description**: Bottom Drawer Marketing Resource is not working fine **Steps to Reproduce**: _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
design is broken url browser version chrome operating system windows tested another browser unknown problem type design is broken description bottom drawer marketing resource is not working fine steps to reproduce from with ❤️
0
21,383
29,202,230,186
IssuesEvent
2023-05-21 00:37:16
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
[Hybrid / ] Process Mapping Analyst (On-site - Belo Horizonte) at Coodesh
SALVADOR BANCO DE DADOS REQUISITOS PROCESSOS INOVAÇÃO GITHUB UMA DOCUMENTAÇÃO MODELAGEM DE PROCESSOS BPMN NEGÓCIOS ALOCADO Stale
## Job description: This is an opening from a partner of the Coodesh platform; by applying you will get access to the full information about the company and its benefits. Watch for the redirect that will take you to a url [https://coodesh.com](https://coodesh.com/vagas/process-mapping-analyst-presencial-belo-horizonte-201156492?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) with the personalized application pop-up. 👋 <p><strong>Prime Results</strong> is looking for a <strong><ins>Process Mapping Analyst</ins></strong> to join its team!</p> <p>We believe in the power of social transformation carried out by companies. We believe in the transforming power of people, combined with management and technology. We share our knowledge to solve complex problems and generate value for our clients.</p> <p><strong>Responsibilities and duties:</strong></p> <ul> <li>Use of tools and techniques for process modeling;</li> <li>Identification and analysis of the AS IS situation;</li> <li>Process documentation, construction of procedures and business rules for running processes;&nbsp;</li> <li>Process design using BPMN / EPC / VSM modeling (preferably Bizagi software), construction of process-improvement tickets, presenting changes and requirements aligned with the team involved.</li> </ul> ## Prime Results : <p>Best-selling author Simon Sinek says that most companies know what they do, but do not know why they do it. That is not our case. Prime Results is a company specialized in organizational management that uses its transformation potential in companies that generate a positive impact on society.
Today our clients make a difference in the lives of more than 250,000 Brazilians, in the areas of asset protection, health and 24-hour assistance.&nbsp;</p> <p>Our central goal is to create a creative, dynamic and engaged environment, always allied with methods, intelligent processes and plenty of innovation.</p><a href='https://coodesh.com/empresas/prime-results'>See more on the site</a> ## Skills: - Requirements analysis - Document-oriented database - BPMN ## Location: undefined ## Requirements: - Knowledge of process-modeling tools and techniques; - Experience with requirements analysis, documentation and business rules; - Knowledge of BPMN/EPC/VSM modeling; - Good oral and written communication. ## Benefits: - Meal voucher - 25.00 per day worked (Flash card); - Transportation voucher or fuel allowance; - Health insurance after the trial period; - Access to Clube Certo - benefits club; - Gympass; - Partnership with educational institutions (undergraduate and graduate programs); ## How to apply: Apply exclusively through the Coodesh platform at the following link: [Process Mapping Analyst (On-site - Belo Horizonte) at Prime Results](https://coodesh.com/vagas/process-mapping-analyst-presencial-belo-horizonte-201156492?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) After applying via the Coodesh platform and validating your login, you can follow and receive every interaction of the process there. Use the **Request Feedback** option between one stage and the next in the position you applied for. This will notify the **Recruiter** responsible for the process at the company. ## Labels #### Allocation Allocated #### Regime CLT #### Category Testing/Q.A
2.0
[Hybrid / ] Process Mapping Analyst (On-site - Belo Horizonte) at Coodesh - ## Job description: This is an opening from a partner of the Coodesh platform; by applying you will get access to the full information about the company and its benefits. Watch for the redirect that will take you to a url [https://coodesh.com](https://coodesh.com/vagas/process-mapping-analyst-presencial-belo-horizonte-201156492?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) with the personalized application pop-up. 👋 <p><strong>Prime Results</strong> is looking for a <strong><ins>Process Mapping Analyst</ins></strong> to join its team!</p> <p>We believe in the power of social transformation carried out by companies. We believe in the transforming power of people, combined with management and technology. We share our knowledge to solve complex problems and generate value for our clients.</p> <p><strong>Responsibilities and duties:</strong></p> <ul> <li>Use of tools and techniques for process modeling;</li> <li>Identification and analysis of the AS IS situation;</li> <li>Process documentation, construction of procedures and business rules for running processes;&nbsp;</li> <li>Process design using BPMN / EPC / VSM modeling (preferably Bizagi software), construction of process-improvement tickets, presenting changes and requirements aligned with the team involved.</li> </ul> ## Prime Results : <p>Best-selling author Simon Sinek says that most companies know what they do, but do not know why they do it. That is not our case. Prime Results is a company specialized in organizational management that uses its transformation potential in companies that generate a positive impact on society.
Today our clients make a difference in the lives of more than 250,000 Brazilians, in the areas of asset protection, health and 24-hour assistance.&nbsp;</p> <p>Our central goal is to create a creative, dynamic and engaged environment, always allied with methods, intelligent processes and plenty of innovation.</p><a href='https://coodesh.com/empresas/prime-results'>See more on the site</a> ## Skills: - Requirements analysis - Document-oriented database - BPMN ## Location: undefined ## Requirements: - Knowledge of process-modeling tools and techniques; - Experience with requirements analysis, documentation and business rules; - Knowledge of BPMN/EPC/VSM modeling; - Good oral and written communication. ## Benefits: - Meal voucher - 25.00 per day worked (Flash card); - Transportation voucher or fuel allowance; - Health insurance after the trial period; - Access to Clube Certo - benefits club; - Gympass; - Partnership with educational institutions (undergraduate and graduate programs); ## How to apply: Apply exclusively through the Coodesh platform at the following link: [Process Mapping Analyst (On-site - Belo Horizonte) at Prime Results](https://coodesh.com/vagas/process-mapping-analyst-presencial-belo-horizonte-201156492?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) After applying via the Coodesh platform and validating your login, you can follow and receive every interaction of the process there. Use the **Request Feedback** option between one stage and the next in the position you applied for. This will notify the **Recruiter** responsible for the process at the company. ## Labels #### Allocation Allocated #### Regime CLT #### Category Testing/Q.A
process
process mapping analyst on site belo horizonte at coodesh job description this is an opening from a partner of the coodesh platform by applying you will get access to the full information about the company and its benefits watch for the redirect that will take you to a url with the personalized application pop up 👋 prime results is looking for a process mapping analyst to join its team we believe in the power of social transformation carried out by companies we believe in the transforming power of people combined with management and technology we share our knowledge to solve complex problems and generate value for our clients responsibilities and duties use of tools and techniques for process modeling identification and analysis of the as is situation process documentation construction of procedures and business rules for running processes nbsp process design using bpmn epc vsm modeling preferably bizagi software construction of process improvement tickets presenting changes and requirements aligned with the team involved prime results best selling author simon sinek says that most companies know what they do but do not know why they do it that is not our case prime results is a company specialized in organizational management that uses its transformation potential in companies that generate a positive impact on society today our clients make a difference in the lives of more than brazilians in the areas of asset protection health and hour assistance nbsp our central goal is to create a creative dynamic and engaged environment always allied with methods intelligent processes and plenty of
innovation skills requirements analysis document oriented database bpmn location undefined requirements knowledge of process modeling tools and techniques experience with requirements analysis documentation and business rules knowledge of bpmn epc vsm modeling good oral and written communication benefits meal voucher per day worked flash card transportation voucher or fuel allowance health insurance after the trial period access to clube certo benefits club gympass partnership with educational institutions undergraduate and graduate programs how to apply apply exclusively through the coodesh platform at the following link after applying via the coodesh platform and validating your login you can follow and receive every interaction of the process there use the request feedback option between one stage and the next in the position you applied for this will notify the recruiter responsible for the process at the company labels allocation allocated regime clt category testing q a
1
7,196
10,333,482,601
IssuesEvent
2019-09-03 05:33:03
qgis/QGIS-Documentation
https://api.github.com/repos/qgis/QGIS-Documentation
closed
[FEATURE][processing] New algorithm for extracting binary blobs from fields
Automatic new feature Processing Alg
Original commit: https://github.com/qgis/QGIS/commit/e80c31e80908119ecfc0f8847bb7273119c759a6 by nyalldawson Allows users to extract binary fields to files. The filename is set via a data defined expression, so can be taken from a column value or a more complex expression.
1.0
[FEATURE][processing] New algorithm for extracting binary blobs from fields - Original commit: https://github.com/qgis/QGIS/commit/e80c31e80908119ecfc0f8847bb7273119c759a6 by nyalldawson Allows users to extract binary fields to files. The filename is set via a data defined expression, so can be taken from a column value or a more complex expression.
process
new algorithm for extracting binary blobs from fields original commit by nyalldawson allows users to extract binary fields to files the filename is set via a data defined expression so can be taken from a column value or a more complex expression
1
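The QGIS record above describes an algorithm that writes binary field values to files, with the filename taken from a per-feature expression. As a minimal stand-alone sketch of that idea (not QGIS's actual implementation — table and column names here are hypothetical), the same pattern can be shown with Python's stdlib `sqlite3`: read each row's BLOB and write it to a file whose name comes from another column.

```python
import sqlite3
import tempfile
from pathlib import Path

# Build a throwaway table with a binary (BLOB) column, then extract each
# blob to a file whose name is derived from another column's value --
# the per-row "data defined" filename idea from the record above.
tmp = Path(tempfile.mkdtemp())
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE features (name TEXT, payload BLOB)")
conn.executemany(
    "INSERT INTO features VALUES (?, ?)",
    [("a.bin", b"\x00\x01"), ("b.bin", b"\xff")],
)

for name, payload in conn.execute("SELECT name, payload FROM features"):
    (tmp / name).write_bytes(payload)  # one output file per row

print(sorted(p.name for p in tmp.iterdir()))
```

In QGIS itself the filename expression can be arbitrarily complex; this sketch only covers the simplest case of taking it directly from a column.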
9,661
3,954,553,379
IssuesEvent
2016-04-29 17:27:20
flutter/flutter
https://api.github.com/repos/flutter/flutter
opened
Remove the beginFrame() call in _runApp
affects: framework ⚠ code health
I don't understand what that call is doing, but it's critical to keep the tests running. It seems utterly redundant, though.
1.0
Remove the beginFrame() call in _runApp - I don't understand what that call is doing, but it's critical to keep the tests running. It seems utterly redundant, though.
non_process
remove the beginframe call in runapp i don t understand what that call is doing but it s critical to keep the tests running it seems utterly redundant though
0
691,910
23,716,140,467
IssuesEvent
2022-08-30 11:57:27
Aizistral-Studios/No-Chat-Reports
https://api.github.com/repos/Aizistral-Studios/No-Chat-Reports
closed
Separate config file for server list
enhancement confirmed priority: normal
Not a huge priority, but may come useful for modpacks which only want to give config or only server list for this mod.
1.0
Separate config file for server list - Not a huge priority, but may come useful for modpacks which only want to give config or only server list for this mod.
non_process
separate config file for server list not a huge priority but may come useful for modpacks which only want to give config or only server list for this mod
0
18,881
24,819,344,514
IssuesEvent
2022-10-25 15:16:44
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Transformation fails if <topicref> has @rev and references filtered-out topic
bug priority/medium preprocess
## Expected Behavior If I have a `<topicref>` with `@rev` defined in a map: ```xml <map> <title>Map</title> <topicref href="topic2.dita" rev="1.0"/> </map> ``` and that topic's content is hidden using `@props` (or other exclusion filtering): ```xml <topic id="topic2_id" rev="1.0" props="hide"> <title>Topic 2</title> <body> <!-- ... --> </body> </topic> ``` and the content is published with a filter that highlights `@rev` content _and_ excludes `@props` content: ```shell dita -i map.ditamap -f html5 -filter filter_props_rev.ditaval ``` then map publishing should complete, although messages should be issued for topics with filtered content as expected. ## Actual Behavior In this case, the XSLT processing goes down an unhandled path, with transformation failing as follows: ``` [move-meta] Error evaluating (fn:not(...)) in xsl:when/@test on line 810 column 68 of mappullImpl.xsl: [move-meta] XPTY0004: An empty sequence is not allowed as the first argument of [move-meta] dita-ot:matches-linktext-class(). Found while atomizing the first operand of '=' [move-meta] at template getmetadata on line 1037 of mappullImpl.xsl: [move-meta] invoked by xsl:call-template at file:/home/chrispy/dita-ot/plugins/org.dita.base/xsl/preprocess/mappullImpl.xsl#790 [move-meta] at template get-stuff on line 692 of mappullImpl.xsl: [move-meta] invoked by xsl:call-template at file:/home/chrispy/dita-ot/plugins/org.dita.base/xsl/preprocess/mappullImpl.xsl#179 [move-meta] In template rule with match="*[fn:contains(...)]" on line 80 of mappullImpl.xsl [move-meta] invoked by xsl:apply-templates at file:/home/chrispy/dita-ot/plugins/org.dita.base/xsl/preprocess/mappullImpl.xsl#1148 [move-meta] In template rule with match="*[fn:contains(...)]" on line 1146 of mappullImpl.xsl [move-meta] invoked by built-in template rule (text-only) Error: Failed to run pipeline: Failed to transform document: An empty sequence is not allowed as the first argument of dita-ot:matches-linktext-class(). 
Found while atomizing the first operand of '=' ``` This failure only occurs when the DITAVAL applies `@revprop` highlighting to `@rev` attributes, _and_ a map `<topicref>` attempts to apply `@rev` highlighting to a filtered-out topic. ## Possible Solution I will add something here when I've looked into the XSLT code more closely. ## Steps to Reproduce Download the attached testcase: [testcase.zip](https://github.com/dita-ot/dita-ot/files/7412543/testcase.zip) then run: ```shell # this fails - excludes @props content, applies @rev highlighting dita -i map.ditamap -f html5 -filter filter_props_rev.ditaval # this completes - excludes @props content, no @rev highlighting dita -i map.ditamap -f html5 -filter filter_props_norev.ditaval ``` ## Environment <!-- Include relevant details about the environment you experienced this in. --> * DITA-OT version: 3.6.1 * Operating system and version: Ubuntu 20.04 * How did you run DITA-OT? `dita` command * Transformation type: `html5`
1.0
Transformation fails if <topicref> has @rev and references filtered-out topic - ## Expected Behavior If I have a `<topicref>` with `@rev` defined in a map: ```xml <map> <title>Map</title> <topicref href="topic2.dita" rev="1.0"/> </map> ``` and that topic's content is hidden using `@props` (or other exclusion filtering): ```xml <topic id="topic2_id" rev="1.0" props="hide"> <title>Topic 2</title> <body> <!-- ... --> </body> </topic> ``` and the content is published with a filter that highlights `@rev` content _and_ excludes `@props` content: ```shell dita -i map.ditamap -f html5 -filter filter_props_rev.ditaval ``` then map publishing should complete, although messages should be issued for topics with filtered content as expected. ## Actual Behavior In this case, the XSLT processing goes down an unhandled path, with transformation failing as follows: ``` [move-meta] Error evaluating (fn:not(...)) in xsl:when/@test on line 810 column 68 of mappullImpl.xsl: [move-meta] XPTY0004: An empty sequence is not allowed as the first argument of [move-meta] dita-ot:matches-linktext-class(). 
Found while atomizing the first operand of '=' [move-meta] at template getmetadata on line 1037 of mappullImpl.xsl: [move-meta] invoked by xsl:call-template at file:/home/chrispy/dita-ot/plugins/org.dita.base/xsl/preprocess/mappullImpl.xsl#790 [move-meta] at template get-stuff on line 692 of mappullImpl.xsl: [move-meta] invoked by xsl:call-template at file:/home/chrispy/dita-ot/plugins/org.dita.base/xsl/preprocess/mappullImpl.xsl#179 [move-meta] In template rule with match="*[fn:contains(...)]" on line 80 of mappullImpl.xsl [move-meta] invoked by xsl:apply-templates at file:/home/chrispy/dita-ot/plugins/org.dita.base/xsl/preprocess/mappullImpl.xsl#1148 [move-meta] In template rule with match="*[fn:contains(...)]" on line 1146 of mappullImpl.xsl [move-meta] invoked by built-in template rule (text-only) Error: Failed to run pipeline: Failed to transform document: An empty sequence is not allowed as the first argument of dita-ot:matches-linktext-class(). Found while atomizing the first operand of '=' ``` This failure only occurs when the DITAVAL applies `@revprop` highlighting to `@rev` attributes, _and_ a map `<topicref>` attempts to apply `@rev` highlighting to a filtered-out topic. ## Possible Solution I will add something here when I've looked into the XSLT code more closely. ## Steps to Reproduce Download the attached testcase: [testcase.zip](https://github.com/dita-ot/dita-ot/files/7412543/testcase.zip) then run: ```shell # this fails - excludes @props content, applies @rev highlighting dita -i map.ditamap -f html5 -filter filter_props_rev.ditaval # this completes - excludes @props content, no @rev highlighting dita -i map.ditamap -f html5 -filter filter_props_norev.ditaval ``` ## Environment <!-- Include relevant details about the environment you experienced this in. --> * DITA-OT version: 3.6.1 * Operating system and version: Ubuntu 20.04 * How did you run DITA-OT? `dita` command * Transformation type: `html5`
process
transformation fails if has rev and references filtered out topic expected behavior if i have a with rev defined in a map xml map and that topic s content is hidden using props or other exclusion filtering xml topic and the content is published with a filter that highlights rev content and excludes props content shell dita i map ditamap f filter filter props rev ditaval then map publishing should complete although messages should be issued for topics with filtered content as expected actual behavior in this case the xslt processing goes down an unhandled path with transformation failing as follows error evaluating fn not in xsl when test on line column of mappullimpl xsl an empty sequence is not allowed as the first argument of dita ot matches linktext class found while atomizing the first operand of at template getmetadata on line of mappullimpl xsl invoked by xsl call template at file home chrispy dita ot plugins org dita base xsl preprocess mappullimpl xsl at template get stuff on line of mappullimpl xsl invoked by xsl call template at file home chrispy dita ot plugins org dita base xsl preprocess mappullimpl xsl in template rule with match on line of mappullimpl xsl invoked by xsl apply templates at file home chrispy dita ot plugins org dita base xsl preprocess mappullimpl xsl in template rule with match on line of mappullimpl xsl invoked by built in template rule text only error failed to run pipeline failed to transform document an empty sequence is not allowed as the first argument of dita ot matches linktext class found while atomizing the first operand of this failure only occurs when the ditaval applies revprop highlighting to rev attributes and a map attempts to apply rev highlighting to a filtered out topic possible solution i will add something here when i ve looked into the xslt code more closely steps to reproduce download the attached testcase then run shell this fails excludes props content applies rev highlighting dita i map ditamap f filter 
filter props rev ditaval this completes excludes props content no rev highlighting dita i map ditamap f filter filter props norev ditaval environment dita ot version operating system and version ubuntu how did you run dita ot dita command transformation type
1
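The DITA-OT record above fails because a `<topicref>` points at a topic whose content was filtered out, so a lookup yields an empty sequence that is then passed into a function requiring a value. A minimal Python analogue of the guard that avoids this class of error (names and data here are hypothetical, not DITA-OT code) uses stdlib `xml.etree.ElementTree`:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<map>"
    "<topicref href='topic1.dita' rev='1.0'/>"
    "<topicref href='topic2.dita' rev='1.0'/>"
    "</map>"
)
# Pretend topic2.dita was excluded by filtering, so lookups for it
# come back empty -- analogous to the empty sequence in the report.
available = {"topic1.dita": "Topic 1"}

titles = []
for ref in doc.findall("topicref"):
    title = available.get(ref.get("href"))  # None when filtered out
    if title is None:
        # Guard the empty case instead of passing it straight into
        # code that assumes a non-empty value exists.
        continue
    titles.append(title)

print(titles)
```

The fix in the XSLT would follow the same shape: test for the empty sequence before calling `dita-ot:matches-linktext-class()`.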
237,576
7,761,987,281
IssuesEvent
2018-06-01 11:58:17
gluster/glusterd2
https://api.github.com/repos/gluster/glusterd2
closed
volume listing: null is being returned in the body when there are no volumes
FW: ReST FW: Volume Management bug priority: low
```sh $ curl -i -X GET http://127.0.0.1:24007/v1/volumes HTTP/1.1 200 OK Content-Type: application/json; charset=UTF-8 X-Gluster-Cluster-Id: e60591eb-c6a6-4e21-ad47-eb3dd667775f X-Gluster-Node-Id: 1f982c00-55af-40ab-825d-399476e41abb X-Request-Id: 19e1e99e-bec5-421f-8f19-ef504ad5d43c Date: Fri, 01 Jun 2018 05:21:31 GMT Content-Length: 5 null ``` When there are no volumes, the client should get an empty list and not null.
1.0
volume listing: null is being returned in the body when there are no volumes - ```sh $ curl -i -X GET http://127.0.0.1:24007/v1/volumes HTTP/1.1 200 OK Content-Type: application/json; charset=UTF-8 X-Gluster-Cluster-Id: e60591eb-c6a6-4e21-ad47-eb3dd667775f X-Gluster-Node-Id: 1f982c00-55af-40ab-825d-399476e41abb X-Request-Id: 19e1e99e-bec5-421f-8f19-ef504ad5d43c Date: Fri, 01 Jun 2018 05:21:31 GMT Content-Length: 5 null ``` When there are no volumes, the client should get an empty list and not null.
non_process
volume listing null is being returned in the body when there are no volumes sh curl i x get http ok content type application json charset utf x gluster cluster id x gluster node id x request id date fri jun gmt content length null when there are no volumes the client should get an empty list and not null
0
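The glusterd2 record above is the classic "empty collection serializes as `null`" bug: when no volumes exist, the handler serializes a missing value instead of an empty list. A hedged sketch of the fix (glusterd2 is written in Go; this Python stand-in only illustrates the normalization, and `list_volumes` is a hypothetical name):

```python
import json

def list_volumes(volumes=None):
    # Serializing None yields JSON "null"; normalizing to an empty
    # list yields "[]", which is what API clients expect to receive.
    return json.dumps(volumes if volumes is not None else [])

print(list_volumes())         # no volumes -> "[]", not "null"
print(list_volumes(["gv0"]))
```

In Go the equivalent is making sure a nil slice is replaced with a zero-length slice before `json.Marshal`.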