Column schema (dtype and observed range/length from the dataset summary):

- `Unnamed: 0` — int64, 0 to 832k
- `id` — float64, 2.49B to 32.1B
- `type` — string, 1 class (`IssuesEvent`)
- `created_at` — string, length 19
- `repo` — string, length 5 to 112
- `repo_url` — string, length 34 to 141
- `action` — string, 3 classes
- `title` — string, length 1 to 1k
- `labels` — string, length 4 to 1.38k
- `body` — string, length 1 to 262k
- `index` — string, 16 classes
- `text_combine` — string, length 96 to 262k
- `label` — string, 2 classes
- `text` — string, length 96 to 252k
- `binary_label` — int64, 0 or 1

Unnamed: 0 | id | type | created_at | repo | repo_url | action | title | labels | body | index | text_combine | label | text | binary_label
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
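The `label`/`binary_label` pairing can be sketched as follows. The mapping is inferred from the rows below, where `non_priority` always co-occurs with `0`; treating every other class as `1` is an assumption:

```python
def to_binary_label(label: str) -> int:
    """Map the string class to the 0/1 target.

    Assumption: "non_priority" is the 0 class, as in every row shown
    in this dump; any other class maps to 1.
    """
    return 0 if label == "non_priority" else 1

print(to_binary_label("non_priority"))  # 0
```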
388,944 | 26,789,794,461 | IssuesEvent | 2023-02-01 07:28:36 | mozilla-it/ctms-api | https://api.github.com/repos/mozilla-it/ctms-api | closed | CTMS code comparison over Time-Ranges: | documentation | ----
Compare main to...
- 1 day ago: https://github.com/mozilla-it/ctms-api/compare/main%40{1.day.ago}...main
- 2 weeks ago: https://github.com/mozilla-it/ctms-api/compare/main%40%7B2.weeks.ago%7D...main
- 1 month ago: https://github.com/mozilla-it/ctms-api/compare/main%40%7B1.month.ago%7D...main
- 6 months ago: https://github.com/mozilla-it/ctms-api/compare/main%40%7B6.months.ago%7D...main
- 1 year ago: https://github.com/mozilla-it/ctms-api/compare/main%40%7B1.year.ago%7D...main
###### provided for reference of progress/work/code over time
---- | 1.0 | [text_combine: duplicate of the title and body above] | non_priority | [text: lowercased/stripped duplicate] | 0 |
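The compare links in the row above are just the literal ref `main@{<range>.ago}` percent-encoded (`%40` for `@`, `%7B`/`%7D` for the braces). A small sketch that rebuilds them:

```python
from urllib.parse import quote

def compare_url(repo: str, age: str) -> str:
    """Build a GitHub compare URL for main vs. main@{<age>},
    e.g. age = "1.month.ago"."""
    ref = quote(f"main@{{{age}}}", safe="")  # percent-encode @, {, }
    return f"https://github.com/{repo}/compare/{ref}...main"

print(compare_url("mozilla-it/ctms-api", "2.weeks.ago"))
# https://github.com/mozilla-it/ctms-api/compare/main%40%7B2.weeks.ago%7D...main
```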
221,282 | 24,608,423,745 | IssuesEvent | 2022-10-14 18:37:34 | adoptium/ci-jenkins-pipelines | https://api.github.com/repos/adoptium/ci-jenkins-pipelines | closed | Request to make Jenkins job permissions configurable | enhancement security | https://github.com/adoptium/ci-jenkins-pipelines/pull/154
https://github.com/adoptium/ci-jenkins-pipelines/pull/155
These 2 PRs introduced changes specific to Adopt that do not apply to other consumers of the Jenkins pipelines. Can these be reworked to be configurable via the defaults.json file? Minimum would be a boolean but ideally also moving the hardcoded team name(s) into the config file. | True | [text_combine: duplicate of the title and body above] | non_priority | [text: lowercased/stripped duplicate] | 0 |
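What the requested defaults.json knob could look like — the key names below are invented for illustration; the issue only asks for "a boolean" plus "the hardcoded team name(s)":

```python
import json

# Hypothetical defaults.json fragment; real key names would be
# chosen by the ci-jenkins-pipelines maintainers.
defaults = json.loads("""
{
  "enableJobPermissions": true,
  "jobPermissionTeams": ["adoptium-team"]
}
""")

if defaults.get("enableJobPermissions", False):
    teams = defaults.get("jobPermissionTeams", [])
    print(f"restricting Jenkins jobs to teams: {teams}")
```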
23,890 | 11,983,204,926 | IssuesEvent | 2020-04-07 14:06:58 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | [APM] Service map refresh failing to remove absent services | Feature:Service Maps [zube]: In Progress apm-test-plan-7.7.0 bug v7.7.0 | In the service map, when applying short time filters (e.g. `now-1s`), services are only added, making the map larger with each refresh; it should instead omit services that are not returned by the API for the short range.
This is because services are rendered additively, when the rendered elements should be updated to remove nodes that are no longer included in the list of services returned by the API.
This relates to the issue described in #62300. | 1.0 | [text_combine: duplicate of the title and body above] | non_priority | [text: lowercased/stripped duplicate] | 0 |
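The fix the issue describes — reconcile the rendered elements against the latest API response instead of only adding — reduces to a set difference. A plain-Python sketch, not Kibana's actual code (service names are made up):

```python
def reconcile(rendered_ids, fetched_ids):
    """Return which service nodes to add and which to remove so the
    rendered map matches the latest API response."""
    rendered, fetched = set(rendered_ids), set(fetched_ids)
    return {"add": fetched - rendered, "remove": rendered - fetched}

# A service rendered previously but absent from the new short-range
# response should be removed rather than left on the map.
print(reconcile({"opbeans-java", "opbeans-node"}, {"opbeans-node"}))
```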
15,027 | 18,740,545,908 | IssuesEvent | 2021-11-04 13:08:51 | qgis/QGIS | https://api.github.com/repos/qgis/QGIS | reopened | Computed Variables for Algorithm Outputs are NULL in Graphical Modeler | Processing Bug Modeller | ### What is the bug or the crash?
I have tried to use the computed variables for an algorithm output (e.g. `@Reproject_layer_OUTPUT_minx `) in a `Pre-calculated value` field in a subsequent algorithm. All but those coming from the input layer are always NULL.
### Steps to reproduce the issue
1. Start Graphical Modeler
2. Add Input "Vector Layer"
3. Add "Reproject layer" (or other tools)
4. Add "Create grid"
5. For "Grid extent" choose "Pre-calculated value" and paste
```
to_string( @Reproject_layer_OUTPUT_minx ) || ',' ||
to_string( @Reproject_layer_OUTPUT_maxx ) || ',' ||
to_string( @Reproject_layer_OUTPUT_miny ) || ',' ||
to_string( @Reproject_layer_OUTPUT_maxy )
```
Output:
```
Input Parameters:
{ CRS: QgsCoordinateReferenceSystem('EPSG:32634'), EXTENT: None, HOVERLAY: 0, HSPACING: 24000, OUTPUT: 'TEMPORARY_OUTPUT', TYPE: 2, VOVERLAY: 0, VSPACING: 24000 }
Horizontal spacing is too large for the covered area.
Error encountered while running Create grid
Execution failed after 0.12 seconds
```
6. Trying again with the following works:
```
to_string(@CountryLayer_minx ) || ', ' ||
to_string( @CountryLayer_maxx ) || ', ' ||
to_string( @CountryLayer_miny ) || ', ' ||
to_string( @CountryLayer_maxy )
```
Output:
```
{ CRS: QgsCoordinateReferenceSystem('EPSG:32634')', EXTENT: '4744834.27221361, 5402446.1378068905, 4125728.6944347215, 5306091.750087504', HOVERLAY: 0, HSPACING: 24000, OUTPUT: 'memory:Grid', TYPE: 2, VOVERLAY: 0, VSPACING: 24000 }
```
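A plain-Python stand-in (not the QGIS expression engine) for the failure mode: when a variable is missing from the evaluation scope it comes back NULL, and in QGIS expressions concatenating with `||` propagates NULL, so the whole extent string evaluates to nothing — matching the `EXTENT: None` in the failing output above:

```python
def concat(*parts):
    """QGIS-expression-style '||': NULL (None) in any operand
    makes the whole result NULL."""
    return None if any(p is None for p in parts) else "".join(parts)

scope = {"CountryLayer_minx": "4744834.27"}  # output-layer variables absent

# Missing @Reproject_layer_OUTPUT_minx -> whole expression is NULL:
print(concat(scope.get("Reproject_layer_OUTPUT_minx"), ", "))
# Input-layer variable is present -> the string builds normally:
print(concat(scope.get("CountryLayer_minx"), ", "))
```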
### Versions
QGIS version | 3.22.0-Białowieża | QGIS code revision | d9022691f1
-- | -- | -- | --
Qt version | 5.12.8
Python version | 3.8.10
GDAL/OGR version | 3.0.4
PROJ version | 6.3.1
EPSG Registry database version | v9.8.6 (2020-01-22)
Compiled against GEOS | 3.8.0-CAPI-1.13.1 | Running against GEOS | 3.8.0-CAPI-1.13.1
SQLite version | 3.31.1
PDAL version | 2.0.1
PostgreSQL client version | 12.8 (Ubuntu 12.8-0ubuntu0.20.04.1)
SpatiaLite version | 4.3.0a
QWT version | 6.1.4
QScintilla2 version | 2.11.2
OS version | Ubuntu 20.04.3 LTS

Active Python plugins
valuetool | 3.0.8
quick_map_services | 0.19.11.1
zoom_level | 0.1
QuickWKT | 3.1
processing | 2.12.99
db_manager | 0.1.20
grassprovider | 2.12.99
sagaprovider | 2.12.99
MetaSearch | 0.3.5
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
Test QGZ project: [test.qgz.zip](https://github.com/qgis/QGIS/files/7473540/test.qgz.zip)
The generated Python script:
```python
"""
Model exported as python.
Name : test_out_grid
Group :
With QGIS : 31803
"""
from qgis.core import QgsProcessing
from qgis.core import QgsProcessingAlgorithm
from qgis.core import QgsProcessingMultiStepFeedback
from qgis.core import QgsProcessingParameterVectorLayer
from qgis.core import QgsProcessingParameterFeatureSink
from qgis.core import QgsProcessingParameterBoolean
from qgis.core import QgsCoordinateReferenceSystem
from qgis.core import QgsExpression
import processing
class Test_out_grid(QgsProcessingAlgorithm):
def initAlgorithm(self, config=None):
self.addParameter(QgsProcessingParameterVectorLayer('CountryLayer', 'Country Layer', types=[QgsProcessing.TypeVectorPolygon], defaultValue=None))
self.addParameter(QgsProcessingParameterFeatureSink('Out_grid', 'out_grid', type=QgsProcessing.TypeVectorPolygon, createByDefault=True, defaultValue=None))
self.addParameter(QgsProcessingParameterBoolean('VERBOSE_LOG', 'Verbose logging', optional=True, defaultValue=False))
def processAlgorithm(self, parameters, context, model_feedback):
# Use a multi-step feedback, so that individual child algorithm progress reports are adjusted for the
# overall progress through the model
feedback = QgsProcessingMultiStepFeedback(2, model_feedback)
results = {}
outputs = {}
# Create grid
alg_params = {
'CRS': QgsCoordinateReferenceSystem('EPSG:32634'),
'EXTENT': QgsExpression('to_string(@Reproject_layer_OUTPUT_minx ) || \', \' ||\nto_string( @Reproject_layer_OUTPUT_maxx ) || \', \' ||\nto_string( @Reproject_layer_OUTPUT_miny ) || \', \' ||\nto_string( @Reproject_layer_OUTPUT_maxy )').evaluate(),
'HOVERLAY': 0,
'HSPACING': 24000,
'TYPE': 2,
'VOVERLAY': 0,
'VSPACING': 24000,
'OUTPUT': parameters['Out_grid']
}
outputs['CreateGrid'] = processing.run('native:creategrid', alg_params, context=context, feedback=feedback, is_child_algorithm=True)
results['Out_grid'] = outputs['CreateGrid']['OUTPUT']
feedback.setCurrentStep(1)
if feedback.isCanceled():
return {}
# Reproject layer
alg_params = {
'INPUT': parameters['CountryLayer'],
'OPERATION': '',
'TARGET_CRS': QgsCoordinateReferenceSystem('EPSG:32634'),
'OUTPUT': QgsProcessing.TEMPORARY_OUTPUT
}
outputs['ReprojectLayer'] = processing.run('native:reprojectlayer', alg_params, context=context, feedback=feedback, is_child_algorithm=True)
return results
def name(self):
return 'test_out_grid'
def displayName(self):
return 'test_out_grid'
def group(self):
return ''
def groupId(self):
return ''
def createInstance(self):
return Test_out_grid()
``` | 1.0 | [text_combine: duplicate of the title and body above] | non_priority | [text: lowercased/stripped duplicate] | 0 |
199,091 | 15,733,470,931 | IssuesEvent | 2021-03-29 19:36:25 | albin-johansson/centurion | https://api.github.com/repos/albin-johansson/centurion | closed | Overhaul documentation. | documentation | - [X] Overhaul Doxygen documentation.
- [ ] Create RtD documentation.
- [X] `area`
- [X] `base_path`
- [ ] `basic_joystick`
- [x] `basic_renderer`
- [x] `basic_window`
- [x] `battery`
- [x] `blend_mode`
- [x] `button_state`
- [x] `library`, `config`
- [x] Clipboard
- [x] `color`
- [x] Color constants
- [x] `condition`
- [x] `controller`
- [x] Counter
- [x] CPU
- [ ] `cursor`
- [x] Events
- [x] `event_type`
- [x] `font`
- [x] `font_cache`
- [x] Hints
- [ ] `joystick`
- [ ] `joystick_handle`
- [x] `key_code`
- [x] `key_modifier`
- [ ] `key_state`
- [x] Logging
- [x] Message boxes
- [x] `mouse_button`
- [ ] `mouse_state`
- [x] `music`
- [x] `mutex`
- [x] `pixel_format`
- [x] `platform`
- [x] Point
- [x] `pref_path`
- [x] RAM
- [x] Rect
- [x] `renderer`
- [x] `renderer_handle`
- [x] `scale_mode`
- [x] `scan_code`
- [x] `scoped_lock`
- [x] Screen
- [ ] `sdl_string`
- [x] `semaphore`
- [x] `sound_effect`
- [x] `surface`
- [x] `texture`
- [x] `texture_access`
- [x] `thread`
- [ ] Touch
- [x] `try_lock`
- [x] Types
- [ ] `unicode_string`
- [x] Video
- [x] `window`
- [x] `window_handle`
- [x] Window Utilities
| 1.0 | [text_combine: duplicate of the title and body above] | non_priority | [text: lowercased/stripped duplicate] | 0 |
391,345 | 11,572,220,994 | IssuesEvent | 2020-02-20 23:25:13 | googleapis/python-texttospeech | https://api.github.com/repos/googleapis/python-texttospeech | closed | Synthesis failed for python-texttospeech | api: texttospeech autosynth failure priority: p1 type: bug | Hello! Autosynth couldn't regenerate python-texttospeech. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
On branch autosynth
nothing to commit, working tree clean
HEAD detached at FETCH_HEAD
nothing to commit, working tree clean
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6aec9c34db0e4be221cdaf6faba27bdc07cfea846808b3d3b964dfce3a9a0f9b
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/texttospeech/artman_texttospeech_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/texttospeech-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/texttospeech/v1beta1/cloud_tts.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/texttospeech-v1beta1/google/cloud/texttospeech_v1beta1/proto/cloud_tts.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/texttospeech-v1beta1/google/cloud/texttospeech_v1beta1/proto.
synthtool > Running generator for google/cloud/texttospeech/artman_texttospeech_v1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/texttospeech-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/texttospeech/v1/cloud_tts.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/texttospeech-v1/google/cloud/texttospeech_v1/proto/cloud_tts.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/texttospeech-v1/google/cloud/texttospeech_v1/proto.
synthtool > No replacements made in **/gapic/*_client.py for pattern \\"(.+?)-\*\\", maybe replacement is not longer needed?
.coveragerc
.flake8
.github/CONTRIBUTING.md
.github/ISSUE_TEMPLATE/bug_report.md
.github/ISSUE_TEMPLATE/feature_request.md
.github/ISSUE_TEMPLATE/support_request.md
.github/PULL_REQUEST_TEMPLATE.md
.github/release-please.yml
.gitignore
.kokoro/build.sh
.kokoro/continuous/common.cfg
.kokoro/continuous/continuous.cfg
.kokoro/docs/common.cfg
.kokoro/docs/docs.cfg
.kokoro/presubmit/common.cfg
.kokoro/presubmit/presubmit.cfg
.kokoro/publish-docs.sh
.kokoro/release.sh
.kokoro/release/common.cfg
.kokoro/release/release.cfg
.kokoro/trampoline.sh
CODE_OF_CONDUCT.md
CONTRIBUTING.rst
LICENSE
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
docs/conf.py.j2
noxfile.py.j2
renovate.json
setup.cfg
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
Error: pip is not installed into the virtualenv, it is located at /tmpfs/src/git/autosynth/env/bin/pip. Pass external=True into run() to explicitly allow this.
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 46, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/bc296324-7867-4eae-a113-8090e6bcc834).
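The actual failure is nox's guard: the resolved `pip` lived in the outer autosynth environment (`/tmpfs/src/git/autosynth/env/bin/pip`), not inside the freshly created `.nox/blacken` virtualenv, and nox refuses to run out-of-venv binaries unless `external=True` is passed, exactly as the error message says. A rough stdlib sketch of that check — the function name and shape are mine, not nox's:

```python
from pathlib import Path
import shutil

def is_internal(cmd: str, venv_bin: str, search_path: str) -> bool:
    """Mimic nox's safety check: does `cmd` resolve to a binary
    located inside the session virtualenv's bin directory?"""
    resolved = shutil.which(cmd, path=search_path)
    if resolved is None:
        return False  # not found at all
    return Path(venv_bin).resolve() in Path(resolved).resolve().parents

# A pip resolved from the outer environment is *not* under
# .nox/blacken/bin, so nox errors out instead of running it.
```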
| 1.0 | [text_combine: duplicate of the title and body above]
| priority | synthesis failed for python texttospeech hello autosynth couldn t regenerate python texttospeech broken heart here s the output from running synth py cloning into working repo switched to branch autosynth running synthtool synthtool executing tmpfs src git autosynth working repo synth py on branch autosynth nothing to commit working tree clean head detached at fetch head nothing to commit working tree clean synthtool ensuring dependencies synthtool pulling artman image latest pulling from googleapis artman digest status image is up to date for googleapis artman latest synthtool cloning googleapis synthtool running generator for google cloud texttospeech artman texttospeech yaml synthtool generated code into home kbuilder cache synthtool googleapis artman genfiles python texttospeech synthtool copy home kbuilder cache synthtool googleapis google cloud texttospeech cloud tts proto to home kbuilder cache synthtool googleapis artman genfiles python texttospeech google cloud texttospeech proto cloud tts proto synthtool placed proto files into home kbuilder cache synthtool googleapis artman genfiles python texttospeech google cloud texttospeech proto synthtool running generator for google cloud texttospeech artman texttospeech yaml synthtool generated code into home kbuilder cache synthtool googleapis artman genfiles python texttospeech synthtool copy home kbuilder cache synthtool googleapis google cloud texttospeech cloud tts proto to home kbuilder cache synthtool googleapis artman genfiles python texttospeech google cloud texttospeech proto cloud tts proto synthtool placed proto files into home kbuilder cache synthtool googleapis artman genfiles python texttospeech google cloud texttospeech proto synthtool no replacements made in gapic client py for pattern maybe replacement is not longer needed coveragerc github contributing md github issue template bug report md github issue template feature request md github issue template support request md github pull 
request template md github release please yml gitignore kokoro build sh kokoro continuous common cfg kokoro continuous continuous cfg kokoro docs common cfg kokoro docs docs cfg kokoro presubmit common cfg kokoro presubmit presubmit cfg kokoro publish docs sh kokoro release sh kokoro release common cfg kokoro release release cfg kokoro trampoline sh code of conduct md contributing rst license manifest in docs static custom css docs templates layout html docs conf py noxfile py renovate json setup cfg running session blacken creating virtual environment virtualenv using in nox blacken pip install black error pip is not installed into the virtualenv it is located at tmpfs src git autosynth env bin pip pass external true into run to explicitly allow this session blacken failed synthtool failed executing nox s blacken none synthtool wrote metadata to synth metadata traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo synth py line in s shell run hide output false file tmpfs src git autosynth env lib site packages synthtool shell py line in run raise exc file tmpfs src git 
autosynth env lib site packages synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status synthesis failed google internal developers can see the full log | 1 |
238,000 | 18,215,194,034 | IssuesEvent | 2021-09-30 02:46:38 | portainer/portainer | https://api.github.com/repos/portainer/portainer | closed | Is the user's password supposed to be returned via Portainer API? | kind/enhancement area/documentation internal/jira | Hi guys,
Actually, my question is practically reflected in a $subject
Recently I tried to retrieve the password of a certain user via API (using a GET request to the ``/users`` or ``/users/{id}`` API endpoints) for further use in my code.
But with no luck. I just got a response:
```
'response' => {
'PortainerAuthorizations' => undef,
'Role' => 2,
'Id' => 2,
'EndpointAuthorizations' => undef,
'Username' => 'filosof'
},
```
However, if I get it right, the documentation states that the password shall be present in a response:
https://app.swaggerhub.com/apis/deviantony/Portainer/2.0.1#/users/UserList
https://app.swaggerhub.com/apis/deviantony/Portainer/2.0.1#/users/UserInspect
Thus I have to ask you for some help to sort out if it is a bug or a feature, or some misconfiguration on my side.
Thank you in advance. | 1.0 | Is the user's password supposed to be returned via Portainer API? - Hi guys,
Actually, my question is practically reflected in a $subject
Recently I tried to retrieve the password of a certain user via API (using a GET request to the ``/users`` or ``/users/{id}`` API endpoints) for further use in my code.
But with no luck. I just got a response:
```
'response' => {
'PortainerAuthorizations' => undef,
'Role' => 2,
'Id' => 2,
'EndpointAuthorizations' => undef,
'Username' => 'filosof'
},
```
However, if I get it right, the documentation states that the password shall be present in a response:
https://app.swaggerhub.com/apis/deviantony/Portainer/2.0.1#/users/UserList
https://app.swaggerhub.com/apis/deviantony/Portainer/2.0.1#/users/UserInspect
Thus I have to ask you for some help to sort out if it is a bug or a feature, or some misconfiguration on my side.
Thank you in advance. | non_priority | is the user s password supposed to be returned via portainer api hi guys actually my question is practically reflected in a subject recently i tried to retrieve the password of a certain user via api using get request to the users or users id api points for further use in my code but with no luck i just got a response response portainerauthorizations undef role id endpointauthorizations undef username filosof however if i get it right the documentation states that the password shall be present in a response thus i have to ask you for some help to sort out if it is a bug or a feature or some misconfiguration on my side thank you in advance | 0 |
426,566 | 12,374,199,793 | IssuesEvent | 2020-05-19 00:49:37 | kubeflow/pipelines | https://api.github.com/repos/kubeflow/pipelines | closed | mlpipeline-metrics not found | area/backend kind/bug priority/p0 status/triaged | ### What steps did you take:
I just install v1.0 and kick off Exit-Handler pipeline.
### What happened:
Checking pipeline logs, I get
```
I0224 21:55:00.245907 1 interceptor.go:29] /api.RunService/ReadArtifact handler starting
I0224 21:55:00.248584 1 error.go:218] ResourceNotFoundError: artifact runs/9c61556e-8ba8-4ad1-b579-639678a0abd8/nodes/exit-handler-mc8v6/artifacts/mlpipeline-metrics not found.
github.com/kubeflow/pipelines/backend/src/common/util.NewResourceNotFoundError
backend/src/common/util/error.go:150
github.com/kubeflow/pipelines/backend/src/apiserver/resource.(*ResourceManager).ReadArtifact
backend/src/apiserver/resource/resource_manager.go:860
github.com/kubeflow/pipelines/backend/src/apiserver/server.(*RunServer).ReadArtifact
backend/src/apiserver/server/run_server.go:134
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler.func1
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1395
main.apiServerInterceptor
backend/src/apiserver/interceptor.go:30
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1397
google.golang.org/grpc.(*Server).processUnaryRPC
external/org_golang_google_grpc/server.go:966
google.golang.org/grpc.(*Server).handleStream
external/org_golang_google_grpc/server.go:1245
google.golang.org/grpc.(*Server).serveStreams.func1.1
external/org_golang_google_grpc/server.go:685
runtime.goexit
GOROOT/src/runtime/asm_amd64.s:1333
failed to read artifact 'run_id:"9c61556e-8ba8-4ad1-b579-639678a0abd8" node_id:"exit-handler-mc8v6" artifact_name:"mlpipeline-metrics" '.
github.com/kubeflow/pipelines/backend/src/common/util.(*UserError).wrapf
backend/src/common/util/error.go:206
github.com/kubeflow/pipelines/backend/src/common/util.Wrapf
backend/src/common/util/error.go:231
github.com/kubeflow/pipelines/backend/src/apiserver/server.(*RunServer).ReadArtifact
backend/src/apiserver/server/run_server.go:137
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler.func1
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1395
main.apiServerInterceptor
backend/src/apiserver/interceptor.go:30
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1397
google.golang.org/grpc.(*Server).processUnaryRPC
external/org_golang_google_grpc/server.go:966
google.golang.org/grpc.(*Server).handleStream
external/org_golang_google_grpc/server.go:1245
google.golang.org/grpc.(*Server).serveStreams.func1.1
external/org_golang_google_grpc/server.go:685
runtime.goexit
GOROOT/src/runtime/asm_amd64.s:1333
/api.RunService/ReadArtifact call failed
github.com/kubeflow/pipelines/backend/src/common/util.(*UserError).wrapf
backend/src/common/util/error.go:206
github.com/kubeflow/pipelines/backend/src/common/util.Wrapf
backend/src/common/util/error.go:231
main.apiServerInterceptor
backend/src/apiserver/interceptor.go:32
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1397
google.golang.org/grpc.(*Server).processUnaryRPC
external/org_golang_google_grpc/server.go:966
google.golang.org/grpc.(*Server).handleStream
external/org_golang_google_grpc/server.go:1245
google.golang.org/grpc.(*Server).serveStreams.func1.1
external/org_golang_google_grpc/server.go:685
runtime.goexit
GOROOT/src/runtime/asm_amd64.s:1333
```

### What did you expect to happen:
It is supposed to have a `run/` in the minio bucket.
### Environment:
How did you deploy Kubeflow Pipelines (KFP)?
using kfctl
KFP version: 0.2.0 docker-pullable://gcr.io/ml-pipeline/api-server@sha256:8149af1522a5fd75292dcb8630840dc23541d8c22f6d6938d3f45b6fd759872e
KFP SDK version: n/a
### Anything else you would like to add:
[Miscellaneous information that will assist in solving the issue.]
/kind bug
/area backend
| 1.0 | mlpipeline-metrics not found - ### What steps did you take:
I just install v1.0 and kick off Exit-Handler pipeline.
### What happened:
Checking pipeline logs, I get
```
I0224 21:55:00.245907 1 interceptor.go:29] /api.RunService/ReadArtifact handler starting
I0224 21:55:00.248584 1 error.go:218] ResourceNotFoundError: artifact runs/9c61556e-8ba8-4ad1-b579-639678a0abd8/nodes/exit-handler-mc8v6/artifacts/mlpipeline-metrics not found.
github.com/kubeflow/pipelines/backend/src/common/util.NewResourceNotFoundError
backend/src/common/util/error.go:150
github.com/kubeflow/pipelines/backend/src/apiserver/resource.(*ResourceManager).ReadArtifact
backend/src/apiserver/resource/resource_manager.go:860
github.com/kubeflow/pipelines/backend/src/apiserver/server.(*RunServer).ReadArtifact
backend/src/apiserver/server/run_server.go:134
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler.func1
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1395
main.apiServerInterceptor
backend/src/apiserver/interceptor.go:30
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1397
google.golang.org/grpc.(*Server).processUnaryRPC
external/org_golang_google_grpc/server.go:966
google.golang.org/grpc.(*Server).handleStream
external/org_golang_google_grpc/server.go:1245
google.golang.org/grpc.(*Server).serveStreams.func1.1
external/org_golang_google_grpc/server.go:685
runtime.goexit
GOROOT/src/runtime/asm_amd64.s:1333
failed to read artifact 'run_id:"9c61556e-8ba8-4ad1-b579-639678a0abd8" node_id:"exit-handler-mc8v6" artifact_name:"mlpipeline-metrics" '.
github.com/kubeflow/pipelines/backend/src/common/util.(*UserError).wrapf
backend/src/common/util/error.go:206
github.com/kubeflow/pipelines/backend/src/common/util.Wrapf
backend/src/common/util/error.go:231
github.com/kubeflow/pipelines/backend/src/apiserver/server.(*RunServer).ReadArtifact
backend/src/apiserver/server/run_server.go:137
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler.func1
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1395
main.apiServerInterceptor
backend/src/apiserver/interceptor.go:30
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1397
google.golang.org/grpc.(*Server).processUnaryRPC
external/org_golang_google_grpc/server.go:966
google.golang.org/grpc.(*Server).handleStream
external/org_golang_google_grpc/server.go:1245
google.golang.org/grpc.(*Server).serveStreams.func1.1
external/org_golang_google_grpc/server.go:685
runtime.goexit
GOROOT/src/runtime/asm_amd64.s:1333
/api.RunService/ReadArtifact call failed
github.com/kubeflow/pipelines/backend/src/common/util.(*UserError).wrapf
backend/src/common/util/error.go:206
github.com/kubeflow/pipelines/backend/src/common/util.Wrapf
backend/src/common/util/error.go:231
main.apiServerInterceptor
backend/src/apiserver/interceptor.go:32
github.com/kubeflow/pipelines/backend/api/go_client._RunService_ReadArtifact_Handler
bazel-out/k8-opt/bin/backend/api/linux_amd64_stripped/go_client_go_proto%/github.com/kubeflow/pipelines/backend/api/go_client/run.pb.go:1397
google.golang.org/grpc.(*Server).processUnaryRPC
external/org_golang_google_grpc/server.go:966
google.golang.org/grpc.(*Server).handleStream
external/org_golang_google_grpc/server.go:1245
google.golang.org/grpc.(*Server).serveStreams.func1.1
external/org_golang_google_grpc/server.go:685
runtime.goexit
GOROOT/src/runtime/asm_amd64.s:1333
```

### What did you expect to happen:
It suppose to have a `run/` in minio bucket.
### Environment:
How did you deploy Kubeflow Pipelines (KFP)?
using kfctl
KFP version: 0.2.0 docker-pullable://gcr.io/ml-pipeline/api-server@sha256:8149af1522a5fd75292dcb8630840dc23541d8c22f6d6938d3f45b6fd759872e
KFP SDK version: n/a
### Anything else you would like to add:
[Miscellaneous information that will assist in solving the issue.]
/kind bug
/area backend
| priority | mlpipeline metrics not found what steps did you take i just install and kick off exit handler pipeline what happened checking pipeline logs i get interceptor go api runservice readartifact handler starting error go resourcenotfounderror artifact runs nodes exit handler artifacts mlpipeline metrics not found github com kubeflow pipelines backend src common util newresourcenotfounderror backend src common util error go github com kubeflow pipelines backend src apiserver resource resourcemanager readartifact backend src apiserver resource resource manager go github com kubeflow pipelines backend src apiserver server runserver readartifact backend src apiserver server run server go github com kubeflow pipelines backend api go client runservice readartifact handler bazel out opt bin backend api linux stripped go client go proto github com kubeflow pipelines backend api go client run pb go main apiserverinterceptor backend src apiserver interceptor go github com kubeflow pipelines backend api go client runservice readartifact handler bazel out opt bin backend api linux stripped go client go proto github com kubeflow pipelines backend api go client run pb go google golang org grpc server processunaryrpc external org golang google grpc server go google golang org grpc server handlestream external org golang google grpc server go google golang org grpc server servestreams external org golang google grpc server go runtime goexit goroot src runtime asm s failed to read artifact run id node id exit handler artifact name mlpipeline metrics github com kubeflow pipelines backend src common util usererror wrapf backend src common util error go github com kubeflow pipelines backend src common util wrapf backend src common util error go github com kubeflow pipelines backend src apiserver server runserver readartifact backend src apiserver server run server go github com kubeflow pipelines backend api go client runservice readartifact handler bazel out opt bin backend 
api linux stripped go client go proto github com kubeflow pipelines backend api go client run pb go main apiserverinterceptor backend src apiserver interceptor go github com kubeflow pipelines backend api go client runservice readartifact handler bazel out opt bin backend api linux stripped go client go proto github com kubeflow pipelines backend api go client run pb go google golang org grpc server processunaryrpc external org golang google grpc server go google golang org grpc server handlestream external org golang google grpc server go google golang org grpc server servestreams external org golang google grpc server go runtime goexit goroot src runtime asm s api runservice readartifact call failed github com kubeflow pipelines backend src common util usererror wrapf backend src common util error go github com kubeflow pipelines backend src common util wrapf backend src common util error go main apiserverinterceptor backend src apiserver interceptor go github com kubeflow pipelines backend api go client runservice readartifact handler bazel out opt bin backend api linux stripped go client go proto github com kubeflow pipelines backend api go client run pb go google golang org grpc server processunaryrpc external org golang google grpc server go google golang org grpc server handlestream external org golang google grpc server go google golang org grpc server servestreams external org golang google grpc server go runtime goexit goroot src runtime asm s what did you expect to happen it suppose to have a run in minio bucket environment how did you deploy kubeflow pipelines kfp using kfctl kfp version docker pullable gcr io ml pipeline api server kfp sdk version n a anything else you would like to add kind bug area backend | 1 |
87,054 | 25,018,905,776 | IssuesEvent | 2022-11-03 21:38:18 | aws-amplify/amplify-hosting | https://api.github.com/repos/aws-amplify/amplify-hosting | closed | Different build command for different branch | question frontend-builds response-requested | **Please describe which feature you have a question about?**
Configuring Build Setting
**Provide additional details**
I have 1 app with two different environments and domains. each environment has different domain. for example:
- env `blue` will run in `app.blue` domain (for dev)
- env `red` will run in `app.red` domain (for demo)
each env will have different build command like:
- npm run build:blue
- npm run build:red
I also set up 2 branch to auto-deploy when the branch is created
- red/** for red env
- blue/** for blue env
So, when I create 2 branches called `red/abc` and `blue/abc`, I will have sub domains called `red-abc.app.red` and `blue-abc.app.blue`.
The problem is in the amplify.yml we only able to run one build command. but, there is a `branch specific build settings` to create some conditional command.
<img width="951" alt="image" src="https://user-images.githubusercontent.com/29621236/199645738-ab743540-3bee-4a6c-86cf-0895dbe3cc2f.png">
The above screenshot gave me an example of 2 fixed branches (main and dev) but my branches are dynamics so I made some command modifications like this:
```
if [ "${AWS_BRANCH}" = "red/**" ]; then npm run build:red; fi //if [ "red/abc" = "red/**" ]; then npm run build:red; fi
if [ "${AWS_BRANCH}" = "blue/**" ]; then npm run build:blue; fi //if [ "red/abc" = "blue/**" ]; then npm run build:red; fi
```
it's not working because `"red/**"` is a normal string.
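For comparison, here is a sketch using shell pattern matching, where `red/*` is treated as a glob instead of a literal string. The branch value is just an example stand-in for `${AWS_BRANCH}`, and this has not been verified against Amplify's build image:

```shell
branch="red/abc"   # stand-in for ${AWS_BRANCH}; hypothetical value

# A case statement matches patterns, so red/* covers red/abc, red/xyz, etc.
case "$branch" in
  red/*)  echo "npm run build:red" ;;
  blue/*) echo "npm run build:blue" ;;
  *)      echo "no matching build target" ;;
esac
```

(In bash specifically, `[[ "$AWS_BRANCH" == red/* ]]` would also glob-match, whereas single-bracket `[ ... = ... ]` only compares literal strings.)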
So my question is: is it possible to make a configuration like what I expected?
**What AWS Services are you utilizing?**
Amplify Hosting
**Provide additional details e.g. code snippets**
none
| 1.0 | Different build command for different branch - **Please describe which feature you have a question about?**
Configuring Build Setting
**Provide additional details**
I have 1 app with two different environments and domains. each environment has different domain. for example:
- env `blue` will run in `app.blue` domain (for dev)
- env `red` will run in `app.red` domain (for demo)
each env will have different build command like:
- npm run build:blue
- npm run build:red
I also set up 2 branch to auto-deploy when the branch is created
- red/** for red env
- blue/** for blue env
So, when I create 2 branches called `red/abc` and `blue/abc`, I will have sub domains called `red-abc.app.red` and `blue-abc.app.blue`.
The problem is in the amplify.yml we only able to run one build command. but, there is a `branch specific build settings` to create some conditional command.
<img width="951" alt="image" src="https://user-images.githubusercontent.com/29621236/199645738-ab743540-3bee-4a6c-86cf-0895dbe3cc2f.png">
The above screenshot gave me an example of 2 fixed branches (main and dev) but my branches are dynamics so I made some command modifications like this:
```
if [ "${AWS_BRANCH}" = "red/**" ]; then npm run build:red; fi //if [ "red/abc" = "red/**" ]; then npm run build:red; fi
if [ "${AWS_BRANCH}" = "blue/**" ]; then npm run build:blue; fi //if [ "red/abc" = "blue/**" ]; then npm run build:red; fi
```
it's not working because `"red/**"` is a normal string.
So my question is: is it possible to make a configuration like what I expected?
**What AWS Services are you utilizing?**
Amplify Hosting
**Provide additional details e.g. code snippets**
none
| non_priority | different build command for different branch please describe which feature you have a question about configuring build setting provide additional details i have app with two different environments and domains each environment has different domain for example env blue will run in app blue domain for dev env red will rub in app red domain for demo each env will have different build command like npm run build blue npm run build red i also set up branch to auto deploy when the branch is created red for red env blue for blue env so when i create branches called red abc and blue abc i will have sub domains called red abc app red and blue abc app blue the problem is in the amplify yml we only able to run one build command but there is a branch specific build settings to create some conditional command img width alt image src the above screenshot gave me an example of fixed branches main and dev but my branches are dynamics so i made some command modifications like this if then npm run build red fi if then npm run build red fi if then npm run build blue fi if then npm run build red fi it s not working because red is a normal string so my question is is it possible to make a configuration like what i expected what aws services are you utilizing amplify hosting provide additional details e g code snippets none | 0 |
514,007 | 14,931,396,151 | IssuesEvent | 2021-01-25 05:41:19 | kubeflow/kubeflow | https://api.github.com/repos/kubeflow/kubeflow | closed | Improve the messaging around Scheduled Workflow Cron schedule | area/front-end area/pipelines kind/feature lifecycle/stale priority/p2 | /kind feature
**Why you need this feature:**

We were having a problem where we couldn't immediately figure out why the Cron wasn't working as we expected when giving it the following syntax.
```
10 10 * * *
```
We expected it to run at 10:10am every day but instead it was running every hour.
Thankfully someone in the Kubeflow chat had the same issue and pointed us to the problem.

Seems that we are using the alternative format?
https://godoc.org/github.com/robfig/cron#hdr-Alternative_Formats
So you will want to enter
Seconds | Minute | Hour | Day | Month | Day of Week
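A tiny sketch of why the same string binds differently under the two orderings (field names here are illustrative only; the real robfig/cron parser does much more than split on whitespace):

```python
# Standard 5-field crontab order vs. a seconds-first order like the one
# robfig/cron's alternative format uses (truncated to five names for the demo).
STANDARD = ["minute", "hour", "day-of-month", "month", "day-of-week"]
SECONDS_FIRST = ["second", "minute", "hour", "day-of-month", "month"]

def interpret(expr: str, field_order: list) -> dict:
    # Pair each whitespace-separated field with its positional meaning.
    return dict(zip(field_order, expr.split()))

expr = "10 10 * * *"
# Standard reading:      minute=10, hour=10        -> fires at 10:10 every day.
# Seconds-first reading: second=10, minute=10, hour=* -> fires every hour at xx:10:10.
standard = interpret(expr, STANDARD)
seconds_first = interpret(expr, SECONDS_FIRST)
```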
**Describe the solution you'd like:**
If it was possible to improve the messaging around this?
**Anything else you would like to add:**
[Miscellaneous information that will assist in solving the issue.] | 1.0 | Improve the messaging around Scheduled Workflow Cron schedule - /kind feature
**Why you need this feature:**

We were having a problem where we couldn't immediately figure out why the Cron wasn't working as we expected when giving it the following syntax.
```
10 10 * * *
```
We expected it to run at 10:10am every day but instead it was running every hour.
Thankfully someone in the Kubeflow chat had the same issue and pointed us to the problem.

Seems that we are using the alternative format?
https://godoc.org/github.com/robfig/cron#hdr-Alternative_Formats
So you will want to enter
Seconds | Minute | Hour | Day | Month | Day of Week
**Describe the solution you'd like:**
If it was possible to improve the messaging around this?
**Anything else you would like to add:**
[Miscellaneous information that will assist in solving the issue.] | priority | improve the messaging around scheduled workflow cron schedule kind feature why you need this feature we were having a problem where we couldn t immediately figure out why the cron wasn t working as we expected when giving it the following syntax we expected it to run at every day but instead it was running every hour thankfully someone in the kubeflow chat had the same issue and pointed us to the problem seems that we are using the alternative format so you will want to enter seconds minute hour day month day of week describe the solution you d like if it was possible to improve the messaging around this anything else you would like to add | 1 |
26,328 | 12,965,372,157 | IssuesEvent | 2020-07-20 22:13:09 | JuliaLang/julia | https://api.github.com/repos/JuliaLang/julia | closed | Feature request: Multithreaded sparse matrix vector product | linear algebra performance sparse | I have a large sparse matrix which I need to multiply with dense vectors. The types of the elements in the vectors are UInt256 and UInt8 in the matrix. Currently each product takes about 45 seconds using SparseArrays.
It would be great if this could be sped up using multithreading. As far as I can see no such library exists currently.
| True | Feature request: Multithreaded sparse matrix vector product - I have a large sparse matrix which I need to multiply with dense vectors. The types of the elements in the vectors are UInt256 and UInt8 in the matrix. Currently each product takes about 45 seconds using SparseArrays.
It would be great if this could be sped up using multithreading. As far as I can see no such library exists currently.
| non_priority | feature request multithreaded sparse matrix vector product i have a large sparse matrix which i need to multiply with dense vectors the types of the elements in the vectors are and in the matrix currently each product takes about seconds using sparsearrays it would be great if this could be sped up using multithreading as far as i can see no such library exists currently | 0 |
544,578 | 15,894,731,289 | IssuesEvent | 2021-04-11 11:21:47 | marcusolsson/grafana-hourly-heatmap-panel | https://api.github.com/repos/marcusolsson/grafana-hourly-heatmap-panel | closed | Color option for null values | priority/low type/enhancement | If there is a null value in the data, the heatmap prints black color. It would be nice to have a color palette to select a color for null values. | 1.0 | Color option for null values - If there is a null value in the data, the heatmap prints black color. It would be nice to have a color palette to select a color for null values. | priority | color option for null values if there is a null value in the data the heatmap prints black color it would be nice to have a color palette to select a color for null values | 1 |
212,357 | 7,236,179,340 | IssuesEvent | 2018-02-13 05:23:42 | yanis333/SOEN341_Website | https://api.github.com/repos/yanis333/SOEN341_Website | closed | Create Jquery function to validate textboxes (if empty or not). | High value Priority3 Risk2 feature sprint 2 | [sp] = 1
This is a feature for issue #59 | 1.0 | Create Jquery function to validate textboxes (if empty or not). - [sp] = 1
This is a feature for issue #59 | priority | create jquery function to validate textboxes if empty or not this is a feature for issue | 1 |
45,582 | 18,762,337,112 | IssuesEvent | 2021-11-05 18:01:47 | elastic/kibana | https://api.github.com/repos/elastic/kibana | opened | [Actions] Implement generic support for OAuth JWT authentication flow for the alerting connectors. | enhancement Team:Alerting Services Feature:Alerting/RuleActions estimate:medium | Inspired by the research issue https://github.com/elastic/kibana/issues/79372 results where we did a couple of POCs for the comparison of the different OAuth flows which is supported by the existing connectors integrations and the most urgent OAuth implementation requirements (for ServiceNow).
Current solution is to add a reusable connectors support for OAuth JWT flow as an alternative to the OAuth Client Credentials flow. Client Credentials is already implemented for Email MS Exchange connector. The reason why we are not able to reuse Client Credentials flow for some connectors types is that ServiceNow does not support OAuth Client Credentials for inbound integration.
JSON web token (JWT), pronounced "jot", is an open standard ([RFC 7519](https://datatracker.ietf.org/doc/html/rfc7519)) that defines a compact and self-contained way for securely transmitting information between parties as a JSON object. Again, JWT is a standard, meaning that all JWTs are tokens, but not all tokens are JWTs.
Because of its relatively small size, a JWT can be sent through a URL, through a POST parameter, or inside an HTTP header, and it is transmitted quickly. A JWT contains all the required information about an entity to avoid querying a database more than once. The recipient of a JWT also does not need to call a server to validate the token.
### Security
The information contained within the JSON object can be verified and trusted because it is digitally signed. Although JWTs can also be encrypted to provide secrecy between parties, Auth0-issued JWTs are JSON Web Signatures (JWS), meaning they are signed rather than encrypted. As such, we will focus on signed tokens, which can verify the integrity of the claims contained within them, while encrypted tokens hide those claims from other parties.
JWTs will be signed using a public/private key pair using RSA. When tokens are signed using public/private key pairs, the signature also certifies that only the party holding the private key is the one that signed it. In our case this is Kibana connector.
Before a received JWT is used, it should be properly validated using its signature. Note that a successfully validated token only means that the information contained within the token has not been modified by anyone else. This doesn't mean that others weren't able to see the content, which is stored in plain text. Because of this, you should never store sensitive information inside a JWT and should take other steps to ensure that JWTs are not intercepted, such as by sending JWTs only over HTTPS, following best practices, and using only secure and up-to-date libraries.
| 1.0 | [Actions] Implement generic support for OAuth JWT authentication flow for the alerting connectors. - Inspired by the research issue https://github.com/elastic/kibana/issues/79372 results where we did a couple of POCs for the comparison of the different OAuth flows which is supported by the existing connectors integrations and the most urgent OAuth implementation requirements (for ServiceNow).
Current solution is to add a reusable connectors support for OAuth JWT flow as an alternative to the OAuth Client Credentials flow. Client Credentials is already implemented for Email MS Exchange connector. The reason why we are not able to reuse Client Credentials flow for some connectors types is that ServiceNow does not support OAuth Client Credentials for inbound integration.
JSON web token (JWT), pronounced "jot", is an open standard ([RFC 7519](https://datatracker.ietf.org/doc/html/rfc7519)) that defines a compact and self-contained way for securely transmitting information between parties as a JSON object. Again, JWT is a standard, meaning that all JWTs are tokens, but not all tokens are JWTs.
Because of its relatively small size, a JWT can be sent through a URL, through a POST parameter, or inside an HTTP header, and it is transmitted quickly. A JWT contains all the required information about an entity to avoid querying a database more than once. The recipient of a JWT also does not need to call a server to validate the token.
### Security
The information contained within the JSON object can be verified and trusted because it is digitally signed. Although JWTs can also be encrypted to provide secrecy between parties, Auth0-issued JWTs are JSON Web Signatures (JWS), meaning they are signed rather than encrypted. As such, we will focus on signed tokens, which can verify the integrity of the claims contained within them, while encrypted tokens hide those claims from other parties.
JWTs will be signed using a public/private key pair using RSA. When tokens are signed using public/private key pairs, the signature also certifies that only the party holding the private key is the one that signed it. In our case this is Kibana connector.
Before a received JWT is used, it should be properly validated using its signature. Note that a successfully validated token only means that the information contained within the token has not been modified by anyone else. This doesn't mean that others weren't able to see the content, which is stored in plain text. Because of this, you should never store sensitive information inside a JWT and should take other steps to ensure that JWTs are not intercepted, such as by sending JWTs only over HTTPS, following best practices, and using only secure and up-to-date libraries.
| non_priority | implement generic support for oauth jwt authentication flow for the alerting connectors inspired by the research issue results where we did a couple of pocs for the comparison of the different oauth flows which is supported by the existing connectors integrations and the most urgent oauth implementation requirements for servicenow current solution is to add a reusable connectors support for oauth jwt flow as an alternative to the oauth client credentials flow client credentials is already implemented for email ms exchange connector the reason why we are not able to reuse client credentials flow for some connectors types is that servicenow does not support oauth client credentials for inbound integration json web token jwt pronounced jot is an open standard that defines a compact and self contained way for securely transmitting information between parties as a json object again jwt is a standard meaning that all jwts are tokens but not all tokens are jwts because of its relatively small size a jwt can be sent through a url through a post parameter or inside an http header and it is transmitted quickly a jwt contains all the required information about an entity to avoid querying a database more than once the recipient of a jwt also does not need to call a server to validate the token security the information contained within the json object can be verified and trusted because it is digitally signed although jwts can also be encrypted to provide secrecy between parties issued jwts are json web signatures jws meaning they are signed rather than encrypted as such we will focus on signed tokens which can verify the integrity of the claims contained within them while encrypted tokens hide those claims from other parties jwts will be signed using a public private key pair using rsa when tokens are signed using public private key pairs the signature also certifies that only the party holding the private key is the one that signed it in our case this is kibana connector before a received jwt is used it should be properly validated using its signature note that a successfully validated token only means that the information contained within the token has not been modified by anyone else this doesn t mean that others weren t able to see the content which is stored in plain text because of this you should never store sensitive information inside a jwt and should take other steps to ensure that jwts are not intercepted such as by sending jwts only over https following best practices and using only secure and up to date libraries | 0 |
690,061 | 23,644,547,129 | IssuesEvent | 2022-08-25 20:32:43 | WarwickAI/wai-platform-v2 | https://api.github.com/repos/WarwickAI/wai-platform-v2 | closed | Make user role an array instead of one value | enhancement med-priority | For example, if a user is an `exec` they should also have the `member` role.
Will allow greater security, for example with the merch page, to only allow certain people to create and edit merch items. | 1.0 | Make user role an array instead of one value - For example, if a user is an `exec` they should also have the `member` role.
Will allow greater security, for example with the merch page, to only allow certain people to create and edit merch items. | priority | make user role an array instead of one value for example if a user is an exec they should also have the member role will allow greater security for example with the merch page to only allow certain people to create and edit merch items | 1 |
267,402 | 28,509,009,892 | IssuesEvent | 2023-04-19 01:27:34 | dpteam/RK3188_TABLET | https://api.github.com/repos/dpteam/RK3188_TABLET | closed | CVE-2019-15219 (Medium) detected in linuxv3.0 - autoclosed | Mend: dependency security vulnerability | ## CVE-2019-15219 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv3.0</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/verygreen/linux.git>https://github.com/verygreen/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/dpteam/RK3188_TABLET/commit/0c501f5a0fd72c7b2ac82904235363bd44fd8f9e">0c501f5a0fd72c7b2ac82904235363bd44fd8f9e</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/misc/sisusbvga/sisusb.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/misc/sisusbvga/sisusb.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/misc/sisusbvga/sisusb.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in the Linux kernel before 5.1.8. There is a NULL pointer dereference caused by a malicious USB device in the drivers/usb/misc/sisusbvga/sisusb.c driver.
<p>Publish Date: 2019-08-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-15219>CVE-2019-15219</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Physical
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15219">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15219</a></p>
<p>Release Date: 2019-08-19</p>
<p>Fix Resolution: v5.2-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-15219 (Medium) detected in linuxv3.0 - autoclosed - ## CVE-2019-15219 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv3.0</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/verygreen/linux.git>https://github.com/verygreen/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/dpteam/RK3188_TABLET/commit/0c501f5a0fd72c7b2ac82904235363bd44fd8f9e">0c501f5a0fd72c7b2ac82904235363bd44fd8f9e</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/misc/sisusbvga/sisusb.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/misc/sisusbvga/sisusb.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/usb/misc/sisusbvga/sisusb.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in the Linux kernel before 5.1.8. There is a NULL pointer dereference caused by a malicious USB device in the drivers/usb/misc/sisusbvga/sisusb.c driver.
<p>Publish Date: 2019-08-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-15219>CVE-2019-15219</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Physical
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15219">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15219</a></p>
<p>Release Date: 2019-08-19</p>
<p>Fix Resolution: v5.2-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in autoclosed cve medium severity vulnerability vulnerable library linux kernel source tree library home page a href found in head commit a href found in base branch master vulnerable source files drivers usb misc sisusbvga sisusb c drivers usb misc sisusbvga sisusb c drivers usb misc sisusbvga sisusb c vulnerability details an issue was discovered in the linux kernel before there is a null pointer dereference caused by a malicious usb device in the drivers usb misc sisusbvga sisusb c driver publish date url a href cvss score details base score metrics exploitability metrics attack vector physical attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
427,640 | 12,397,936,784 | IssuesEvent | 2020-05-21 00:13:28 | eclipse-ee4j/glassfish | https://api.github.com/repos/eclipse-ee4j/glassfish | closed | Consider better use of start levels in conjuction with autostart dir | Component: OSGi ERR: Assignee Priority: Major Stale Type: Improvement | We hardcode a number of module names in osgi.properties file as core.bundles. We should consider having different startlevels in modules/autostart and moving those bundles to that dir.
#### Affected Versions
[4.0] | 1.0 | Consider better use of start levels in conjuction with autostart dir - We hardcode a number of module names in osgi.properties file as core.bundles. We should consider having different startlevels in modules/autostart and moving those bundles to that dir.
#### Affected Versions
[4.0] | priority | consider better use of start levels in conjuction with autostart dir we hardcode a number of module names in osgi properties file as core bundles we should consider having different startlevels in modules autostart and moving those bundles to that dir affected versions | 1 |
19,384 | 13,940,317,003 | IssuesEvent | 2020-10-22 17:43:49 | ONRR/nrrd | https://api.github.com/repos/ONRR/nrrd | opened | Clarify the difference between compare and nationwide sections | Explore Data P3: Medium usability | Four participants weren't clear that there was a difference between the nationwide and compare sections on Explore Data. Maybe put a US map on the headers for the nationwide sections. | True | Clarify the difference between compare and nationwide sections - Four participants weren't clear that there was a difference between the nationwide and compare sections on Explore Data. Maybe put a US map on the headers for the nationwide sections. | non_priority | clarify the difference between compare and nationwide sections four participants weren t clear that there was a difference between the nationwide and compare sections on explore data maybe put a us map on the headers for the nationwide sections | 0 |
104,282 | 22,621,474,808 | IssuesEvent | 2022-06-30 06:50:49 | FerretDB/FerretDB | https://api.github.com/repos/FerretDB/FerretDB | closed | Investigate fuzzing failure | code/bug not ripe | https://github.com/FerretDB/FerretDB/runs/4726383278
```
fuzz: elapsed: 15s, execs: 12535 (1444/sec), new interesting: 3 (total: 3)
fuzz: elapsed: 16s, execs: 12535 (0/sec), new interesting: 3 (total: 3)
--- FAIL: FuzzArray (16.40s)
fuzzing process hung or terminated unexpectedly while minimizing: EOF
Failing input written to testdata/fuzz/FuzzArray/c55e914aa4e04a5a2e8723631fb03bb4f43286d4afe92137b17360c089b7214a
To re-run:
go test -run=FuzzArray/c55e914aa4e04a5a2e8723631fb03bb4f43286d4afe92137b17360c089b7214a
FAIL
exit status 1
FAIL github.com/FerretDB/FerretDB/internal/bson 16.429s
```
But running tests with this input passes. That looks like a `go test -fuzz` bug. | 1.0 | Investigate fuzzing failure - https://github.com/FerretDB/FerretDB/runs/4726383278
```
fuzz: elapsed: 15s, execs: 12535 (1444/sec), new interesting: 3 (total: 3)
fuzz: elapsed: 16s, execs: 12535 (0/sec), new interesting: 3 (total: 3)
--- FAIL: FuzzArray (16.40s)
fuzzing process hung or terminated unexpectedly while minimizing: EOF
Failing input written to testdata/fuzz/FuzzArray/c55e914aa4e04a5a2e8723631fb03bb4f43286d4afe92137b17360c089b7214a
To re-run:
go test -run=FuzzArray/c55e914aa4e04a5a2e8723631fb03bb4f43286d4afe92137b17360c089b7214a
FAIL
exit status 1
FAIL github.com/FerretDB/FerretDB/internal/bson 16.429s
```
But running tests with this input passes. That looks like a `go test -fuzz` bug. | non_priority | investigate fuzzing failure fuzz elapsed execs sec new interesting total fuzz elapsed execs sec new interesting total fail fuzzarray fuzzing process hung or terminated unexpectedly while minimizing eof failing input written to testdata fuzz fuzzarray to re run go test run fuzzarray fail exit status fail github com ferretdb ferretdb internal bson but running tests with this input passes that looks like a go test fuzz bug | 0 |
19,735 | 5,923,729,695 | IssuesEvent | 2017-05-23 08:42:46 | akvo/akvo-flow-mobile | https://api.github.com/repos/akvo/akvo-flow-mobile | closed | Refactor dependencies management | Code Refactoring | # Overview
Although this is not a priority right now, we should consider refactoring this component at some point, aiming at having a simpler, more maintainable codebase.
| 1.0 | Refactor dependencies management - # Overview
Although this is not a priority right now, we should consider refactoring this component at some point, aiming at having a simpler, more maintainable codebase.
| non_priority | refactor dependencies management overview although this is not a priority right now we should consider refactoring this component at some point aiming at having a simpler more maintainable codebase | 0 |
255,785 | 27,504,281,018 | IssuesEvent | 2023-03-06 01:03:45 | snykiotcubedev/reactos-0.4.13-release | https://api.github.com/repos/snykiotcubedev/reactos-0.4.13-release | opened | CVE-2022-4645 (Medium) detected in reactosReactOS-0.4.13-release-28-g5724391-src | security vulnerability | ## CVE-2022-4645 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>reactosReactOS-0.4.13-release-28-g5724391-src</b></p></summary>
<p>
<p>An operating system based on the best Windows NT design principles</p>
<p>Library home page: <a href=https://sourceforge.net/projects/reactos/>https://sourceforge.net/projects/reactos/</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/sdk/include/reactos/libs/libtiff/tif_dir.h</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/sdk/include/reactos/libs/libtiff/tif_dir.h</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
LibTIFF 4.4.0 has an out-of-bounds read in tiffcp in tools/tiffcp.c:948, allowing attackers to cause a denial-of-service via a crafted tiff file. For users that compile libtiff from sources, the fix is available with commit e8131125.
<p>Publish Date: 2023-03-03
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-4645>CVE-2022-4645</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-4645">https://www.cve.org/CVERecord?id=CVE-2022-4645</a></p>
<p>Release Date: 2023-03-03</p>
<p>Fix Resolution: v4.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-4645 (Medium) detected in reactosReactOS-0.4.13-release-28-g5724391-src - ## CVE-2022-4645 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>reactosReactOS-0.4.13-release-28-g5724391-src</b></p></summary>
<p>
<p>An operating system based on the best Windows NT design principles</p>
<p>Library home page: <a href=https://sourceforge.net/projects/reactos/>https://sourceforge.net/projects/reactos/</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/sdk/include/reactos/libs/libtiff/tif_dir.h</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/sdk/include/reactos/libs/libtiff/tif_dir.h</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
LibTIFF 4.4.0 has an out-of-bounds read in tiffcp in tools/tiffcp.c:948, allowing attackers to cause a denial-of-service via a crafted tiff file. For users that compile libtiff from sources, the fix is available with commit e8131125.
<p>Publish Date: 2023-03-03
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-4645>CVE-2022-4645</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-4645">https://www.cve.org/CVERecord?id=CVE-2022-4645</a></p>
<p>Release Date: 2023-03-03</p>
<p>Fix Resolution: v4.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in reactosreactos release src cve medium severity vulnerability vulnerable library reactosreactos release src an operating system based on the best windows nt design principles library home page a href found in base branch main vulnerable source files sdk include reactos libs libtiff tif dir h sdk include reactos libs libtiff tif dir h vulnerability details libtiff has an out of bounds read in tiffcp in tools tiffcp c allowing attackers to cause a denial of service via a crafted tiff file for users that compile libtiff from sources the fix is available with commit publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
274,649 | 20,862,380,549 | IssuesEvent | 2022-03-22 01:01:58 | benstream/CS372 | https://api.github.com/repos/benstream/CS372 | closed | Week 3 Iteration Plan (March 14th – 18th, 2022) | documentation | - [x] #12
- [x] #15
- [x] #19
- [x] #16
- [x] #14: high to low priority completion (viewer's perspective)
- [x] #20 | 1.0 | Week 3 Iteration Plan (March 14th – 18th, 2022) - - [x] #12
- [x] #15
- [x] #19
- [x] #16
- [x] #14: high to low priority completion (viewer's perspective)
- [x] #20 | non_priority | week iteration plan march – high to low priority completion viewer s perspective | 0 |
29,091 | 4,160,167,388 | IssuesEvent | 2016-06-17 12:13:01 | GSE-Project/SS2016-group1 | https://api.github.com/repos/GSE-Project/SS2016-group1 | closed | Request custom stop - Change Component Diagram | -> June 20 designIF | depends on #60
- [x] Change diagram
- [x] add traceability information | 1.0 | Request custom stop - Change Component Diagram - depends on #60
- [x] Change diagram
- [x] add traceability information | non_priority | request custom stop change component diagram depends on change diagram add traceability information | 0 |
20,725 | 11,495,998,002 | IssuesEvent | 2020-02-12 06:45:25 | Azure/azure-cli | https://api.github.com/repos/Azure/azure-cli | closed | Azure ARC Extension - No module named google.auth | AKS Service Attention |
### **This is autogenerated. Please review and update as needed.**
## Describe the bug
**Command Name**
`az connectedk8s connect`
**Errors:**
```
No module named google.auth
Traceback (most recent call last):
python2.7/site-packages/knack/cli.py, ln 206, in invoke
cmd_result = self.invocation.execute(args)
cli/core/commands/__init__.py, ln 528, in execute
self.commands_loader.load_arguments(command)
azure/cli/core/__init__.py, ln 299, in load_arguments
self.command_table[command].load_arguments() # this loads the arguments via reflection
cli/core/commands/__init__.py, ln 291, in load_arguments
super(AzCliCommand, self).load_arguments()
...
connectedk8s/kubernetes/config/__init__.py, ln 19, in <module>
from .kube_config import (list_kube_config_contexts, load_kube_config,
connectedk8s/kubernetes/config/kube_config.py, ln 28, in <module>
import google.auth
ImportError: No module named google.auth
```
## To Reproduce:
Steps to reproduce the behavior. Note that argument values have been redacted, as they may contain sensitive information.
- _Put any pre-requisite steps here..._
- `az connectedk8s connect --name {} --resource-group {}`
## Expected Behavior
## Environment Summary
```
Linux-4.15.0-58-generic-x86_64-with-Ubuntu-16.04-xenial
Python 2.7.12
Shell: bash
azure-cli 2.0.80
Extensions:
connectedk8s 0.1.0
k8sconfiguration 0.1.1
```
## Additional Context
<!--Please don't remove this:-->
<!--auto-generated-->
| 1.0 | Azure ARC Extension - No module named google.auth -
### **This is autogenerated. Please review and update as needed.**
## Describe the bug
**Command Name**
`az connectedk8s connect`
**Errors:**
```
No module named google.auth
Traceback (most recent call last):
python2.7/site-packages/knack/cli.py, ln 206, in invoke
cmd_result = self.invocation.execute(args)
cli/core/commands/__init__.py, ln 528, in execute
self.commands_loader.load_arguments(command)
azure/cli/core/__init__.py, ln 299, in load_arguments
self.command_table[command].load_arguments() # this loads the arguments via reflection
cli/core/commands/__init__.py, ln 291, in load_arguments
super(AzCliCommand, self).load_arguments()
...
connectedk8s/kubernetes/config/__init__.py, ln 19, in <module>
from .kube_config import (list_kube_config_contexts, load_kube_config,
connectedk8s/kubernetes/config/kube_config.py, ln 28, in <module>
import google.auth
ImportError: No module named google.auth
```
## To Reproduce:
Steps to reproduce the behavior. Note that argument values have been redacted, as they may contain sensitive information.
- _Put any pre-requisite steps here..._
- `az connectedk8s connect --name {} --resource-group {}`
## Expected Behavior
## Environment Summary
```
Linux-4.15.0-58-generic-x86_64-with-Ubuntu-16.04-xenial
Python 2.7.12
Shell: bash
azure-cli 2.0.80
Extensions:
connectedk8s 0.1.0
k8sconfiguration 0.1.1
```
## Additional Context
<!--Please don't remove this:-->
<!--auto-generated-->
| non_priority | azure arc extension no module named google auth this is autogenerated please review and update as needed describe the bug command name az connect errors no module named google auth traceback most recent call last site packages knack cli py ln in invoke cmd result self invocation execute args cli core commands init py ln in execute self commands loader load arguments command azure cli core init py ln in load arguments self command table load arguments this loads the arguments via reflection cli core commands init py ln in load arguments super azclicommand self load arguments kubernetes config init py ln in from kube config import list kube config contexts load kube config kubernetes config kube config py ln in import google auth importerror no module named google auth to reproduce steps to reproduce the behavior note that argument values have been redacted as they may contain sensitive information put any pre requisite steps here az connect name resource group expected behavior environment summary linux generic with ubuntu xenial python shell bash azure cli extensions additional context | 0 |
252,829 | 8,042,486,294 | IssuesEvent | 2018-07-31 08:19:05 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | www.google.com - site is not usable | browser-firefox-mobile priority-critical | <!-- @browser: Firefox Mobile 63.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Mobile; rv:63.0) Gecko/63.0 Firefox/63.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://www.google.com/search?q=galileo+telephone+portable
**Browser / Version**: Firefox Mobile 63.0
**Operating System**: Android 8.1.0
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: it says no internet connection while it works with focus
**Steps to Reproduce**:
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | www.google.com - site is not usable - <!-- @browser: Firefox Mobile 63.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Mobile; rv:63.0) Gecko/63.0 Firefox/63.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://www.google.com/search?q=galileo+telephone+portable
**Browser / Version**: Firefox Mobile 63.0
**Operating System**: Android 8.1.0
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: it says no internet connection while it works with focus
**Steps to Reproduce**:
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | site is not usable url browser version firefox mobile operating system android tested another browser yes problem type site is not usable description it says no internet connection while it works with focus steps to reproduce from with ❤️ | 1 |
168,122 | 20,742,230,691 | IssuesEvent | 2022-03-14 18:51:41 | snowflakedb/snowflake-jdbc | https://api.github.com/repos/snowflakedb/snowflake-jdbc | opened | CVE-2020-36518 (High) detected in jackson-databind-2.9.8.jar, jackson-databind-2.12.1.jar | security vulnerability | ## CVE-2020-36518 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.8.jar</b>, <b>jackson-databind-2.12.1.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-ua_20220314184624_MDZKQY/archiveExtraction_EZOLZD/JKSEHX/20220314184624/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml</p>
<p>Path to vulnerable library: /sitory/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.12.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /public_pom.xml</p>
<p>Path to vulnerable library: /sitory/com/fasterxml/jackson/core/jackson-databind/2.12.1/jackson-databind-2.12.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.12.1.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/snowflakedb/snowflake-jdbc/commit/d6393eeb576b1cd6898a6ca1874bb9157ec210db">d6393eeb576b1cd6898a6ca1874bb9157ec210db</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects.
WhiteSource Note: After conducting further research, WhiteSource has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518.
<p>Publish Date: 2022-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36518>CVE-2020-36518</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-36518">https://nvd.nist.gov/vuln/detail/CVE-2020-36518</a></p>
<p>Release Date: 2022-03-11</p>
<p>Fix Resolution: jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.8","packageFilePaths":["/tmp/ws-ua_20220314184624_MDZKQY/archiveExtraction_EZOLZD/JKSEHX/20220314184624/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.12.1","packageFilePaths":["/public_pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.12.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36518","vulnerabilityDetails":"jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects.\n WhiteSource Note: After conducting further research, WhiteSource has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36518","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-36518 (High) detected in jackson-databind-2.9.8.jar, jackson-databind-2.12.1.jar - ## CVE-2020-36518 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.8.jar</b>, <b>jackson-databind-2.12.1.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-ua_20220314184624_MDZKQY/archiveExtraction_EZOLZD/JKSEHX/20220314184624/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml</p>
<p>Path to vulnerable library: /sitory/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.12.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /public_pom.xml</p>
<p>Path to vulnerable library: /sitory/com/fasterxml/jackson/core/jackson-databind/2.12.1/jackson-databind-2.12.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.12.1.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/snowflakedb/snowflake-jdbc/commit/d6393eeb576b1cd6898a6ca1874bb9157ec210db">d6393eeb576b1cd6898a6ca1874bb9157ec210db</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects.
WhiteSource Note: After conducting further research, WhiteSource has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518.
<p>Publish Date: 2022-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36518>CVE-2020-36518</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-36518">https://nvd.nist.gov/vuln/detail/CVE-2020-36518</a></p>
<p>Release Date: 2022-03-11</p>
<p>Fix Resolution: jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.8","packageFilePaths":["/tmp/ws-ua_20220314184624_MDZKQY/archiveExtraction_EZOLZD/JKSEHX/20220314184624/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.12.1","packageFilePaths":["/public_pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.12.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36518","vulnerabilityDetails":"jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects.\n WhiteSource Note: After conducting further research, WhiteSource has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36518","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in jackson databind jar jackson databind jar cve high severity vulnerability vulnerable libraries jackson databind jar 
jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws ua mdzkqy archiveextraction ezolzd jksehx snowflake jdbc depth dependencies arrow vector meta inf maven org apache arrow arrow vector pom xml path to vulnerable library sitory com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file public pom xml path to vulnerable library sitory com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details jackson databind before allows a java stackoverflow exception and denial of service via a large depth of nested objects whitesource note after conducting further research whitesource has determined that all versions of com fasterxml jackson core jackson databind up to version are vulnerable to cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jackson databind com fasterxml jackson core jackson databind isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion jackson databind com fasterxml jackson core jackson databind isbinary false packagetype java groupid com fasterxml jackson core packagename 
jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion jackson databind com fasterxml jackson core jackson databind isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails jackson databind before allows a java stackoverflow exception and denial of service via a large depth of nested objects n whitesource note after conducting further research whitesource has determined that all versions of com fasterxml jackson core jackson databind up to version are vulnerable to cve vulnerabilityurl | 0 |
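The CVE record above describes jackson-databind's recursive deserializer overflowing the stack on deeply nested input. jackson is Java, but the failure mode generalizes to any recursive parser; a hedged Python sketch of the attack shape and a defensive wrapper (this illustrates the depth-bomb pattern only, it is not jackson's actual patch):

```python
import json

def try_parse(payload, max_len=10_000_000):
    """Parse untrusted JSON, rejecting pathological inputs instead of crashing."""
    if len(payload) > max_len:
        return None
    try:
        return json.loads(payload)
    except (RecursionError, ValueError):
        # RecursionError: nesting deeper than the interpreter allows
        # ValueError: malformed JSON (json.JSONDecodeError subclasses it)
        return None

# The attack shape behind CVE-2020-36518: a tiny payload with enormous nesting depth.
bomb = "[" * 50_000 + "]" * 50_000
```

Here `try_parse(bomb)` returns `None` instead of taking the process down, which is the behavior the fixed jackson versions provide by bounding nesting depth.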
92,501 | 3,871,643,981 | IssuesEvent | 2016-04-11 10:40:52 | osm2vectortiles/osm2vectortiles | https://api.github.com/repos/osm2vectortiles/osm2vectortiles | opened | Implement network and shield field | low-priority | - [ ] Implement field shield of layer road_label
- [ ] Implement field network of layer rail_station_label
- The OSM key [network](http://wiki.openstreetmap.org/wiki/Key:network) can be used to determine in which country/region (network) the road or railway is. | 1.0 | Implement network and shield field - - [ ] Implement field shield of layer road_label
- [ ] Implement field network of layer rail_station_label
- The OSM key [network](http://wiki.openstreetmap.org/wiki/Key:network) can be used to determine in which country/region (network) the road or railway is. | priority | implement network and shield field implement field shield of layer road label implement field network of layer rail station label the osm key can be used to determine in which country region network the road or railway is | 1 |
712,053 | 24,483,321,679 | IssuesEvent | 2022-10-09 05:07:18 | hengband/hengband | https://api.github.com/repos/hengband/hengband | closed | Move object-kind.cpp/h from object/ to system/ and rename it | refactor Priority:MIDDLE | Related ticket: #2628
Things were so tangled at the time of the refactoring that it went unnoticed, but this file represents the contents of k\_info.txt (and its various mutable fields),
so the folder named in the title is the more appropriate home for it.
Something like item-template-type-definition.cpp/h would be a good new name.
Note to self: the quark index has been changed to std::string, so the comments need to be updated as well. | 1.0 | Move object-kind.cpp/h from object/ to system/ and rename it - Related ticket: #2628
Things were so tangled at the time of the refactoring that it went unnoticed, but this file represents the contents of k\_info.txt (and its various mutable fields),
so the folder named in the title is the more appropriate home for it.
Something like item-template-type-definition.cpp/h would be a good new name.
Note to self: the quark index has been changed to std::string, so the comments need to be updated as well. | priority | move object kind cpp h from object to system and rename it related ticket things were so tangled at the time of the refactoring that it went unnoticed but this file represents the contents of k info txt and its various mutable fields so the folder named in the title is the more appropriate home for it something like item template type definition cpp h would be a good new name note to self the quark index has been changed to std string so the comments need to be updated as well | 1 |
156,396 | 12,309,069,281 | IssuesEvent | 2020-05-12 08:21:24 | dso-toolkit/dso-toolkit | https://api.github.com/repos/dso-toolkit/dso-toolkit | closed | Line-height of headings only works within class="dso-rich-content" | status:testable type:bug | With the toolkit upgrade to 10.3.0 I had created a task to check the line-height of the headings. It now turns out that it is only applied within class="dso-rich-content". As far as I know, the intention was to apply it everywhere.
In the user applications (of team PR12) class="dso-rich-content" is not used anywhere. If it is indeed required, clear instructions for it will have to be drawn up. | 1.0 | Line-height of headings only works within class="dso-rich-content" - With the toolkit upgrade to 10.3.0 I had created a task to check the line-height of the headings. It now turns out that it is only applied within class="dso-rich-content". As far as I know, the intention was to apply it everywhere.
In the user applications (of team PR12) class="dso-rich-content" is not used anywhere. If it is indeed required, clear instructions for it will have to be drawn up. | non_priority | line height of headings only works within class dso rich content with the toolkit upgrade to i had created a task to check the line height of the headings it now turns out that it is only applied within class dso rich content as far as i know the intention was to apply it everywhere in the user applications of team class dso rich content is not used anywhere if it is indeed required clear instructions for it will have to be drawn up | 0 |
289,098 | 8,854,938,884 | IssuesEvent | 2019-01-09 03:45:51 | visit-dav/issues-test | https://api.github.com/repos/visit-dav/issues-test | closed | viewercore introduces _ser library dependencies into simV2runtime_par. | bug likelihood medium priority reviewed severity medium | The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries. The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds. As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 2017
Status: Resolved
Project: VisIt
Tracker: Bug
Priority: High
Subject: viewercore introduces _ser library dependencies into simV2runtime_par.
Assigned to: Brad Whitlock
Category: -
Target version: 2.11.0
Author: Brad Whitlock
Start: 10/09/2014
Due date:
% Done: 100%
Estimated time:
Created: 10/09/2014 03:40 pm
Updated: 06/28/2016 03:37 pm
Likelihood: 3 - Occasional
Severity: 3 - Major Irritation
Found in version: 2.7.3
Impact:
Expected Use:
OS: All
Support Group: Any
Description:
The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries. The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds. As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
Comments:
This needs to be fixed to test simv2 in parallel some more and to test it with static builds. I'm working on this right now. I moved a few things from viewercore into libviewer and changed the build to produce viewercore_ser, viewercore_par libraries with appropriate dependencies. This prevents _ser libraries from linking into the SimV2 parallel runtime library. For static builds, the SimV2 reader is built into the SimV2 runtime library. I turned off the build of some rpc and proxy libraries for engine or server only builds.
| 1.0 | viewercore introduces _ser library dependencies into simV2runtime_par. - The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries. The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds. As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 2017
Status: Resolved
Project: VisIt
Tracker: Bug
Priority: High
Subject: viewercore introduces _ser library dependencies into simV2runtime_par.
Assigned to: Brad Whitlock
Category: -
Target version: 2.11.0
Author: Brad Whitlock
Start: 10/09/2014
Due date:
% Done: 100%
Estimated time:
Created: 10/09/2014 03:40 pm
Updated: 06/28/2016 03:37 pm
Likelihood: 3 - Occasional
Severity: 3 - Major Irritation
Found in version: 2.7.3
Impact:
Expected Use:
OS: All
Support Group: Any
Description:
The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries. The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds. As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
Comments:
This needs to be fixed to test simv2 in parallel some more and to test it with static builds. I'm working on this right now. I moved a few things from viewercore into libviewer and changed the build to produce viewercore_ser, viewercore_par libraries with appropriate dependencies. This prevents _ser libraries from linking into the SimV2 parallel runtime library. For static builds, the SimV2 reader is built into the SimV2 runtime library. I turned off the build of some rpc and proxy libraries for engine or server only builds.
| priority | viewercore introduces ser library dependencies into par the viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim since viewercore relies on some avt stuff and was destined for the viewer it ends up linking with some ser avt libraries when viewercore is linked into the parallel par library it brings along its ser dependencies even though we really want only the par versions of the avt libraries the best thing might be to build the viewercore sources directly into the libraries instead of linking with viewercore this would permit us to use the ser and par libraries that we want explicitly we could also omit files like viewerfileserver and viewerenginemanager which are not needed by the and have some ser dependencies of their own due to libraries like engineproxy furthermore by not using some of those classes we could revert to not building libraries like mdserverrpc mdserverproxy launcherrpc launcherproxy when we re doing engine only builds as things stand on the trunk engine only builds are broken because the runtimes can t be linked with viewercore which is not built redmine migration this ticket was migrated from redmine as such not all information was able to be captured in the transition below is a complete record of the original redmine ticket ticket number status resolved project visit tracker bug priority high subject viewercore introduces ser library dependencies into par assigned to brad whitlock category target version author brad whitlock start due date done estimated time created pm updated pm likelihood occasional severity major irritation found in version impact expected use os all support group any description the viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim since viewercore relies on some avt stuff and was destined for the viewer it ends up linking with some ser avt libraries when viewercore is linked into the 
parallel par library it brings along its ser dependencies even though we really want only the par versions of the avt libraries the best thing might be to build the viewercore sources directly into the libraries instead of linking with viewercore this would permit us to use the ser and par libraries that we want explicitly we could also omit files like viewerfileserver and viewerenginemanager which are not needed by the and have some ser dependencies of their own due to libraries like engineproxy furthermore by not using some of those classes we could revert to not building libraries like mdserverrpc mdserverproxy launcherrpc launcherproxy when we re doing engine only builds as things stand on the trunk engine only builds are broken because the runtimes can t be linked with viewercore which is not built comments this needs to be fixed to test in parallel some more and to test it with static builds i m working on this right now i moved a few things from viewercore into libviewer and changed the build to produce viewercore ser viewercore par libraries with appropriate dependencies this prevents ser libraries from linking into the parallel runtime library for static builds the reader is built into the runtime library i turned off the build of some rpc and proxy libraries for engine or server only builds | 1 |
358,422 | 25,191,246,299 | IssuesEvent | 2022-11-12 01:30:45 | MitchTurner/naumachia | https://api.github.com/repos/MitchTurner/naumachia | closed | Architectural diagram of Naumachia | documentation | Let's help people understand the structure and why things work the way they do. It might still change a lot, but at least we can document those changes and stuff if we have a baseline. | 1.0 | Architectural diagram of Naumachia - Let's help people understand the structure and why things work the way they do. It might still change a lot, but at least we can document those changes and stuff if we have a baseline. | non_priority | architectural diagram of naumachia let s help people understand the structure and why things work the way they do it might still change a lot but at least we can document those changes and stuff if we have a baseline | 0 |
265,395 | 8,353,777,639 | IssuesEvent | 2018-10-02 11:13:26 | handsontable/handsontable-pro | https://api.github.com/repos/handsontable/handsontable-pro | closed | [Multi-column sorting] Click on the top-left overlay will throw an exception | Priority: high Regression Status: Released Type: Bug | ### Description
As the title says, when the `multiColumnSorting` plugin is enabled.

### Your environment
* Handsontable version: 6.0.0
| 1.0 | [Multi-column sorting] Click on the top-left overlay will throw an exception - ### Description
As the title says, when the `multiColumnSorting` plugin is enabled.

### Your environment
* Handsontable version: 6.0.0
| priority | click on the top left overlay will throw an exception description as the title says when the multicolumnsorting plugin is enabled your environment handsontable version | 1 |
52,674 | 3,026,141,409 | IssuesEvent | 2015-08-03 13:33:34 | twosigma/beaker-notebook | https://api.github.com/repos/twosigma/beaker-notebook | closed | Add automated memory tests | enhancement Priority 1 | ##### Two types of interactions to test at first:
* Adding and removing plots
* Adding and removing graphs | 1.0 | Add automated memory tests - ##### Two types of interactions to test at first:
* Adding and removing plots
* Adding and removing graphs | priority | add automated memory tests two types of interactions to test at first adding and removing plots adding and removing graphs | 1 |
362,417 | 25,374,377,936 | IssuesEvent | 2022-11-21 13:03:23 | rabbitmq/khepri | https://api.github.com/repos/rabbitmq/khepri | closed | Make it easy to find the import/export guide in the documentation | documentation | Currently there is no obvious link in the main documentation menu and no paragraph in the Overview page. Therefore it is difficult to know that everything is explained in the `khepri_import_export` module.
This was [reported by *Ipil* on the Erlang forums](https://erlangforums.com/t/khepri-a-tree-like-replicated-on-disk-database-library-for-erlang-and-elixir-introduction-feedbacks/438/63). | 1.0 | Make it easy to find the import/export guide in the documentation - Currently there is no obvious link in the main documentation menu and no paragraph in the Overview page. Therefore it is difficult to know that everything is explained in the `khepri_import_export` module.
This was [reported by *Ipil* on the Erlang forums](https://erlangforums.com/t/khepri-a-tree-like-replicated-on-disk-database-library-for-erlang-and-elixir-introduction-feedbacks/438/63). | non_priority | make it easy to find the import export guide in the documentation currently there is no obvious link in the main documentation menu and no paragraph in the overview page therefore it is difficult to know that everything is explained in the khepri import export module this was | 0 |
783,171 | 27,521,009,707 | IssuesEvent | 2023-03-06 15:02:37 | fractal-analytics-platform/fractal-server | https://api.github.com/repos/fractal-analytics-platform/fractal-server | closed | Update common after new validators | High Priority | See https://github.com/fractal-analytics-platform/fractal-common/issues/18 (and partly #525).
The idea here is that fractal-common should be the only source for the Pydantic schemas that we use to encode endpoint requests and responses. Note that using Pydantic validators (instead of more advanced typing, e.g. using `min_length=1` for strings or [Constrained Types](https://docs.pydantic.dev/usage/types/#constrained-types)) should allow us to provide more detailed (and customizable) error messages.
The goal of this issue (together with https://github.com/fractal-analytics-platform/fractal-common/issues/18) is also to reduce as much as possible multiple definitions of validators, which would otherwise also appear in the web client - see e.g. https://github.com/fractal-analytics-platform/fractal-web/issues/13. | 1.0 | Update common after new validators - See https://github.com/fractal-analytics-platform/fractal-common/issues/18 (and partly #525).
The idea here is that fractal-common should be the only source for the Pydantic schemas that we use to encode endpoint requests and responses. Note that using Pydantic validators (instead of more advanced typing, e.g. using `min_length=1` for strings or [Constrained Types](https://docs.pydantic.dev/usage/types/#constrained-types)) should allow us to provide more detailed (and customizable) error messages.
The goal of this issue (together with https://github.com/fractal-analytics-platform/fractal-common/issues/18) is also to reduce as much as possible multiple definitions of validators, which would otherwise also appear in the web client - see e.g. https://github.com/fractal-analytics-platform/fractal-web/issues/13. | priority | update common after new validators see and partly the idea here is that fractal common should be the only source for the pydantic schemas that we use to encode endpoint requests and responses note that using pydantic validators instead of more advanced typing e g using min length for strings or should allow us to provide more detailed and customizable error messages the goal of this issue together with is also to reduce as much as possible multiple definitions of validators which would otherwise also appear in the web client see e g | 1 |
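As a stdlib-only illustration of the point above — why a validator function can beat a bare type constraint like `min_length=1` for error reporting — the validator gets to phrase the failure in domain terms. The schema name and field below are hypothetical; fractal-common's real schemas are Pydantic models:

```python
from dataclasses import dataclass


@dataclass
class ProjectCreate:
    # Hypothetical request schema; the real ones live in fractal-common.
    name: str

    def __post_init__(self):
        # A validator can emit a detailed, customizable message, where a plain
        # `min_length=1` constraint would only report a generic length failure.
        if not isinstance(self.name, str) or not self.name.strip():
            raise ValueError(
                f"Project 'name' must be a non-empty string, got {self.name!r}"
            )
```

Keeping such checks in one shared package is what avoids the duplicated-validator problem mentioned for the web client.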
22,300 | 30,854,055,368 | IssuesEvent | 2023-08-02 19:03:30 | cohenlabUNC/clpipe | https://api.github.com/repos/cohenlabUNC/clpipe | closed | Add 3dTProject Implementation as Alternate Filtering Step | postprocess2 1.8.1 Req large | For now we don't have a scrub file available, so you'll need to alter the filtering workflow to be capable of running without a scrub file.
After this is done and 3dTproject is registered as an available implementation of filtering, we'll need a test to make sure that the filtering works without scrubbing.
Once scrubbing is available, we'll also need to test that the step works with scrubbing and uses the correct scrub values from that step.
Finally, we will have to make sure that if the user decides to scrub AND TF, that the implementation is forced to the 3dTproject variant (and the user is warned of this) | 1.0 | Add 3dTProject Implementation as Alternate Filtering Step - For now we don't have a scrub file available, so you'll need to alter the filtering workflow to be capable of running without a scrub file.
After this is done and 3dTproject is registered as an available implementation of filtering, we'll need a test to make sure that the filtering works without scrubbing.
Once scrubbing is available, we'll also need to test that the step works with scrubbing and uses the correct scrub values from that step.
Finally, we will have to make sure that if the user decides to scrub AND TF, that the implementation is forced to the 3dTproject variant (and the user is warned of this) | non_priority | add implementation as alternate filtering step for now we don t have a scrub file available so you ll need to alter the filtering workflow to be capable of running without a scrub file after this is done and is registered as an available implementation of filtering we ll need a test to make sure that the filtering works without scrubbing once scrubbing is available we ll also need to test that the step works with scrubbing and using the correct scrub values from that step finally we will have to make sure that if the user decides to scrub and tf that the implementation is forced to the variant and the user is warned of this | 0 |
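The last rule in the record above — force the 3dTproject implementation whenever scrubbing and temporal filtering are both requested, and warn the user — can be sketched as a small selection function. The implementation names and flags here are hypothetical, not clpipe's actual configuration keys:

```python
import warnings


def choose_filter_implementation(requested: str, scrubbing: bool,
                                 temporal_filtering: bool) -> str:
    """Pick the postprocessing filter implementation for a configuration.

    Scrubbing combined with temporal filtering must happen in a single
    projection, so 3dTproject is forced (with a warning) in that case.
    """
    if scrubbing and temporal_filtering and requested != "3dTproject":
        warnings.warn(
            "Scrubbing combined with temporal filtering requires AFNI "
            f"3dTproject; overriding the configured implementation {requested!r}."
        )
        return "3dTproject"
    return requested
```

A test suite for the workflow would assert exactly this override, alongside the no-scrub-file and with-scrub-file runs described above.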
171,810 | 20,998,532,108 | IssuesEvent | 2022-03-29 15:20:38 | gdcorp-action-public-forks/maven-settings-xml-action | https://api.github.com/repos/gdcorp-action-public-forks/maven-settings-xml-action | closed | CVE-2021-3807 (High) detected in ansi-regex-5.0.0.tgz, ansi-regex-4.1.0.tgz - autoclosed | security vulnerability | ## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-all/node_modules/ansi-regex/package.json,/node_modules/inquirer/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- mocha-eslint-6.0.0.tgz (Root Library)
- eslint-6.8.0.tgz
- inquirer-7.2.0.tgz
- strip-ansi-6.0.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mocha-eslint/node_modules/ansi-regex/package.json,/node_modules/table/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- mocha-eslint-6.0.0.tgz (Root Library)
- eslint-6.8.0.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (mocha-eslint): 7.0.0</p><p>Fix Resolution (ansi-regex): 4.1.1</p>
<p>Direct dependency fix Resolution (mocha-eslint): 7.0.0</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"mocha-eslint","packageVersion":"6.0.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"mocha-eslint:6.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"7.0.0","isBinary":false},{"packageType":"javascript/Node.js","packageName":"mocha-eslint","packageVersion":"6.0.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"mocha-eslint:6.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"7.0.0","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-3807","vulnerabilityDetails":"ansi-regex is vulnerable to Inefficient Regular Expression Complexity","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2021-3807 (High) detected in ansi-regex-5.0.0.tgz, ansi-regex-4.1.0.tgz - autoclosed - ## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-all/node_modules/ansi-regex/package.json,/node_modules/inquirer/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- mocha-eslint-6.0.0.tgz (Root Library)
- eslint-6.8.0.tgz
- inquirer-7.2.0.tgz
- strip-ansi-6.0.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mocha-eslint/node_modules/ansi-regex/package.json,/node_modules/table/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- mocha-eslint-6.0.0.tgz (Root Library)
- eslint-6.8.0.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (mocha-eslint): 7.0.0</p><p>Fix Resolution (ansi-regex): 4.1.1</p>
<p>Direct dependency fix Resolution (mocha-eslint): 7.0.0</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"mocha-eslint","packageVersion":"6.0.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"mocha-eslint:6.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"7.0.0","isBinary":false},{"packageType":"javascript/Node.js","packageName":"mocha-eslint","packageVersion":"6.0.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"mocha-eslint:6.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"7.0.0","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-3807","vulnerabilityDetails":"ansi-regex is vulnerable to Inefficient Regular Expression Complexity","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in ansi regex tgz ansi regex tgz autoclosed cve high severity vulnerability vulnerable libraries ansi regex tgz ansi regex tgz ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules glob all node modules ansi regex package json node modules inquirer node modules ansi regex package json dependency hierarchy mocha eslint tgz root library eslint tgz inquirer tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules mocha eslint node modules ansi regex package json node modules table node modules ansi regex package json dependency hierarchy mocha eslint tgz root library eslint tgz strip 
ansi tgz x ansi regex tgz vulnerable library found in base branch main vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex direct dependency fix resolution mocha eslint fix resolution ansi regex direct dependency fix resolution mocha eslint rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree mocha eslint isminimumfixversionavailable true minimumfixversion isbinary false packagetype javascript node js packagename mocha eslint packageversion packagefilepaths istransitivedependency false dependencytree mocha eslint isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails ansi regex is vulnerable to inefficient regular expression complexity vulnerabilityurl | 0 |
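For context on what `ansi-regex` does — it matches ANSI escape codes so tools like `strip-ansi` can remove terminal styling. A rough Python sketch of the same idea; this pattern is illustrative only and is not the library's actual regex (whose quantifier structure was the source of the reported ReDoS):

```python
import re

# Matches CSI-style escape sequences such as "\x1b[31m" (set red) or "\x1b[0m" (reset).
ANSI_CSI = re.compile(r"\x1b\[[0-9;]*[A-Za-z]")


def strip_ansi(text: str) -> str:
    """Remove CSI escape sequences, leaving only the printable text."""
    return ANSI_CSI.sub("", text)
```

The bounded character class and single trailing letter here avoid the nested/ambiguous repetition that makes a pattern vulnerable to catastrophic backtracking on crafted input.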
70,843 | 8,583,890,906 | IssuesEvent | 2018-11-13 21:04:46 | kids-first/kf-ui-release-coordinator | https://api.github.com/repos/kids-first/kf-ui-release-coordinator | closed | Planner Page | Design | page layout shouldn't be wrapped in a card
- move the `this is a major release` to the top, as that's an important item that needs to be made more prevalent; it could be a toggle
Studies section: study id tags could reflect the status of that study
Services to be run for this release could follow the same recommendations as #74
`Start Release` button could be aligned right. Right now it competes for attention with the Status indicator, which is a separate thing from the actions that are available to take on the new release that was just created | 1.0 | Planner Page - page layout shouldn't be wrapped in a card
- move the `this is a major release` to the top, as that's an important item that needs to be made more prevalent; it could be a toggle
Studies section: study id tags could reflect the status of that study
Services to be run for this release could follow the same recommendations as #74
`Start Release` button could be aligned right. Right now it competes for attention with the Status indicator, which is a separate thing from the actions that are available to take on the new release that was just created | non_priority | planner page page layout shouldn t be wrapped in a card move the this is a major release to the top as that s an important item that needs to be made more prevalent could be a toggle studies section study id tags could reflect the status of that study services to be run for this release could follow the same recommendations as start release button could be aligned right right now it competes for attention with the status indicator which is a separate thing from the actions that are available to take on the new release that was just created | 0 |
684,837 | 23,434,786,871 | IssuesEvent | 2022-08-15 08:35:28 | mayuNishikawa/CALL_ME | https://api.github.com/repos/mayuNishikawa/CALL_ME | closed | Step3 Feature to create a team | enhancement higi priority | - [x] Create the teams table and the assigns table
- [x] Add validation, associations, and methods to the team model.
- [x] Add associations to the assigns model
- [x] Create the team creation, detail, and edit screens
- [x] Create the team deletion feature | 1.0 | Step3 Feature to create a team - - [x] Create the teams table and the assigns table
- [x] Add validation, associations, and methods to the team model.
- [x] Add associations to the assigns model
- [x] Create the team creation, detail, and edit screens
- [x] Create the team deletion feature | priority | feature to create a team create the teams table and the assigns table add validation associations and methods to the team model add associations to the assigns model create the team creation detail and edit screens create the team deletion feature | 1 |
757,133 | 26,497,698,750 | IssuesEvent | 2023-01-18 07:42:51 | RobotLocomotion/drake | https://api.github.com/repos/RobotLocomotion/drake | closed | Using Eigen::Array<Polynomial, Dynamic, Dynamic> for PolynomialMatrix in PiecewisePolynomial | priority: backlog component: mathematical program | Currently we use `Eigen::Matrix<Polynomial, Eigen::Dynamic, Eigen::Dynamic>` as the type of `PolynomialMatrix` inside `PiecewisePolynomial` class. But this can cause confusion since we are doing element-wise product, when multiplying two piecewise polynomials, as we have seen in #4854
I think the solution is to use `Eigen::Array` instead of `Eigen::Matrix`, as
```C++
typedef Eigen::Array<PolynomialType, Eigen::Dynamic, Eigen::Dynamic> PolynomialMatrix;
```
`Eigen::Array` supports element-wise product and addition.
@siyuanfeng-tri what do you think? | 1.0 | Using Eigen::Array<Polynomial, Dynamic, Dynamic> for PolynomialMatrix in PiecewisePolynomial - Currently we use `Eigen::Matrix<Polynomial, Eigen::Dynamic, Eigen::Dynamic>` as the type of `PolynomialMatrix` inside `PiecewisePolynomial` class. But this can cause confusion since we are doing element-wise product, when multiplying two piecewise polynomials, as we have seen in #4854
I think the solution is to use `Eigen::Array` instead of `Eigen::Matrix`, as
```C++
typedef Eigen::Array<PolynomialType, Eigen::Dynamic, Eigen::Dynamic> PolynomialMatrix;
```
`Eigen::Array` supports element-wise product and addition.
@siyuanfeng-tri what do you think? | priority | using eigen array for polynomialmatrix in piecewisepolynomial currently we use eigen matrix as the type of polynomialmatrix inside piecewisepolynomial class but this can cause confusion since we are doing element wise product when multiplying two piecewise polynomials as we have seen in i think the solution is to use eigen array instead of eigen matrix as c typedef eigen array polynomialmatrix eigen array supports element wise product and addition siyuanfeng tri what do you think | 1 |
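The distinction being proposed — `Eigen::Matrix` multiplication is a true matrix product, while `Eigen::Array` multiplication is element-wise — is the Hadamard-vs-matrix-product distinction, shown here in plain Python purely for illustration:

```python
def hadamard(A, B):
    """Element-wise (Eigen::Array-style) product of two equally sized 2-D lists."""
    return [[x * y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]


def matmul(A, B):
    """True matrix (Eigen::Matrix-style) product."""
    return [
        [sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
        for row in A
    ]


A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
```

Switching `PolynomialMatrix` to the array type makes `operator*` mean the first of these, which matches the element-wise behavior the class already implements.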
438,923 | 12,663,474,754 | IssuesEvent | 2020-06-18 01:29:11 | vmware/clarity | https://api.github.com/repos/vmware/clarity | closed | Datagrid memory leak and DOM elements leak. | component: datagrid flag: has workaround priority: 1 low status: needs investigation type: bug | ```
[x] bug
[ ] feature request
[ ] enhancement
```
### Expected behavior
Datagrid not to leak memory and DOM elements.
### Actual behavior
After constantly updating the table using a socket, memory leaks and element leaks are created.
### Reproduction of behavior
- https://stackblitz.com/edit/angular-4-5-clarity-datagrid-memoy-leak
- Turn on the performance tab in chrome for one minute and you'll be surprised by the error
- https://monosnap.com/file/nz5FJKsxRrx7f5SVdH5xr2UaY6PCD8
### Environment details
* **Angular version:** 4.4.5
* **Clarity version:** 0.10.23
* **OS and version:**
* **Browser:** [ Chrome 63.0.3239.132 ]
| 1.0 | Datagrid memory leak and DOM elements leak. - ```
[x] bug
[ ] feature request
[ ] enhancement
```
### Expected behavior
Datagrid not to leak memory and DOM elements.
### Actual behavior
After constantly updating the table using a socket, memory leaks and element leaks are created.
### Reproduction of behavior
- https://stackblitz.com/edit/angular-4-5-clarity-datagrid-memoy-leak
- Turn on the performance tab in chrome for one minute and you'll be surprised by the error
- https://monosnap.com/file/nz5FJKsxRrx7f5SVdH5xr2UaY6PCD8
### Environment details
* **Angular version:** 4.4.5
* **Clarity version:** 0.10.23
* **OS and version:**
* **Browser:** [ Chrome 63.0.3239.132 ]
| priority | datagrid memory leak and dom elements leak bug feature request enhancement expected behavior datagrid not to leak memory and dom elements actual behavior after constantly updating the table using a socket memory leaks and element leaks are created reproduction of behavior turn on the performance tab in chrome for one minute and you ll be surprised by the error environment details angular version clarity version os and version browser | 1 |
461,312 | 13,228,302,908 | IssuesEvent | 2020-08-18 05:50:43 | hazelcast/hazelcast | https://api.github.com/repos/hazelcast/hazelcast | closed | SQL math expressions | Estimation: S Internal breaking change Module: SQL Priority: High Source: Internal Team: Core Type: Enhancement | We need to implement base math expressions:
1. `COS`/`SIN`/`TAN`/`COT`/`ACOS`/`ASIN`/`ATAN`
1. `EXP`/`LN`/`LOG10`
1. `RAND`
1. `ABS`
1. `SIGN`
1. `DEGREES`/`RADIANS`
1. `ROUND`/`TRUNCATE`
1. `CEIL`/`FLOOR` | 1.0 | SQL math expressions - We need to implement base math expressions:
1. `COS`/`SIN`/`TAN`/`COT`/`ACOS`/`ASIN`/`ATAN`
1. `EXP`/`LN`/`LOG10`
1. `RAND`
1. `ABS`
1. `SIGN`
1. `DEGREES`/`RADIANS`
1. `ROUND`/`TRUNCATE`
1. `CEIL`/`FLOOR` | priority | sql math expressions we need to implement base math expressions cos sin tan cot acos asin atan exp ln rand abs sign degrees radians round truncate ceil floor | 1 |
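Hazelcast's implementation is in Java, but the intended semantics of most of the listed functions map directly onto a standard math library. A reference mapping, sketched in Python only for illustration:

```python
import math

# Illustrative mapping of the listed SQL math functions to math-library equivalents.
SQL_MATH = {
    "COS": math.cos, "SIN": math.sin, "TAN": math.tan,
    "COT": lambda x: 1.0 / math.tan(x),
    "ACOS": math.acos, "ASIN": math.asin, "ATAN": math.atan,
    "EXP": math.exp, "LN": math.log, "LOG10": math.log10,
    "ABS": abs,
    "SIGN": lambda x: (x > 0) - (x < 0),
    "DEGREES": math.degrees, "RADIANS": math.radians,
    "FLOOR": math.floor, "CEIL": math.ceil,
}
# RAND, ROUND, and TRUNCATE are omitted: RAND is stateful, and SQL ROUND/TRUNCATE
# take an optional scale argument, so they do not map one-to-one onto single calls.
```

Note the naming pitfall the list implies: SQL `LN` is the natural log (`math.log`), while `LOG10` is base 10.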
370,447 | 10,932,421,002 | IssuesEvent | 2019-11-23 17:35:56 | collinbarrett/FilterLists | https://api.github.com/repos/collinbarrett/FilterLists | closed | The changes from #1124 don't seem to have gone live yet | bug high priority ops | This is most visible in how the ABP versions of *1hosts* aren't shown on the landing page, neither are any of the other new lists.
Is there any particular reason for it? | 1.0 | The changes from #1124 don't seem to have gone live yet - This is most visible in how the ABP versions of *1hosts* aren't shown on the landing page, neither are any of the other new lists.
Is there any particular reason for it? | priority | the changes from don t seem to have gone live yet this is most visible in how the abp versions of aren t shown on the landing page neither are any of the other new lists is there any particular reason for it | 1 |
476,734 | 13,749,245,407 | IssuesEvent | 2020-10-06 10:11:57 | brave/brave-browser | https://api.github.com/repos/brave/brave-browser | closed | Missing translations for Fingerprint settings | OS/Android QA/Yes l10n priority/P1 release-notes/exclude release/blocking | <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description <!-- Provide a brief description of the issue -->
Missing translations for Fingerprint settings
## Steps to reproduce <!-- Please add a series of steps to reproduce the issue -->
1. Install 1.15.69 build
2. Set locale to non-EN
3. Visit a page and open fingerprint settings in shields has English text instead of translated one
## Actual result <!-- Please add screenshots if needed -->

## Expected result
Translated text
## Issue reproduces how often <!-- [Easily reproduced/Intermittent issue/No steps to reproduce] -->
Easy
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current Play Store version? No
- Can you reproduce this issue with the current Play Store Beta version? Yes
- Can you reproduce this issue with the current Play Store Nightly version? Yes
## Device details
- Install type (ARM, x86): All
- Device type (Phone, Tablet, Phablet): All
- Android version: All
## Brave version
1.15.69
### Website problems only
- Does the issue resolve itself when disabling Brave Shields? NA
- Does the issue resolve itself when disabling Brave Rewards? NA
- Is the issue reproducible on the latest version of Chrome? NA
### Additional information
<!-- Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue -->
| 1.0 | Missing translations for Fingerprint settings - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description <!-- Provide a brief description of the issue -->
Missing translations for Fingerprint settings
## Steps to reproduce <!-- Please add a series of steps to reproduce the issue -->
1. Install 1.15.69 build
2. Set locale to non-EN
3. Visit a page and open fingerprint settings in shields has English text instead of translated one
## Actual result <!-- Please add screenshots if needed -->

## Expected result
Translated text
## Issue reproduces how often <!-- [Easily reproduced/Intermittent issue/No steps to reproduce] -->
Easy
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current Play Store version? No
- Can you reproduce this issue with the current Play Store Beta version? Yes
- Can you reproduce this issue with the current Play Store Nightly version? Yes
## Device details
- Install type (ARM, x86): All
- Device type (Phone, Tablet, Phablet): All
- Android version: All
## Brave version
1.15.69
### Website problems only
- Does the issue resolve itself when disabling Brave Shields? NA
- Does the issue resolve itself when disabling Brave Rewards? NA
- Is the issue reproducible on the latest version of Chrome? NA
### Additional information
<!-- Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue -->
| priority | missing translations for fingerprint settings have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description missing translations for fingerprint settings steps to reproduce install build set locale to non en visit a page and open fingerprint settings in shields has english text instead of translated one actual result expected result translated text issue reproduces how often easy version channel information can you reproduce this issue with the current play store version no can you reproduce this issue with the current play store beta version yes can you reproduce this issue with the current play store nightly version yes device details install type arm all device type phone tablet phablet all android version all brave version website problems only does the issue resolve itself when disabling brave shields na does the issue resolve itself when disabling brave rewards na is the issue reproducible on the latest version of chrome na additional information | 1 |
266,383 | 8,366,801,073 | IssuesEvent | 2018-10-04 10:11:26 | architecture-building-systems/CityEnergyAnalyst | https://api.github.com/repos/architecture-building-systems/CityEnergyAnalyst | closed | pyliburo is replaced by py4design, thus the installation of the environment needs to take this change into account | Priority 1 bug | pyliburo is renamed as py4design, so the change needs to be made. Also, the folder structure in py4design is different than what it used to be for pyliburo. This impacts the radiation scripts in CEA.
The problem is faced when we are creating the CEA environment afresh | 1.0 | pyliburo is replaced by py4design, thus the installation of the environment needs to take this change into account - pyliburo is renamed as py4design, so the change needs to be made. Also, the folder structure in py4design is different than what it used to be for pyliburo. This impacts the radiation scripts in CEA.
The problem is faced when we are creating the CEA environment afresh | priority | pyliburo is replaced by thus the installation of the environment needs to take this change into account pyliburo is renamed as so the change needs to be made also the folder structure in is different than what it used to be for pyliburo this impacts the radiation scripts in cea the problem is faced when we are creating the cea environment afresh | 1 |
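A common defensive pattern for a package rename like `pyliburo` → `py4design` is a fallback import, so environments with either name keep working while the rename propagates. A generic sketch — for CEA the candidates would presumably be `"py4design", "pyliburo"`; the demo line uses stdlib module names so it runs without either package installed:

```python
import importlib


def import_first(*candidates):
    """Return the first importable module from a list of candidate names."""
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError(f"none of {candidates} could be imported")


# In CEA this might look like: geo = import_first("py4design", "pyliburo")
# (hypothetical usage; the changed submodule layout would still need handling).
mod = import_first("definitely_missing_module", "json")
```

This only bridges the top-level name; the folder-structure changes inside py4design noted above still require updating the radiation scripts' submodule imports.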
48,901 | 13,425,014,074 | IssuesEvent | 2020-09-06 08:19:56 | searchboy-sudo/headless-wp-nuxt | https://api.github.com/repos/searchboy-sudo/headless-wp-nuxt | opened | CVE-2019-6286 (Medium) detected in node-sass-v4.13.1, node-sass-4.13.1.tgz | security vulnerability | ## CVE-2019-6286 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.13.1.tgz</b></p></summary>
<p>
<details><summary><b>node-sass-4.13.1.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/headless-wp-nuxt/package.json</p>
<p>Path to vulnerable library: /headless-wp-nuxt/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.13.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/searchboy-sudo/headless-wp-nuxt/commit/748e38948b04db4c74d2e3dae8a217d0ecbc395c">748e38948b04db4c74d2e3dae8a217d0ecbc395c</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::skip_over_scopes in prelexer.hpp when called from Sass::Parser::parse_import(), a similar issue to CVE-2018-11693.
<p>Publish Date: 2019-01-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-6286>CVE-2019-6286</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286</a></p>
<p>Release Date: 2019-08-06</p>
<p>Fix Resolution: LibSass - 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-6286 (Medium) detected in node-sass-v4.13.1, node-sass-4.13.1.tgz - ## CVE-2019-6286 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.13.1.tgz</b></p></summary>
<p>
<details><summary><b>node-sass-4.13.1.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/headless-wp-nuxt/package.json</p>
<p>Path to vulnerable library: /headless-wp-nuxt/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.13.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/searchboy-sudo/headless-wp-nuxt/commit/748e38948b04db4c74d2e3dae8a217d0ecbc395c">748e38948b04db4c74d2e3dae8a217d0ecbc395c</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::skip_over_scopes in prelexer.hpp when called from Sass::Parser::parse_import(), a similar issue to CVE-2018-11693.
<p>Publish Date: 2019-01-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-6286>CVE-2019-6286</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286</a></p>
<p>Release Date: 2019-08-06</p>
<p>Fix Resolution: LibSass - 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in node sass node sass tgz cve medium severity vulnerability vulnerable libraries node sass tgz node sass tgz wrapper around libsass library home page a href path to dependency file tmp ws scm headless wp nuxt package json path to vulnerable library headless wp nuxt node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in head commit a href vulnerability details in libsass a heap based buffer over read exists in sass prelexer skip over scopes in prelexer hpp when called from sass parser parse import a similar issue to cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass step up your open source security game with whitesource | 0 |
51,679 | 10,710,978,758 | IssuesEvent | 2019-10-25 04:31:05 | eclipse/vorto | https://api.github.com/repos/eclipse/vorto | closed | Generation of models slow due to deep structured mapping resolving | Code Generators Mappings | This problem occurs, that generator invocation is extremely slow for bigger models. The mappings for the generator are always resolved unnecessarily upon every invocation which impacts performance. | 1.0 | Generation of models slow due to deep structured mapping resolving - This problem occurs, that generator invocation is extremely slow for bigger models. The mappings for the generator are always resolved unnecessarily upon every invocation which impacts performance. | non_priority | generation of models slow due to deep structured mapping resolving this problem occurs that generator invocation is extremely slow for bigger models the mappings for the generator are always resolved unnecessarily upon every invocation which impacts performance | 0 |
15,531 | 10,317,092,124 | IssuesEvent | 2019-08-30 11:47:56 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | Ruby error | Pri2 cognitive-services/svc cxp face-api/subsvc product-question triaged | I create a new file, name it faceDetection.rb and paste the above code. Then I changed the key and the west central server to West US and I keep getting a Failed to open TCP connection error. Anyone else is having troubles?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: e9162af7-ddc6-967d-ad40-c5fe88b7a1ef
* Version Independent ID: e914e383-58c2-a0a6-6578-4fa0afe91e6c
* Content: [Quickstart: Detect faces in an image using the REST API and Ruby - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/face/quickstarts/ruby#feedback)
* Content Source: [articles/cognitive-services/Face/QuickStarts/Ruby.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cognitive-services/Face/QuickStarts/Ruby.md)
* Service: **cognitive-services**
* Sub-service: **face-api**
* GitHub Login: @PatrickFarley
* Microsoft Alias: **pafarley** | 1.0 | Ruby error - I create a new file, name it faceDetection.rb and paste the above code. Then I changed the key and the west central server to West US and I keep getting a Failed to open TCP connection error. Anyone else is having troubles?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: e9162af7-ddc6-967d-ad40-c5fe88b7a1ef
* Version Independent ID: e914e383-58c2-a0a6-6578-4fa0afe91e6c
* Content: [Quickstart: Detect faces in an image using the REST API and Ruby - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/face/quickstarts/ruby#feedback)
* Content Source: [articles/cognitive-services/Face/QuickStarts/Ruby.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cognitive-services/Face/QuickStarts/Ruby.md)
* Service: **cognitive-services**
* Sub-service: **face-api**
* GitHub Login: @PatrickFarley
* Microsoft Alias: **pafarley** | non_priority | ruby error i create a new file name it facedetection rb and paste the above code then i changed the key and the west central server to west us and i keep getting a failed to open tcp connection error anyone else is having troubles document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cognitive services sub service face api github login patrickfarley microsoft alias pafarley | 0 |
137,494 | 11,139,015,683 | IssuesEvent | 2019-12-21 01:14:37 | rancher/rancher | https://api.github.com/repos/rancher/rancher | closed | Forward port: Permissions are not updated in cluster when a cluster role is modified | [zube]: To Test team/ca | Forward port of: https://github.com/rancher/rancher/issues/23292
Addressed by https://github.com/rancher/rancher/pull/24661/commits/6dee0d8cae16a5020f752e7de5526614e520850c | 1.0 | Forward port: Permissions are not updated in cluster when a cluster role is modified - Forward port of: https://github.com/rancher/rancher/issues/23292
Addressed by https://github.com/rancher/rancher/pull/24661/commits/6dee0d8cae16a5020f752e7de5526614e520850c | non_priority | forward port permissions are not updated in cluster when a cluster role is modified forward port of addressed by | 0 |
438,120 | 12,619,567,073 | IssuesEvent | 2020-06-13 01:16:00 | skylight-hq/skylight.digital | https://api.github.com/repos/skylight-hq/skylight.digital | closed | Auto generate pages for blog post authors, blog post tags, and project team members | priority:low | Right now each page has to be set up manually. | 1.0 | Auto generate pages for blog post authors, blog post tags, and project team members - Right now each page has to be set up manually. | priority | auto generate pages for blog post authors blog post tags and project team members right now each page has to be set up manually | 1 |
5,861 | 4,047,281,829 | IssuesEvent | 2016-05-23 03:57:52 | PowerShell/platyPS | https://api.github.com/repos/PowerShell/platyPS | opened | Cmdlet Design Review notes | roadmap usability | ### Notes from cmdlet design review
We did a review with @JamesWTruher @BrucePay @jongeller @joeyaiello @SteveL-MSFT
These items was not addressed in v0.4.0 release.
I keeping them in a separate issue for future releases
* [ ] Metadadta hashtable should validate types [string] of Key-Values
* [ ] OnlineVersionUrl should be [uri] ?
* [ ] NoMetada and Metadata should be in different parameter sets, even if that mean multiply syntaxes count
* [ ] Get-HelpPreview should be a proxy function to Get-Help. It should support `-Full -Example -Parameter`, etc
-----------------------------------------------
Feel free to leave your feedback about cmdlets design | True | Cmdlet Design Review notes - ### Notes from cmdlet design review
We did a review with @JamesWTruher @BrucePay @jongeller @joeyaiello @SteveL-MSFT
These items was not addressed in v0.4.0 release.
I keeping them in a separate issue for future releases
* [ ] Metadadta hashtable should validate types [string] of Key-Values
* [ ] OnlineVersionUrl should be [uri] ?
* [ ] NoMetada and Metadata should be in different parameter sets, even if that mean multiply syntaxes count
* [ ] Get-HelpPreview should be a proxy function to Get-Help. It should support `-Full -Example -Parameter`, etc
-----------------------------------------------
Feel free to leave your feedback about cmdlets design | non_priority | cmdlet design review notes notes from cmdlet design review we did a review with jameswtruher brucepay jongeller joeyaiello stevel msft these items was not addressed in release i keeping them in a separate issue for future releases metadadta hashtable should validate types of key values onlineversionurl should be nometada and metadata should be in different parameter sets even if that mean multiply syntaxes count get helppreview should be a proxy function to get help it should support full example parameter etc feel free to leave your feedback about cmdlets design | 0 |
88,873 | 25,524,160,668 | IssuesEvent | 2022-11-28 23:47:42 | liu-benny/web-services-project | https://api.github.com/repos/liu-benny/web-services-project | closed | Implement GET method for /clinics/{clinic_id}/patients/{patient_id}/appointments | Build #2 | Add in AppointmentModel for the GEt method in appointments_routes | 1.0 | Implement GET method for /clinics/{clinic_id}/patients/{patient_id}/appointments - Add in AppointmentModel for the GEt method in appointments_routes | non_priority | implement get method for clinics clinic id patients patient id appointments add in appointmentmodel for the get method in appointments routes | 0 |
41,953 | 10,833,777,216 | IssuesEvent | 2019-11-11 13:41:49 | trellis-ldp/trellis | https://api.github.com/repos/trellis-ldp/trellis | closed | Update relevant javax.* dependencies to jakarta.* | area/api build | Ideally all of the javax.* dependencies will start using jakarta.*, but the priority should be for `inject` and `cdi-api` | 1.0 | Update relevant javax.* dependencies to jakarta.* - Ideally all of the javax.* dependencies will start using jakarta.*, but the priority should be for `inject` and `cdi-api` | non_priority | update relevant javax dependencies to jakarta ideally all of the javax dependencies will start using jakarta but the priority should be for inject and cdi api | 0 |
26,657 | 13,087,115,434 | IssuesEvent | 2020-08-02 10:22:06 | scylladb/scylla | https://api.github.com/repos/scylladb/scylla | closed | Scylla node performance degrades over time with a constant workload | User Request performance | This is Scylla's bug tracker, to be used for reporting bugs only.
If you have a question about Scylla, and not a bug, please ask it in
our mailing-list at scylladb-dev@googlegroups.com or in our slack channel.
- [X] I have read the disclaimer above, and I am reporting a suspected malfunction in Scylla.
*Installation details*
Scylla version (or git commit hash): 3.1.0-0.20191012.9c3cdded9
Cluster size: 14 nodes
OS (RHEL/CentOS/Ubuntu/AWS AMI): Ubuntu
From roughly Jan. 25 to Mar 25, the performance of our Scylla cluster declined dramatically. This was shown in several metrics (charts to follow):
1. CPU usage rose.
2. Disk reads (but not writes) rose.
3. Cache miss rate (but not hit rate) rose.
4. Query response latency rose.
Other metrics (e.g. write requests, read requests, total database size, average key/value size) remained constant.
Restarting machines did not fix this. A new machine added to the cluster, however, showed load metrics (CPU, disk reads, cache miss rate, query latency) that were much lower than the other machines, and identical to that seen two months earlier.
Two important notes:
1. This is the same cluster as described in https://github.com/scylladb/scylla/issues/6110 . The schema is [there](https://github.com/scylladb/scylla/issues/6110#issuecomment-607377323), along with earlier graphs.
2. Due to the aforementioned high query latency, this Scylla cluster had to be taken out of production and replaced with a fresh cluster. The machines are still running, but only a small amount of traffic is going to them.
2-month cluster summary. The orange line, tribute-scylla061, is the new machine.






One day of data from when the cluster was under production loads. The blue line, 172.20.87.188, is the new machine.




Data from the last week, showing the new cluster coming online. Note that the new cluster, with the same production load as the old cluster had, has drastically lower rates of disk reads and cache misses.

As ever, more graphs and metrics by request. Thanks for your time! | True | Scylla node performance degrades over time with a constant workload - This is Scylla's bug tracker, to be used for reporting bugs only.
If you have a question about Scylla, and not a bug, please ask it in
our mailing-list at scylladb-dev@googlegroups.com or in our slack channel.
- [X] I have read the disclaimer above, and I am reporting a suspected malfunction in Scylla.
*Installation details*
Scylla version (or git commit hash): 3.1.0-0.20191012.9c3cdded9
Cluster size: 14 nodes
OS (RHEL/CentOS/Ubuntu/AWS AMI): Ubuntu
From roughly Jan. 25 to Mar 25, the performance of our Scylla cluster declined dramatically. This was shown in several metrics (charts to follow):
1. CPU usage rose.
2. Disk reads (but not writes) rose.
3. Cache miss rate (but not hit rate) rose.
4. Query response latency rose.
Other metrics (e.g. write requests, read requests, total database size, average key/value size) remained constant.
Restarting machines did not fix this. A new machine added to the cluster, however, showed load metrics (CPU, disk reads, cache miss rate, query latency) that were much lower than the other machines, and identical to that seen two months earlier.
Two important notes:
1. This is the same cluster as described in https://github.com/scylladb/scylla/issues/6110 . The schema is [there](https://github.com/scylladb/scylla/issues/6110#issuecomment-607377323), along with earlier graphs.
2. Due to the aforementioned high query latency, this Scylla cluster had to be taken out of production and replaced with a fresh cluster. The machines are still running, but only a small amount of traffic is going to them.
2-month cluster summary. The orange line, tribute-scylla061, is the new machine.






One day of data from when the cluster was under production loads. The blue line, 172.20.87.188, is the new machine.




Data from the last week, showing the new cluster coming online. Note that the new cluster, with the same production load as the old cluster had, has drastically lower rates of disk reads and cache misses.

As ever, more graphs and metrics by request. Thanks for your time! | non_priority | scylla node performance degrades over time with a constant workload this is scylla s bug tracker to be used for reporting bugs only if you have a question about scylla and not a bug please ask it in our mailing list at scylladb dev googlegroups com or in our slack channel i have read the disclaimer above and i am reporting a suspected malfunction in scylla installation details scylla version or git commit hash cluster size nodes os rhel centos ubuntu aws ami ubuntu from roughly jan to mar the performance of our scylla cluster declined dramatically this was shown in several metrics charts to follow cpu usage rose disk reads but not writes rose cache miss rate but not hit rate rose query response latency rose other metrics e g write requests read requests total database size average key value size remained constant restarting machines did not fix this a new machine added to the cluster however showed load metrics cpu disk reads cache miss rate query latency that were much lower than the other machines and identical to that seen two months earlier two important notes this is the same cluster as described in the schema is along with earlier graphs due to the aforementioned high query latency this scylla cluster had to be taken out of production and replaced with a fresh cluster the machines are still running but only a small amount of traffic is going to them month cluster summary the orange line tribute is the new machine one day of data from when the cluster was under production loads the blue line is the new machine data from the last week showing the new cluster coming online note that the new cluster with the same production load as the old cluster had has drastically lower rates of disk reads and cache misses as ever more graphs and metrics by request thanks for your time | 0 |
129,392 | 10,573,223,526 | IssuesEvent | 2019-10-07 11:27:32 | olehan/kek | https://api.github.com/repos/olehan/kek | closed | Create a proper benchmark tests | test | Not really sure when I can do it, but I plan to rent a dedicated server and create some gud benchmarks, so you could see the clear performance of the lib. | 1.0 | Create a proper benchmark tests - Not really sure when I can do it, but I plan to rent a dedicated server and create some gud benchmarks, so you could see the clear performance of the lib. | non_priority | create a proper benchmark tests not really sure when i can do it but i plan to rent a dedicated server and create some gud benchmarks so you could see the clear performance of the lib | 0 |
256,216 | 8,127,038,622 | IssuesEvent | 2018-08-17 06:17:52 | aowen87/BAR | https://api.github.com/repos/aowen87/BAR | closed | Append version number onto build_visit and visit-install that we put on the Web. | Expected Use: 3 - Occasional Feature Impact: 2 - Low Priority: Normal Support Group: DOE/ASC | cq-id: VisIt00008818
cq-submitter: Brad Whitlock
cq-submit-date: 12/01/08
A couple of external users have become confused with build_visit and visit-install and have ended up using them with wrong versions of the binary distributions and source code. The users both suggested that we append the version number to the scripts that we make available for download on the Web.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 126
Status: Resolved
Project: VisIt
Tracker: Feature
Priority: Normal
Subject: Append version number onto build_visit and visit-install that we put on the Web.
Assigned to: Eric Brugger
Category:
Target version: 2.0.2
Author: Cyrus Harrison
Start:
Due date:
% Done: 100
Estimated time:
Created: 06/21/2010 07:16 pm
Updated: 07/20/2010 02:22 pm
Likelihood:
Severity:
Found in version:
Impact: 2 - Low
Expected Use: 3 - Occasional
OS: All
Support Group: DOE/ASC
Description:
cq-id: VisIt00008818
cq-submitter: Brad Whitlock
cq-submit-date: 12/01/08
A couple of external users have become confused with build_visit and visit-install and have ended up using them with wrong versions of the binary distributions and source code. The users both suggested that we append the version number to the scripts that we make available for download on the Web.
Comments:
I added the version number to the build_visit and visit-install scripts when the scripts are added to the web site.
| 1.0 | Append version number onto build_visit and visit-install that we put on the Web. - cq-id: VisIt00008818
cq-submitter: Brad Whitlock
cq-submit-date: 12/01/08
A couple of external users have become confused with build_visit and visit-install and have ended up using them with wrong versions of the binary distributions and source code. The users both suggested that we append the version number to the scripts that we make available for download on the Web.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 126
Status: Resolved
Project: VisIt
Tracker: Feature
Priority: Normal
Subject: Append version number onto build_visit and visit-install that we put on the Web.
Assigned to: Eric Brugger
Category:
Target version: 2.0.2
Author: Cyrus Harrison
Start:
Due date:
% Done: 100
Estimated time:
Created: 06/21/2010 07:16 pm
Updated: 07/20/2010 02:22 pm
Likelihood:
Severity:
Found in version:
Impact: 2 - Low
Expected Use: 3 - Occasional
OS: All
Support Group: DOE/ASC
Description:
cq-id: VisIt00008818
cq-submitter: Brad Whitlock
cq-submit-date: 12/01/08
A couple of external users have become confused with build_visit and visit-install and have ended up using them with wrong versions of the binary distributions and source code. The users both suggested that we append the version number to the scripts that we make available for download on the Web.
Comments:
I added the version number to the build_visit and visit-install scripts when the scripts are added to the web site.
| priority | append version number onto build visit and visit install that we put on the web cq id cq submitter brad whitlock cq submit date a couple of external users have become confused with build visit and visit install and have ended up using them with wrong versions of the binary distributions and source code the users both suggested that we append the version number to the scripts that we make available for download on the web redmine migration this ticket was migrated from redmine as such not all information was able to be captured in the transition below is a complete record of the original redmine ticket ticket number status resolved project visit tracker feature priority normal subject append version number onto build visit and visit install that we put on the web assigned to eric brugger category target version author cyrus harrison start due date done estimated time created pm updated pm likelihood severity found in version impact low expected use occasional os all support group doe asc description cq id cq submitter brad whitlock cq submit date a couple of external users have become confused with build visit and visit install and have ended up using them with wrong versions of the binary distributions and source code the users both suggested that we append the version number to the scripts that we make available for download on the web comments i added the version number to the build visit and visit install scripts when the scripts are added to the web site | 1 |
696,611 | 23,907,166,861 | IssuesEvent | 2022-09-09 02:55:08 | philhawksworth/the-united-effort-orginization | https://api.github.com/repos/philhawksworth/the-united-effort-orginization | closed | Pull data from both housing and units Airtables | internal cleanup Priority 3 | We currently pull all data from the `Units` table in Airtable. There also exists a `Housing Database` table that has property-level data such as address or contact info. The two tables are linked, which makes data entry much easier. However, we do not get data directly from the `Housing Database` table and instead rely on numerous lookup fields to get property-level data into the `Units` table so that the website can pull it from there.
A better solution would be to pull the data we need from each table and then have the website combine or nest the data appropriately. | 1.0 | Pull data from both housing and units Airtables - We currently pull all data from the `Units` table in Airtable. There also exists a `Housing Database` table that has property-level data such as address or contact info. The two tables are linked, which makes data entry much easier. However, we do not get data directly from the `Housing Database` table and instead rely on numerous lookup fields to get property-level data into the `Units` table so that the website can pull it from there.
A better solution would be to pull the data we need from each table and then have the website combine or nest the data appropriately. | priority | pull data from both housing and units airtables we currently pull all data from the units table in airtable there also exists a housing database table that has property level data such as address or contact info the two tables are linked which makes data entry much easier however we do not get data directly from the housing database table and instead rely on numerous lookup fields to get property level data into the units table so that the website can pull it from there a better solution would be to pull the data we need from each table and then have the website combine or nest the data appropriately | 1 |
167,792 | 26,552,718,277 | IssuesEvent | 2023-01-20 09:20:45 | WordPress/gutenberg | https://api.github.com/repos/WordPress/gutenberg | closed | Add a shortcut for template browsing | Needs Design Feedback [Feature] Full Site Editing [Feature] Site Editor | The template list screen we've been working on for https://github.com/WordPress/gutenberg/issues/36597 will serve well as a centralised management screen with searching, bulk actions, and other such features in the future.
It serves less effectively as a navigational tool. Moving from one template to another is quite a tedious workflow. You must:
1. Click the W button
2. Select "Templates"
3. Find and click the template you want to edit
This is a ~15second operation:
https://user-images.githubusercontent.com/846565/144053060-2298f413-6507-41e5-abe0-5c0036628a87.mp4
We can make this much smoother by adding a shortcut for template browsing. It was suggested in https://github.com/WordPress/gutenberg/issues/36667 that we might do this initially with a combination of a dedicated side panel and the introduction of a new "tool" in the top bar.
Here's a quick demo of how that could work:
https://user-images.githubusercontent.com/846565/144054358-c054926d-5edd-48f0-8296-881f5b4d81e2.mp4
| 1.0 | Add a shortcut for template browsing - The template list screen we've been working on for https://github.com/WordPress/gutenberg/issues/36597 will serve well as a centralised management screen with searching, bulk actions, and other such features in the future.
It serves less effectively as a navigational tool. Moving from one template to another is quite a tedious workflow. You must:
1. Click the W button
2. Select "Templates"
3. Find and click the template you want to edit
This is a ~15second operation:
https://user-images.githubusercontent.com/846565/144053060-2298f413-6507-41e5-abe0-5c0036628a87.mp4
We can make this much smoother by adding a shortcut for template browsing. It was suggested in https://github.com/WordPress/gutenberg/issues/36667 that we might do this initially with a combination of a dedicated side panel and the introduction of a new "tool" in the top bar.
Here's a quick demo of how that could work:
https://user-images.githubusercontent.com/846565/144054358-c054926d-5edd-48f0-8296-881f5b4d81e2.mp4
| non_priority | add a shortcut for template browsing the template list screen we ve been working on for will serve well as a centralised management screen with searching bulk actions and other such features in the future it serves less effectively as a navigational tool moving from one template to another is quite a tedious workflow you must click the w button select templates find and click the template you want to edit this is a operation we can make this much smoother by adding a shortcut for template browsing it was suggested in that we might do this initially with a combination of a dedicated side panel and the introduction of a new tool in the top bar here s a quick demo of how that could work | 0 |
99,302 | 4,052,307,141 | IssuesEvent | 2016-05-24 01:39:35 | nvs/gem | https://api.github.com/repos/nvs/gem | opened | A health bar remains for an invisible unit | Priority: Later Status: Not Started Type: Bug | Happens fairly regularly, and just annoys the hell out of me. | 1.0 | A health bar remains for an invisible unit - Happens fairly regularly, and just annoys the hell out of me. | priority | a health bar remains for an invisible unit happens fairly regularly and just annoys the hell out of me | 1 |
625,536 | 19,751,917,507 | IssuesEvent | 2022-01-15 06:12:13 | DISSINET/InkVisitor | https://api.github.com/repos/DISSINET/InkVisitor | opened | Finalize discussion on Statement fragments | data model priority | - Statement-type composite Concepts such as "priests are greedy": should they be C decomposed just linearly into components "priests", "is", and "greedy", or should they be S entities?
- If S, it would be some special S' not sitting under any T and not attached to R.
- If S, TBD whether it should not be the same thing as statement templates.
Adam is right that these should not be projected and analysed as part of dissent - only their instances used somewhere. | 1.0 | Finalize discussion on Statement fragments - - Statement-type composite Concepts such as "priests are greedy": should they be C decomposed just linearly into components "priests", "is", and "greedy", or should they be S entities?
- If S, it would be some special S' not sitting under any T and not attached to R.
- If S, TBD whether it should not be the same thing as statement templates.
Adam is right that these should not be projected and analysed as part of dissent - only their instances used somewhere. | priority | finalize discussion on statement fragments statement type composite concepts such as priests are greedy should they be c decomposed just linearly into components priests is and greedy or should they be s entities if s it would be some special s not sitting under any t and not attached to r if s tbd whether it should not be the same thing as statement templates adam is right that these should not be projected and analysed as part of dissent only their instances used somewhere | 1 |
36,632 | 2,806,369,063 | IssuesEvent | 2015-05-15 01:37:28 | duckduckgo/zeroclickinfo-spice | https://api.github.com/repos/duckduckgo/zeroclickinfo-spice | closed | Transit Switzerland: API not always providing platform, leads to bad result | Bug Priority: High | https://duckduckgo.com/?q=next+train+to+geneva+from+paris&ia=swisstrains
The footer in the IA shows "Platform " with no number because the API response doesn't contain one. We should check to make sure we have a platform, or show that one has not been assigned yet (I'm assuming that's the case?)
https://duckduckgo.com/js/spice/transit/switzerland/paris/geneva
/cc @tagawa @mattr555 | 1.0 | Transit Switzerland: API not always providing platform, leads to bad result - https://duckduckgo.com/?q=next+train+to+geneva+from+paris&ia=swisstrains
The footer in the IA shows "Platform " with no number because the API response doesn't contain one. We should check to make sure we have a platform, or show that one has not been assigned yet (I'm assuming that's the case?)
https://duckduckgo.com/js/spice/transit/switzerland/paris/geneva
/cc @tagawa @mattr555 | priority | transit switzerland api not always providing platform leads to bad result the footer in the ia shows platform with no number because the api response doesn t contain one we should check to make sure we have a platform or show that one has not been assigned yet i m assuming that s the case cc tagawa | 1 |
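The defensive check suggested in the Transit Switzerland record above ("check to make sure we have a platform") can be sketched as follows. This is a hypothetical helper, not the actual DuckDuckGo Spice code, illustrating how a missing or empty `platform` field from the API response could be handled instead of rendering "Platform " with no number:

```python
def platform_footer(connection: dict) -> str:
    """Render the footer text for a transit connection.

    Falls back to an explicit message when the API response
    carries no platform number, instead of printing "Platform ".
    """
    platform = connection.get("platform")
    # The API may omit the platform entirely, or send an empty
    # string when no platform has been assigned yet.
    if platform:
        return f"Platform {platform}"
    return "Platform not yet assigned"

print(platform_footer({"platform": "7"}))  # Platform 7
print(platform_footer({}))                 # Platform not yet assigned
```

The same guard covers both the missing-key and empty-string cases, which matches the reporter's suspicion that a platform simply had not been assigned yet.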
94,587 | 27,238,584,179 | IssuesEvent | 2023-02-21 18:16:40 | hyperledger/caliper | https://api.github.com/repos/hyperledger/caliper | closed | remove node as the chaincode language for fabric integration testing | enhancement build | Couple of reasons for this
1. Fabric doesn't pin exact versions of its dependencies, which means that changes to the dependency tree outside of our control (in fact it's outside of fabric's control because they don't pin dependencies) can break the code. This has happened today with grpc-js 1.8.1->1.8.2 breaking all node chaincodes. We could add a shrinkwrap to fix the versions, but that brings me to my second point
2. Node chaincode is not as efficient as Go chaincode. Also npm install has to be run to prepare the chaincode for each run, whereas with Go chaincode we can pre-vendor and keep that in the code tree. Both will make it faster for integration testing and reduce the testing time and thus reduce power usage. Also this is less likely to break.
| 1.0 | remove node as the chaincode language for fabric integration testing - Couple of reasons for this
1. Fabric doesn't pin exact versions of its dependencies, which means that changes to the dependency tree outside of our control (in fact it's outside of fabric's control because they don't pin dependencies) can break the code. This has happened today with grpc-js 1.8.1->1.8.2 breaking all node chaincodes. We could add a shrinkwrap to fix the versions, but that brings me to my second point
2. Node chaincode is not as efficient as Go chaincode. Also npm install has to be run to prepare the chaincode for each run, whereas with Go chaincode we can pre-vendor and keep that in the code tree. Both will make it faster for integration testing and reduce the testing time and thus reduce power usage. Also this is less likely to break.
| non_priority | remove node as the chaincode language for fabric integration testing couple of reasons for this fabric doesn t pin exact versions of it s dependencies which means that changes to the dependency tree outside of our control in fact it s outside of fabric s control because they don t pin dependencies can break the code this has happened today with grpc js breaking all node chaincodes we could add a shrinkwrap to fix the versions but that brings me to my second point node chaincode is not as efficient as go chaincode also npm install has to be run to prepare the chaincode for each run whereas we go chaincode we can pre vendor and keep that in the code tree both will make it faster for integration testing and reduce the testing time and thus reduce power usage also this is less likely to break | 0 |
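The pinning problem described in point 1 of the record above can be detected mechanically. Below is a minimal sketch — not part of Caliper or Fabric — that flags dependency specs in a `package.json` that use npm range operators (`^`, `~`, `x`-ranges, etc.) rather than an exact pin:

```python
import json
import re

# npm range operators that make a version spec non-exact:
# caret (^), tilde (~), comparators (> <), wildcards (* x), "||", "a - b"
RANGE_PATTERN = re.compile(r"[\^~><*x|]|\s-\s")

def unpinned_dependencies(package_json: str) -> list[str]:
    """Return dependency names whose version spec is not an exact pin."""
    manifest = json.loads(package_json)
    deps = manifest.get("dependencies", {})
    return [name for name, spec in deps.items() if RANGE_PATTERN.search(spec)]

example = '{"dependencies": {"@grpc/grpc-js": "^1.8.1", "left-pad": "1.3.0"}}'
print(unpinned_dependencies(example))  # ['@grpc/grpc-js']
```

A caret range like `^1.8.1` is exactly what let the 1.8.1 → 1.8.2 transition break the node chaincodes; `npm shrinkwrap` (mentioned in the issue) is the mechanism that freezes such ranges to exact resolved versions.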
247,932 | 26,761,986,255 | IssuesEvent | 2023-01-31 07:48:21 | billmcchesney1/Hangar | https://api.github.com/repos/billmcchesney1/Hangar | closed | CVE-2022-23529 (High) detected in jsonwebtoken-8.5.1.tgz - autoclosed | security vulnerability | ## CVE-2022-23529 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsonwebtoken-8.5.1.tgz</b></p></summary>
<p>JSON Web Token implementation (symmetric and asymmetric)</p>
<p>Library home page: <a href="https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-8.5.1.tgz">https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-8.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/jsonwebtoken/package.json</p>
<p>
Dependency Hierarchy:
- bolt-2.3.0.tgz (Root Library)
- oauth-1.2.0.tgz
- :x: **jsonwebtoken-8.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** REJECT ** DO NOT USE THIS CANDIDATE NUMBER. ConsultIDs: none. Reason: The issue is not a vulnerability. Notes: none.
<p>Publish Date: 2022-12-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-23529>CVE-2022-23529</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/auth0/node-jsonwebtoken/security/advisories/GHSA-27h2-hvpr-p74q">https://github.com/auth0/node-jsonwebtoken/security/advisories/GHSA-27h2-hvpr-p74q</a></p>
<p>Release Date: 2022-12-21</p>
<p>Fix Resolution: jsonwebtoken - 9.0.0</p>
</p>
</details>
<p></p>
| True | CVE-2022-23529 (High) detected in jsonwebtoken-8.5.1.tgz - autoclosed - ## CVE-2022-23529 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsonwebtoken-8.5.1.tgz</b></p></summary>
<p>JSON Web Token implementation (symmetric and asymmetric)</p>
<p>Library home page: <a href="https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-8.5.1.tgz">https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-8.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/jsonwebtoken/package.json</p>
<p>
Dependency Hierarchy:
- bolt-2.3.0.tgz (Root Library)
- oauth-1.2.0.tgz
- :x: **jsonwebtoken-8.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** REJECT ** DO NOT USE THIS CANDIDATE NUMBER. ConsultIDs: none. Reason: The issue is not a vulnerability. Notes: none.
<p>Publish Date: 2022-12-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-23529>CVE-2022-23529</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/auth0/node-jsonwebtoken/security/advisories/GHSA-27h2-hvpr-p74q">https://github.com/auth0/node-jsonwebtoken/security/advisories/GHSA-27h2-hvpr-p74q</a></p>
<p>Release Date: 2022-12-21</p>
<p>Fix Resolution: jsonwebtoken - 9.0.0</p>
</p>
</details>
<p></p>
| non_priority | cve high detected in jsonwebtoken tgz autoclosed cve high severity vulnerability vulnerable library jsonwebtoken tgz json web token implementation symmetric and asymmetric library home page a href path to dependency file package json path to vulnerable library node modules jsonwebtoken package json dependency hierarchy bolt tgz root library oauth tgz x jsonwebtoken tgz vulnerable library found in base branch main vulnerability details reject do not use this candidate number consultids none reason the issue is not a vulnerability notes none publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jsonwebtoken | 0 |
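The CVSS 3 base score of 9.8 shown in the report above maps to a qualitative severity rating under the CVSS v3 specification. A small sketch of the standard rating bands, useful for cross-checking report labels (note that WhiteSource's own "High" bucket in the title differs from the spec band, where 9.8 falls in "Critical"):

```python
def cvss3_rating(score: float) -> str:
    """Map a CVSS v3 base score to its qualitative severity rating.

    Bands follow the CVSS v3.0 specification:
    0.0 None, 0.1-3.9 Low, 4.0-6.9 Medium, 7.0-8.9 High, 9.0-10.0 Critical.
    """
    if not 0.0 <= score <= 10.0:
        raise ValueError(f"CVSS score out of range: {score}")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

print(cvss3_rating(9.8))  # Critical
```

The 6.1 and 5.5 scores in the jQuery and Pillow records below both land in the "Medium" band under the same table.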
88,439 | 15,800,924,637 | IssuesEvent | 2021-04-03 02:00:04 | AlexRogalskiy/github-action-tag-replacer | https://api.github.com/repos/AlexRogalskiy/github-action-tag-replacer | opened | CVE-2020-11022 (Medium) detected in jquery-1.8.1.min.js | security vulnerability | ## CVE-2020-11022 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.8.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p>
<p>Path to dependency file: github-action-tag-replacer/node_modules/redeyed/examples/browser/index.html</p>
<p>Path to vulnerable library: github-action-tag-replacer/node_modules/redeyed/examples/browser/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.8.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-tag-replacer/commit/6248069cf5fef67e089b0efa1b8401b4046ff155">6248069cf5fef67e089b0efa1b8401b4046ff155</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-11022 (Medium) detected in jquery-1.8.1.min.js - ## CVE-2020-11022 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.8.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p>
<p>Path to dependency file: github-action-tag-replacer/node_modules/redeyed/examples/browser/index.html</p>
<p>Path to vulnerable library: github-action-tag-replacer/node_modules/redeyed/examples/browser/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.8.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-tag-replacer/commit/6248069cf5fef67e089b0efa1b8401b4046ff155">6248069cf5fef67e089b0efa1b8401b4046ff155</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file github action tag replacer node modules redeyed examples browser index html path to vulnerable library github action tag replacer node modules redeyed examples browser index html dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details in jquery versions greater than or equal to and before passing html from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource | 0 |
77,680 | 15,569,825,243 | IssuesEvent | 2021-03-17 01:04:45 | veshitala/flask-blogger | https://api.github.com/repos/veshitala/flask-blogger | opened | CVE-2020-10177 (Medium) detected in Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl | security vulnerability | ## CVE-2020-10177 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>Python Imaging Library (Fork)</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/0d/f3/421598450cb9503f4565d936860763b5af413a61009d87a5ab1e34139672/Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/0d/f3/421598450cb9503f4565d936860763b5af413a61009d87a5ab1e34139672/Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>Path to vulnerable library: flask-blogger/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Pillow before 7.1.0 has multiple out-of-bounds reads in libImaging/FliDecode.c.
<p>Publish Date: 2020-06-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10177>CVE-2020-10177</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/python-pillow/Pillow/commit/41b554bc56982ee4f30238a7677c0f4ff90a73a8">https://github.com/python-pillow/Pillow/commit/41b554bc56982ee4f30238a7677c0f4ff90a73a8</a></p>
<p>Release Date: 2020-06-25</p>
<p>Fix Resolution: 7.1.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-10177 (Medium) detected in Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl - ## CVE-2020-10177 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>Python Imaging Library (Fork)</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/0d/f3/421598450cb9503f4565d936860763b5af413a61009d87a5ab1e34139672/Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/0d/f3/421598450cb9503f4565d936860763b5af413a61009d87a5ab1e34139672/Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>Path to vulnerable library: flask-blogger/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Pillow-5.4.1-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Pillow before 7.1.0 has multiple out-of-bounds reads in libImaging/FliDecode.c.
<p>Publish Date: 2020-06-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10177>CVE-2020-10177</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/python-pillow/Pillow/commit/41b554bc56982ee4f30238a7677c0f4ff90a73a8">https://github.com/python-pillow/Pillow/commit/41b554bc56982ee4f30238a7677c0f4ff90a73a8</a></p>
<p>Release Date: 2020-06-25</p>
<p>Fix Resolution: 7.1.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in pillow whl cve medium severity vulnerability vulnerable library pillow whl python imaging library fork library home page a href path to vulnerable library flask blogger requirements txt dependency hierarchy x pillow whl vulnerable library vulnerability details pillow before has multiple out of bounds reads in libimaging flidecode c publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
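The "Fix Resolution" lines in these reports (e.g. Pillow fixed in 7.1.0) imply comparing the installed release against the fixed one. A minimal sketch of that comparison — deliberately avoiding the third-party `packaging` library, and handling only plain dotted numeric versions, not pre-release tags:

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Parse a simple dotted version like '5.4.1' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def needs_upgrade(installed: str, fixed_in: str) -> bool:
    """True when the installed version is older than the fixed release."""
    return parse_version(installed) < parse_version(fixed_in)

print(needs_upgrade("5.4.1", "7.1.0"))  # True  (Pillow 5.4.1 predates the 7.1.0 fix)
print(needs_upgrade("9.0.0", "9.0.0"))  # False
```

Tuple comparison gives the correct numeric ordering (so "5.10.0" sorts after "5.9.0"), which naive string comparison would get wrong.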
231,500 | 17,691,570,247 | IssuesEvent | 2021-08-24 10:35:52 | SimoneGasperini/cms-cmSim | https://api.github.com/repos/SimoneGasperini/cms-cmSim | closed | Start documenting the code | documentation | I suggest to start working on code documentation.
I'd begin with simple docstrings. Personally I prefer Google style docstrings as described [here](https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings). | 1.0 | Start documenting the code - I suggest to start working on code documentation.
I'd begin with simple docstrings. Personally I prefer Google style docstrings as described [here](https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings). | non_priority | start documenting the code i suggest to start working on code documentation i d begin with simple docstrings personally i prefer google style docstrings as described | 0 |
80,715 | 15,557,509,526 | IssuesEvent | 2021-03-16 09:13:50 | Regalis11/Barotrauma | https://api.github.com/repos/Regalis11/Barotrauma | closed | Signal Check components can't send spaces | Bug Code | - [x] I have searched the issue tracker to check if the issue has already been reported.
**Description**
a value of " " (one or more space) doesn't get sent in signal check outputs
**Steps To Reproduce**
put space (one or more) as signal check's outputs (false/true)
**Version**
0.12
**Additional information**
Add any other context about the problem here.
| 1.0 | Signal Check components can't send spaces - - [x] I have searched the issue tracker to check if the issue has already been reported.
**Description**
a value of " " (one or more space) doesn't get sent in signal check outputs
**Steps To Reproduce**
put space (one or more) as signal check's outputs (false/true)
**Version**
0.12
**Additional information**
Add any other context about the problem here.
| non_priority | signal check components can t send spaces i have searched the issue tracker to check if the issue has already been reported description a value of one or more space doesn t get sent in signal check outputs steps to reproduce put space one or more as signal check s outputs false true version additional information add any other context about the problem here | 0 |
671,447 | 22,761,544,955 | IssuesEvent | 2022-07-07 21:47:20 | markmac99/UKmon-shared | https://api.github.com/repos/markmac99/UKmon-shared | closed | Add container support for cartopy | enhancement Low priority | This would allow OS maps to be used as the backdrop for the ground-track maps.
However cartopy is poorly supported and has to be built from source, along with GEOS, PROJ4 and SQLITE3, which makes it quite a task. | 1.0 | Add container support for cartopy - This would allow OS maps to be used as the backdrop for the ground-track maps.
However cartopy is poorly supported and has to be built from source, along with GEOS, PROJ4 and SQLITE3, which makes it quite a task. | priority | add container support for cartopy this would allow os maps to be used as the backdrop for the ground track maps however cartopy is poorly supported and has to be built from source along with geos and which makes it quite a task | 1 |
74,419 | 3,439,718,874 | IssuesEvent | 2015-12-14 11:00:32 | ceylon/ceylon-ide-eclipse | https://api.github.com/repos/ceylon/ceylon-ide-eclipse | closed | Eclipse unexpected behavior | bug on last release high priority | hi, I get an exception "`The ExpressionVisitor caused an exception visiting a InvocationExpression node: "com.redhat.ceylon.model.loader.ModelResolutionException: JDT reference binding without a JDT IType element : com.google.common.cache.Cache" at com.redhat.ceylon.eclipse.core.model.JDTModelLoader.toType(JDTModelLoader.java:1148)`"
when I try to run my code in eclipse, but it compiles and runs fine from the linux terminal.
I was trying to use a java html templating engine in ceylon when this happened.
To reproduce the issue, my code is below.
module.ceylon file
```ceylon
native("jvm") module cryna "1.0.0" {
import ceylon.net "1.2.0";
import "com.mitchellbosecke:pebble" "1.6.0";
import java.base "8";
}
```
package.ceylon
```ceylon
shared package cryna;
```
and finally run.ceylon
```ceylon
import ceylon.net.http.server {
newServer,
Endpoint,
startsWith
}
import java.io {
StringWriter
}
import com.mitchellbosecke.pebble {
PebbleEngine
}
import com.mitchellbosecke.pebble.loader {
FileLoader
}
import com.mitchellbosecke.pebble.template {
PebbleTemplate
}
shared void run() => newServer {
Endpoint {
path = startsWith("/");
(req, res) {
variable StringWriter wr = StringWriter();
FileLoader fl = FileLoader();
PebbleEngine engine = PebbleEngine(fl);
PebbleTemplate compiledTemplate = engine.getTemplate("/home/masood/Documents/misc/experimental/templates/home.html");
compiledTemplate.evaluate(wr);
return res.writeString(wr.string);
};
}
}.start();
```
you can put any regular html page in the engine.getTemplate
| 1.0 | Eclipse unexpected behavior - hi, I get an exception "`The ExpressionVisitor caused an exception visiting a InvocationExpression node: "com.redhat.ceylon.model.loader.ModelResolutionException: JDT reference binding without a JDT IType element : com.google.common.cache.Cache" at com.redhat.ceylon.eclipse.core.model.JDTModelLoader.toType(JDTModelLoader.java:1148)`"
when I try to run my code in eclipse, but it compiles and runs fine from the linux terminal.
I was trying to use a java html templating engine in ceylon when this happened.
To reproduce the issue, my code is below.
module.ceylon file
```ceylon
native("jvm") module cryna "1.0.0" {
import ceylon.net "1.2.0";
import "com.mitchellbosecke:pebble" "1.6.0";
import java.base "8";
}
```
package.ceylon
```ceylon
shared package cryna;
```
and finally run.ceylon
```ceylon
import ceylon.net.http.server {
newServer,
Endpoint,
startsWith
}
import java.io {
StringWriter
}
import com.mitchellbosecke.pebble {
PebbleEngine
}
import com.mitchellbosecke.pebble.loader {
FileLoader
}
import com.mitchellbosecke.pebble.template {
PebbleTemplate
}
shared void run() => newServer {
Endpoint {
path = startsWith("/");
(req, res) {
variable StringWriter wr = StringWriter();
FileLoader fl = FileLoader();
PebbleEngine engine = PebbleEngine(fl);
PebbleTemplate compiledTemplate = engine.getTemplate("/home/masood/Documents/misc/experimental/templates/home.html");
compiledTemplate.evaluate(wr);
return res.writeString(wr.string);
};
}
}.start();
```
you can put any regular html page in the engine.getTemplate
| priority | eclipse unexpected behavior hi i get an exception the expressionvisitor caused an exception visiting a invocationexpression node com redhat ceylon model loader modelresolutionexception jdt reference binding without a jdt itype element com google common cache cache at com redhat ceylon eclipse core model jdtmodelloader totype jdtmodelloader java when i try to run my code in eclipse but it compiles and runs fine from the linux terminal i was trying to use a java html templating engine in ceylon when this happened to reproduce the issue my code is below module ceylon file ceylon native jvm module cryna import ceylon net import com mitchellbosecke pebble import java base package ceylon ceylon shared package cryna and finally run ceylon ceylon import ceylon net http server newserver endpoint startswith import java io stringwriter import com mitchellbosecke pebble pebbleengine import com mitchellbosecke pebble loader fileloader import com mitchellbosecke pebble template pebbletemplate shared void run newserver endpoint path startswith req res variable stringwriter wr stringwriter fileloader fl fileloader pebbleengine engine pebbleengine fl pebbletemplate compiledtemplate engine gettemplate home masood documents misc experimental templates home html compiledtemplate evaluate wr return res writestring wr string start you can put any regular html page in the engine gettemplate | 1 |
600,061 | 18,288,623,083 | IssuesEvent | 2021-10-05 13:06:41 | brave/brave-browser | https://api.github.com/repos/brave/brave-browser | closed | Pending tips for Gemini creators not sent when switching from Uphold to Gemini | feature/rewards priority/P3 QA/Yes release-notes/exclude OS/Desktop feature/gemini-wallet feature/Uphold-wallet | <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
If a user changes their custodial wallet from Uphold to Gemini, pending tips that were earmarked for a Gemini creator are not being sent out once the change to Gemini is made.
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Have a profile w/ Uphold KYC'd user wallet
2. Tip a Gemini verified creator
3. Confirm the tip goes to the pending tips list
4. Disconnect from Uphold and connect KYC'd Gemini wallet
## Actual result:
<!--Please add screenshots if needed-->
Pending tip is not sent to the Gemini creator, it remains in the pending list. I tried the following per suggestions to get this to happen:
- refreshing the publisher list (manually from panel)
- advancing computer clock 1 day
- advancing computer clock 3 days
<img width="945" alt="Screen Shot 2021-08-30 at 12 42 03 PM" src="https://user-images.githubusercontent.com/28145373/131381643-4ffac67b-7c32-4205-954b-833bbe4fea2c.png">
## Expected result:
At some point the pending tip for Gemini creator should be sent out
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
easily
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
1.29.75
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? 1.28.x n/a, 1.29.x yes
- Can you reproduce this issue with the beta channel? unsure
- Can you reproduce this issue with the nightly channel? 1.31.x yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
cc @brave/legacy_qa @Miyayes @jumde @rebron | 1.0 | Pending tips for Gemini creators not sent when switching from Uphold to Gemini - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
If a user changes their custodial wallet from Uphold to Gemini, pending tips that were earmarked for a Gemini creator are not being sent out once the change to Gemini is made.
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Have a profile w/ Uphold KYC'd user wallet
2. Tip a Gemini verified creator
3. Confirm the tip goes to the pending tips list
4. Disconnect from Uphold and connect KYC'd Gemini wallet
## Actual result:
<!--Please add screenshots if needed-->
Pending tip is not sent to the Gemini creator, it remains in the pending list. I tried the following per suggestions to get this to happen:
- refreshing the publisher list (manually from panel)
- advancing computer clock 1 day
- advancing computer clock 3 days
<img width="945" alt="Screen Shot 2021-08-30 at 12 42 03 PM" src="https://user-images.githubusercontent.com/28145373/131381643-4ffac67b-7c32-4205-954b-833bbe4fea2c.png">
## Expected result:
At some point the pending tip for Gemini creator should be sent out
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
easily
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
1.29.75
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? 1.28.x n/a, 1.29.x yes
- Can you reproduce this issue with the beta channel? unsure
- Can you reproduce this issue with the nightly channel? 1.31.x yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
cc @brave/legacy_qa @Miyayes @jumde @rebron | priority | pending tips for gemini creators not sent when switching from uphold to gemini have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description if a user changes their custodial wallet from uphold to gemini pending tips that were earmarked for a gemini creator are not being sent out once the change to gemini is made steps to reproduce have a profile w uphold kyc d user wallet tip a gemini verified creator confirm the tip goes to the pending tips list disconnect from uphold and connect kyc d gemini wallet actual result pending tip is not sent to the gemini creator it remains in the pending list i tried the following per suggestions to get this to happen refreshing the publisher list manually from panel advancing computer clock day advancing computer clock days img width alt screen shot at pm src expected result at some point the pending tip for gemini creator should be sent out reproduces how often easily brave version brave version info version channel information can you reproduce this issue with the current release x n a x yes can you reproduce this issue with the beta channel unsure can you reproduce this issue with the nightly channel x yes other additional information does the issue resolve itself when disabling brave shields does the issue resolve itself when disabling brave rewards is the issue reproducible on the latest version of chrome miscellaneous information cc brave legacy qa miyayes jumde rebron | 1 |
29,254 | 2,714,206,549 | IssuesEvent | 2015-04-10 00:43:08 | hamiltont/clasp | https://api.github.com/repos/hamiltont/clasp | opened | Option for custom Android system, boot, and kernel images. | Low priority | _From @bamos on September 24, 2014 15:33_
Specifically to run nbd -- http://bamos.github.io/2014/09/08/nbd-android/
_Copied from original issue: hamiltont/attack#53_ | 1.0 | Option for custom Android system, boot, and kernel images. - _From @bamos on September 24, 2014 15:33_
Specifically to run nbd -- http://bamos.github.io/2014/09/08/nbd-android/
_Copied from original issue: hamiltont/attack#53_ | priority | option for custom android system boot and kernel images from bamos on september specifically to run nbd copied from original issue hamiltont attack | 1 |
275,555 | 8,576,942,762 | IssuesEvent | 2018-11-12 22:02:15 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | store.hp.com - site is not usable | browser-firefox-mobile priority-important | <!-- @browser: Firefox Mobile 64.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:64.0) Gecko/64.0 Firefox/64.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://store.hp.com/us/en/pdp/hp-pavilion-laptop-15z-touch-optional-3je92av-1?source=aw
**Browser / Version**: Firefox Mobile 64.0
**Operating System**: Android
**Tested Another Browser**: No
**Problem type**: Site is not usable
**Description**: unable to use drop down menu for specifications, features, etc.
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | store.hp.com - site is not usable - <!-- @browser: Firefox Mobile 64.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:64.0) Gecko/64.0 Firefox/64.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://store.hp.com/us/en/pdp/hp-pavilion-laptop-15z-touch-optional-3je92av-1?source=aw
**Browser / Version**: Firefox Mobile 64.0
**Operating System**: Android
**Tested Another Browser**: No
**Problem type**: Site is not usable
**Description**: unable to use drop down menu for specifications, features, etc.
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | store hp com site is not usable url browser version firefox mobile operating system android tested another browser no problem type site is not usable description unable to use drop down menu for specifications features etc steps to reproduce browser configuration none from with ❤️ | 1 |
228,531 | 7,552,579,968 | IssuesEvent | 2018-04-19 01:09:32 | OperationCode/operationcode_backend | https://api.github.com/repos/OperationCode/operationcode_backend | closed | Rake task to tag community leaders | Priority: Low Status: In Progress Type: Feature | <!-- Please fill out one of the sections below based on the type of issue you're creating -->
# Feature
## Why is this feature being added?
<!-- What problem is it solving? What value does it add? -->
Now that https://github.com/OperationCode/operationcode_backend/pull/292 is merged, we need a rake task to initially tag the appropriate users in prod with the community leader tag.
## What should your feature do?
- [ ] Collaborate with @hollomancer (or whomever he deems appropriate) to determine what prod users should be tagged
- [ ] Creating the rake task to [tag community leaders](https://github.com/OperationCode/operationcode_backend/blob/master/app/models/user.rb#L2) | 1.0 | Rake task to tag community leaders - <!-- Please fill out one of the sections below based on the type of issue you're creating -->
# Feature
## Why is this feature being added?
<!-- What problem is it solving? What value does it add? -->
Now that https://github.com/OperationCode/operationcode_backend/pull/292 is merged, we need a rake task to initially tag the appropriate users in prod with the community leader tag.
## What should your feature do?
- [ ] Collaborate with @hollomancer (or whomever he deems appropriate) to determine what prod users should be tagged
- [ ] Creating the rake task to [tag community leaders](https://github.com/OperationCode/operationcode_backend/blob/master/app/models/user.rb#L2) | priority | rake task to tag community leaders feature why is this feature being added now that is merged we need a rake task to initially tag the appropriate users in prod with the community leader tag what should your feature do collaborate with hollomancer or whomever he deems appropriate to determine what prod users should be tagged creating the rake task to | 1 |
227,874 | 18,106,773,369 | IssuesEvent | 2021-09-22 20:02:02 | Oldes/Rebol-issues | https://api.github.com/repos/Oldes/Rebol-issues | closed | append broken for self-append | Status.important Test.written Type.bug Datatype: block! CC.resolved | _Submitted by:_ **Sunanda**
R2:
``` rebol
a: copy [1]
append a a
== [1 1]
```
``` rebol
***
```
R3/alpha 54:
``` rebol
a: copy [1]
append a a
== [1] ;; <=== not what I'd expect!
```
``` rebol
head a ;; just checking
== [1]
```
``` rebol
***
```
``` rebol
copy is needed to restore previous behavior:
append a copy a
== [1 1]
```
``` rebol
****
```
Insert does not have the same issue:
``` rebol
a: copy [1]
insert a a
head a
== [1 1]
```
``` rebol
a: copy [1]
append a a
```
---
<sup>**Imported from:** **[CureCode](https://www.curecode.org/rebol3/ticket.rsp?id=814)** [ Version: alpha 55 Type: Bug Platform: All Category: n/a Reproduce: Always Fixed-in:alpha 55 ]</sup>
<sup>**Imported from**: https://github.com/rebol/rebol-issues/issues/814</sup>
Comments:
---
> **Rebolbot** commented on May 17, 2009:
_Submitted by:_ **Carl**
That's a good one Sunanda. I'll add it to my test suite.
---
> **Rebolbot** added **Type.bug** and **Status.important** on Jan 12, 2016
--- | 1.0 | append broken for self-append - _Submitted by:_ **Sunanda**
R2:
``` rebol
a: copy [1]
append a a
== [1 1]
```
``` rebol
***
```
R3/alpha 54:
``` rebol
a: copy [1]
append a a
== [1] ;; <=== not what I'd expect!
```
``` rebol
head a ;; just checking
== [1]
```
``` rebol
***
```
``` rebol
copy is needed to restore previous behavior:
append a copy a
== [1 1]
```
``` rebol
****
```
Insert does not have the same issue:
``` rebol
a: copy [1]
insert a a
head a
== [1 1]
```
``` rebol
a: copy [1]
append a a
```
---
<sup>**Imported from:** **[CureCode](https://www.curecode.org/rebol3/ticket.rsp?id=814)** [ Version: alpha 55 Type: Bug Platform: All Category: n/a Reproduce: Always Fixed-in:alpha 55 ]</sup>
<sup>**Imported from**: https://github.com/rebol/rebol-issues/issues/814</sup>
Comments:
---
> **Rebolbot** commented on May 17, 2009:
_Submitted by:_ **Carl**
That's a good one Sunanda. I'll add it to my test suite.
---
> **Rebolbot** added **Type.bug** and **Status.important** on Jan 12, 2016
--- | non_priority | append broken for self append submitted by sunanda rebol a copy append a a rebol alpha rebol a copy append a a not what i d expect rebol head a just checking rebol rebol copy is needed to restore previous behavior append a copy a rebol insert does not have the same issue rebol a copy insert a a head a rebol a copy append a a imported from imported from comments rebolbot commented on may submitted by carl that s a good one sunanda i ll add it to my test suite rebolbot added type bug and status important on jan | 0 |
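The report above is about `append a a` in R3 alpha losing the appended element unless an explicit `copy` is made. As a point of comparison (not part of the report), CPython's `list.extend` handles the same self-append case without a copy, because it reads the argument's length before appending:

```python
# Self-append, like R2's `append a a`: safe without a copy in CPython.
a = [1]
a.extend(a)
print(a)  # [1, 1]

# Defensive copy, like the `append a copy a` workaround in the report.
b = [1]
b.extend(list(b))
print(b)  # [1, 1]

# Self-insert at the head, analogous to `insert a a`.
c = [1]
c[0:0] = list(c)
print(c)  # [1, 1]
```

The bug report shows exactly this kind of asymmetry: R3's `insert` handled the self-reference case while `append` did not, which is why Carl added it to his test suite.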
444,018 | 31,017,255,462 | IssuesEvent | 2023-08-10 00:13:07 | pixie-io/pixie | https://api.github.com/repos/pixie-io/pixie | closed | the install command from document is wrong | kind/documentation | **Describe the bug**
According to https://docs.px.dev/installing-pixie/install-guides/self-hosted-pixie,
```
export LATEST_CLOUD_RELEASE=$(git tag | grep -Po '(?<=release/cloud/v)[^\-]*$' | sort -t '.' -k1,1nr -k2,2nr -k3,3nr | head -n 1)
```
it does not work on macOS (the BSD `grep` that ships with macOS does not support the `-P` Perl-regex option).
<img width="1439" alt="image" src="https://github.com/pixie-io/pixie/assets/19339970/6ab411e1-373d-4aa8-ba2b-ce943d8774f4">
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem. Please make sure the screenshot does not contain any sensitive information such as API keys or access tokens.
**Logs**
Please attach the logs by running the following command:
```
./px collect-logs
```
**App information (please complete the following information):**
- Pixie version
- K8s cluster version
- Node Kernel version
- Browser version
**Additional context**
Add any other context about the problem here.
| 1.0 | the install command from document is wrong - **Describe the bug**
According to https://docs.px.dev/installing-pixie/install-guides/self-hosted-pixie,
```
export LATEST_CLOUD_RELEASE=$(git tag | grep -Po '(?<=release/cloud/v)[^\-]*$' | sort -t '.' -k1,1nr -k2,2nr -k3,3nr | head -n 1)
```
it does not work on macOS (the BSD `grep` that ships with macOS does not support the `-P` Perl-regex option).
<img width="1439" alt="image" src="https://github.com/pixie-io/pixie/assets/19339970/6ab411e1-373d-4aa8-ba2b-ce943d8774f4">
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem. Please make sure the screenshot does not contain any sensitive information such as API keys or access tokens.
**Logs**
Please attach the logs by running the following command:
```
./px collect-logs
```
**App information (please complete the following information):**
- Pixie version
- K8s cluster version
- Node Kernel version
- Browser version
**Additional context**
Add any other context about the problem here.
| non_priority | the install command from document is wrong describe the bug according to export latest cloud release git tag grep po release cloud v sort t head n it is not ok for mac img width alt image src to reproduce steps to reproduce the behavior go to click on scroll down to see error expected behavior a clear and concise description of what you expected to happen screenshots if applicable add screenshots to help explain your problem please make sure the screenshot does not contain any sensitive information such as api keys or access tokens logs please attach the logs by running the following command px collect logs app information please complete the following information pixie version cluster version node kernel version browser version additional context add any other context about the problem here | 0 |
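The failing pipeline above relies on GNU-only behavior (`grep -P` and GNU `sort` key flags), which is why it breaks on macOS. One portable option, shown here as a sketch rather than a drop-in replacement for the documented command, is to do the tag extraction and version comparison in Python; the sample tag list is illustrative:

```python
import re

# Portable alternative to `grep -Po ... | sort -k1,1nr -k2,2nr -k3,3nr | head -n 1`:
# pull cloud release versions out of `git tag` output and pick the highest.
tags = """release/cloud/v0.1.5
release/cloud/v0.2.0
release/cloud/v0.1.10
release/vizier/v0.3.0"""

pattern = re.compile(r"release/cloud/v(\d+)\.(\d+)\.(\d+)$")
versions = [tuple(map(int, m.groups()))
            for m in map(pattern.match, tags.splitlines()) if m]
latest = max(versions)  # numeric tuple comparison, not a lexical sort
print(".".join(map(str, latest)))  # 0.2.0
```

Note the numeric comparison correctly ranks `0.1.10` above `0.1.5`, which a plain lexical sort would get wrong; like the original regex's `[^\-]*$`, the anchored pattern also skips hyphenated pre-release tags.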
35,022 | 2,789,753,415 | IssuesEvent | 2015-05-08 21:16:41 | google/google-visualization-api-issues | https://api.github.com/repos/google/google-visualization-api-issues | opened | Visualization to understand XML | Priority-Low Type-Enhancement | Original [issue 55](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=55) created by google-admin on 2009-09-16T14:22:48.000Z:
<b>What would you like to see us add to this API?</b>
It would be nice if all visualizations also understood XML.
In that case, we would only have to fetch the XML from the server and pass it to
the visualization. This would remove the coding needed to fill the visualization
data.
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
| 1.0 | Visualization to understand XML - Original [issue 55](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=55) created by google-admin on 2009-09-16T14:22:48.000Z:
<b>What would you like to see us add to this API?</b>
It would be nice if all visualizations also understood XML.
In that case, we would only have to fetch the XML from the server and pass it to
the visualization. This would remove the coding needed to fill the visualization
data.
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
| priority | visualization to understand xml original created by google admin on what would you like to see us add to this api it would have been nice if all visualization also understands xml in that case we have to only fetch the xml from the server and pass it to visualization this will remove the coding need to fill the visualization data what component is this issue related to piechart linechart datatable query etc for developers viewing this issue please click the star icon to be notified of future changes and to let us know how many of you are interested in seeing it resolved | 1 |
193,458 | 6,885,114,881 | IssuesEvent | 2017-11-21 15:11:55 | qhacks/hacker-dashboard | https://api.github.com/repos/qhacks/hacker-dashboard | closed | Update font to be Encode Sans | priority: required (medium) | **Problem**
Currently the font family is not Encode Sans; it should be set and used throughout the app
**Requirements**
- [ ] Install Encode Sans
| 1.0 | Update font to be Encode Sans - **Problem**
Currently the font family is not Encode Sans; it should be set and used throughout the app
**Requirements**
- [ ] Install Encode Sans
| priority | update font to be encode sans problem current the font family is not encode this should be set and used throughout the app requirements install encode sans | 1 |
752,787 | 26,325,874,240 | IssuesEvent | 2023-01-10 06:32:28 | modrinth/knossos | https://api.github.com/repos/modrinth/knossos | closed | Inconsistencies between certain elements on whether they can be selected | priority: mid type: functional accessibility | ### Environment
Website
### Describe the bug
Some elements like log out and change theme buttons can't be selected, but almost every other button can be.

### Steps To Reproduce
1. Go to a user's profile
2. Select the entire page (CTRL + A)
3. You will see that some buttons aren't selected
### Expected behavior
The buttons should be consistently selectable. Either they should or they shouldn't be selectable.
### Additional context
_No response_ | 1.0 | Inconsistencies between certain elements on whether they can be selected - ### Environment
Website
### Describe the bug
Some elements like log out and change theme buttons can't be selected, but almost every other button can be.

### Steps To Reproduce
1. Go to a user's profile
2. Select the entire page (CTRL + A)
3. You will see that some buttons aren't selected
### Expected behavior
The buttons should be consistently selectable. Either they should or they shouldn't be selectable.
### Additional context
_No response_ | priority | inconsistancies between certian elements on whether they can be selected environment website describe the bug some elements like log out and change theme buttons can t be selected but almost every other button can be steps to reproduce go to a user s profile select the entire page ctrl a you will see that some buttons aren t selected expected behavior the buttons should be consistently selectable either they should or they shouldn t be selectable additional context no response | 1 |
84,974 | 15,728,375,373 | IssuesEvent | 2021-03-29 13:45:21 | ssobue/oauth2-provider | https://api.github.com/repos/ssobue/oauth2-provider | closed | CVE-2020-36187 (High) detected in jackson-databind-2.9.9.jar | security vulnerability | ## CVE-2020-36187 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.9.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /oauth2-provider/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.9/jackson-databind-2.9.9.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.5.RELEASE.jar (Root Library)
- spring-boot-starter-json-2.1.5.RELEASE.jar
- :x: **jackson-databind-2.9.9.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp.datasources.SharedPoolDataSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36187>CVE-2020-36187</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2997">https://github.com/FasterXML/jackson-databind/issues/2997</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-36187 (High) detected in jackson-databind-2.9.9.jar - ## CVE-2020-36187 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.9.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /oauth2-provider/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.9/jackson-databind-2.9.9.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.5.RELEASE.jar (Root Library)
- spring-boot-starter-json-2.1.5.RELEASE.jar
- :x: **jackson-databind-2.9.9.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp.datasources.SharedPoolDataSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36187>CVE-2020-36187</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2997">https://github.com/FasterXML/jackson-databind/issues/2997</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file provider pom xml path to vulnerable library root repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring boot starter web release jar root library spring boot starter json release jar x jackson databind jar vulnerable library vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache tomcat dbcp dbcp datasources sharedpooldatasource publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with whitesource | 0 |
189,322 | 22,047,017,672 | IssuesEvent | 2022-05-30 03:43:16 | panasalap/linux-4.19.72_test1 | https://api.github.com/repos/panasalap/linux-4.19.72_test1 | closed | WS-2021-0458 (Medium) detected in linux-yoctov5.4.51 - autoclosed | security vulnerability | ## WS-2021-0458 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/f1b7c617b9b8f4135ab2f75a0c407cc44d43683f">f1b7c617b9b8f4135ab2f75a0c407cc44d43683f</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/intel/i40e/i40e_main.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Linux/Kernel is vulnerable to freeing of uninitialized misc IRQ vector in drivers/net/ethernet/intel/i40e/i40e_main.c
<p>Publish Date: 2021-11-29
<p>URL: <a href=https://github.com/gregkh/linux/commit/75099439209d3cda439a1d9b00d19a50f0066fef>WS-2021-0458</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/UVI-2021-1001686">https://osv.dev/vulnerability/UVI-2021-1001686</a></p>
<p>Release Date: 2021-11-29</p>
<p>Fix Resolution: Linux/Kernel - v5.14.12, v5.10.73, v5.4.153, v4.19.211, v5.15-rc5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2021-0458 (Medium) detected in linux-yoctov5.4.51 - autoclosed - ## WS-2021-0458 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/f1b7c617b9b8f4135ab2f75a0c407cc44d43683f">f1b7c617b9b8f4135ab2f75a0c407cc44d43683f</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/intel/i40e/i40e_main.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The Linux kernel is vulnerable to freeing of an uninitialized misc IRQ vector in drivers/net/ethernet/intel/i40e/i40e_main.c
<p>Publish Date: 2021-11-29
<p>URL: <a href=https://github.com/gregkh/linux/commit/75099439209d3cda439a1d9b00d19a50f0066fef>WS-2021-0458</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/UVI-2021-1001686">https://osv.dev/vulnerability/UVI-2021-1001686</a></p>
<p>Release Date: 2021-11-29</p>
<p>Fix Resolution: Linux/Kernel - v5.14.12, v5.10.73, v5.4.153, v4.19.211, v5.15-rc5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws medium detected in linux autoclosed ws medium severity vulnerability vulnerable library linux yocto linux embedded kernel library home page a href found in head commit a href found in base branch master vulnerable source files drivers net ethernet intel main c vulnerability details in linux kernel is vulnerable to freeing of uninitialized misc irq vector in drivers net ethernet intel main c publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution linux kernel step up your open source security game with whitesource | 0 |
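The CVSS 3 metric breakdowns listed in the vulnerability records of this excerpt can be checked against the CVSS v3.0 base-score equations. The sketch below is a minimal implementation using the published v3.0 metric weights from the FIRST specification; it reproduces 5.1 for the Linux record above and 6.1 for the Bootstrap record further down.

```javascript
// CVSS v3.0 base-score sketch. Weights are the published v3.0 values.
const W = {
  AV: { N: 0.85, A: 0.62, L: 0.55, P: 0.2 },
  AC: { L: 0.77, H: 0.44 },
  PR: { U: { N: 0.85, L: 0.62, H: 0.27 },   // scope unchanged
        C: { N: 0.85, L: 0.68, H: 0.5 } },  // scope changed
  UI: { N: 0.85, R: 0.62 },
  CIA: { H: 0.56, L: 0.22, N: 0 },
};

// CVSS "Roundup": round up to one decimal place.
const roundUp1 = (x) => Math.ceil(x * 10) / 10;

function cvss3BaseScore({ AV, AC, PR, UI, S, C, I, A }) {
  const iscBase = 1 - (1 - W.CIA[C]) * (1 - W.CIA[I]) * (1 - W.CIA[A]);
  const exploitability = 8.22 * W.AV[AV] * W.AC[AC] * W.PR[S][PR] * W.UI[UI];
  const impact = S === "U"
    ? 6.42 * iscBase
    : 7.52 * (iscBase - 0.029) - 3.25 * Math.pow(iscBase - 0.02, 15);
  if (impact <= 0) return 0;
  const raw = S === "U" ? impact + exploitability : 1.08 * (impact + exploitability);
  return roundUp1(Math.min(raw, 10));
}

// WS-2021-0458: AV Local, AC Low, PR None, UI None, Scope Unchanged, C/I Low, A None
console.log(cvss3BaseScore({ AV: "L", AC: "L", PR: "N", UI: "N", S: "U", C: "L", I: "L", A: "N" })); // 5.1
// CVE-2018-14040: AV Network, AC Low, PR None, UI Required, Scope Changed, C/I Low, A None
console.log(cvss3BaseScore({ AV: "N", AC: "L", PR: "N", UI: "R", S: "C", C: "L", I: "L", A: "N" })); // 6.1
```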
315,062 | 9,606,237,929 | IssuesEvent | 2019-05-11 08:29:40 | StrangeLoopGames/EcoIssues | https://api.github.com/repos/StrangeLoopGames/EcoIssues | opened | [0.8.2.0 staging-6] Not all Tutorials appear after tutorial restart. | Medium Priority Regression Staging | Step to reproduce:
- start new world.
- skip all tutorials.
- restart tutorial.
- start to do all tutorials.
- Digging and Mining tutorials don't appear. | 1.0 | [0.8.2.0 staging-6] Not all Tutorials appear after tutorial restart. - Step to reproduce:
- start new world.
- skip all tutorials.
- restart tutorial.
- start to do all tutorials.
- Digging and Mining tutorials don't appear. | priority | not all tutorials appear after tutorial restart step to reproduce start new world skip all tutorials restart tutorial start to do all tutorials digging and mining tutorials don t appear | 1 |
249,988 | 7,966,219,553 | IssuesEvent | 2018-07-14 19:05:41 | tootsuite/mastodon | https://api.github.com/repos/tootsuite/mastodon | closed | Content of “Getting started” | enhancement priority - medium ui | “Getting started” was seemingly originally intended as a content screen with some tips to start using Mastodon. Its contents used to read as follow:
> You can follow people if you know their username and the domain they are on by entering an e-mail-esque address into the search form.
> If the target user is on the same domain as you, just the username will work. The same rule applies to mentioning people in statuses.
After #262 (and possibly other related threads), the screen gained a new purpose, being a navigation menu as well.
By the way, the current copy reads:
> Mastodon is open source software. You can contribute or report issues on GitHub at [tootsuite/mastodon]. [Various apps are available].
Issue #1856 aims to fix the navigation logic, possibly by freeing “Getting started” from its hybrid nature, making it a content screen only and implementing the menu as an independent element. See also #1850 about the name of “Getting started”, which can be influenced by what will be discussed here.
This thread is therefore about the textual content of “Getting started” (not about its role or not as a menu), what should it say, link to, etc. I’ll now write a separate message with first ideas and considerations :) | 1.0 | Content of “Getting started” - “Getting started” was seemingly originally intended as a content screen with some tips to start using Mastodon. Its contents used to read as follow:
> You can follow people if you know their username and the domain they are on by entering an e-mail-esque address into the search form.
> If the target user is on the same domain as you, just the username will work. The same rule applies to mentioning people in statuses.
After #262 (and possibly other related threads), the screen gained a new purpose, being a navigation menu as well.
By the way, the current copy reads:
> Mastodon is open source software. You can contribute or report issues on GitHub at [tootsuite/mastodon]. [Various apps are available].
Issue #1856 aims to fix the navigation logic, possibly by freeing “Getting started” from its hybrid nature, making it a content screen only and implementing the menu as an independent element. See also #1850 about the name of “Getting started”, which can be influenced by what will be discussed here.
This thread is therefore about the textual content of “Getting started” (not about its role or not as a menu), what should it say, link to, etc. I’ll now write a separate message with first ideas and considerations :) | priority | content of “getting started” “getting started” was seemingly originally intended as a content screen with some tips to start using mastodon its contents used to read as follow you can follow people if you know their username and the domain they are on by entering an e mail esque address into the search form if the target user is on the same domain as you just the username will work the same rule applies to mentioning people in statuses after and possibly other related threads the screen gained a new purpose being a navigation menu as well by the way the current copy reads mastodon is open source software you can contribute or report issues on github at issue has for purpose to fix the navigation logic possibly by freeing “getting started” from its hybrid nature by having it being a content screen only and implementing the menu as an independent element see also about the name of “getting started” which can be influenced by what will be discussed here this thread is therefore about the textual content of “getting started” not about its role or not as a menu what should it say link to etc i’ll now write a separate message with first ideas and considerations | 1 |
129,738 | 10,584,779,791 | IssuesEvent | 2019-10-08 16:05:00 | elastic/cloud-on-k8s | https://api.github.com/repos/elastic/cloud-on-k8s | closed | Flaky TestMutationHTTPSToHTTP | >flaky_test >test v1.0.0-beta1 | ```
13:04:29 --- FAIL: TestMutationHTTPSToHTTP/Elasticsearch_cluster_health_should_not_have_been_red_during_mutation_process (0.00s)
13:04:29 steps_mutation.go:116:
13:04:29 Error Trace: steps_mutation.go:116
13:04:29 Error: Not equal:
13:04:29 expected: 0
13:04:29 actual : 11
13:04:29 Test: TestMutationHTTPSToHTTP/Elasticsearch_cluster_health_should_not_have_been_red_during_mutation_process
13:04:29 steps_mutation.go:118: Elasticsearch cluster health check failure at 2019-10-08 13:03:52.782423029 +0000 UTC m=+1871.932566060: Get https://test-mutation-http-to-https-9zbr-es-masterdata-1.test-mutation-http-to-https-9zbr-es-masterdata.e2e-yp3cy-mercury:9200/_cluster/health: http: server gave HTTP response to HTTPS client
```
https://devops-ci.elastic.co/job/cloud-on-k8s-e2e-tests/441/ | 2.0 | Flaky TestMutationHTTPSToHTTP - ```
13:04:29 --- FAIL: TestMutationHTTPSToHTTP/Elasticsearch_cluster_health_should_not_have_been_red_during_mutation_process (0.00s)
13:04:29 steps_mutation.go:116:
13:04:29 Error Trace: steps_mutation.go:116
13:04:29 Error: Not equal:
13:04:29 expected: 0
13:04:29 actual : 11
13:04:29 Test: TestMutationHTTPSToHTTP/Elasticsearch_cluster_health_should_not_have_been_red_during_mutation_process
13:04:29 steps_mutation.go:118: Elasticsearch cluster health check failure at 2019-10-08 13:03:52.782423029 +0000 UTC m=+1871.932566060: Get https://test-mutation-http-to-https-9zbr-es-masterdata-1.test-mutation-http-to-https-9zbr-es-masterdata.e2e-yp3cy-mercury:9200/_cluster/health: http: server gave HTTP response to HTTPS client
```
https://devops-ci.elastic.co/job/cloud-on-k8s-e2e-tests/441/ | non_priority | flaky testmutationhttpstohttp fail testmutationhttpstohttp elasticsearch cluster health should not have been red during mutation process steps mutation go error trace steps mutation go error not equal expected actual test testmutationhttpstohttp elasticsearch cluster health should not have been red during mutation process steps mutation go elasticsearch cluster health check failure at utc m get http server gave http response to https client | 0 |
15,558 | 3,330,235,613 | IssuesEvent | 2015-11-11 09:16:18 | geetsisbac/ZUUIL5XN2UPJRUK244AZSHGO | https://api.github.com/repos/geetsisbac/ZUUIL5XN2UPJRUK244AZSHGO | closed | Qa5Ry66Z2x1imgtdi3colXEA5JhaxB5CI5bFOsR+qnfrqXMBN1Q5gbxm0gOziUyzRq/eKucOWPJNfwwZlmNcLwc4fQhXMIloV7vc5cmScOzbuElyXdo0mWerjlrOOw/hRz+zlyj9tZZ0KV1him5IiKXM6I3VcU7mtNUrsmIQ6QI= | design | whuvnkmiQd9jbuFenVRrmIt/utcyS6ZP/fb0w4Iin0XWuZKImbgWkA+aBylBT0/hwvocvRhRc5etTO0EDvnqmEiGwZLDED0gt1urq3Ib3YH/E2pAuHNHCRuA7CHrnZYD71L+KSzocLjL6v8VD7zW2EcwVd61ZVlK7FpiNN7cuSxf6e0amM/8QWfyUMRr0NXj8XgGK+YOMaenhUVw7QgYvLwrb1bN0ANFlrJoYxN7RU/oPp+bc5lNuxsdDiAEpHVhOXerJ82mRQsbyp37+9JsVTLJsqSKe/G95FQyexrFL3YeBWKvWhItq1+tMH84NG6c8K7GKZ+NJFAlWDNmWkBnjNE1eZxQfUim19/FVjRXzaV1YRdNEp12EydjbTlyVxhfG0gkQmUekPFHrjQkT7eM7lA57PSoIGAx9JiVvH3+exJsIJ90xarKEXYUa5v59ksrGw7F7hn6AWH1pDelWzUp6QvlqVVz6tZvAjZn8MFYuVBzYmsr2wsdC52RJ2r2DJYgRJvwWFS7QdMkgS3q51EagU0N2iOPPuvfbmWaS5N2dvwVJmdjV0TsgFrTYomgLfpkwMM8el7qf5HTQVHj5tL7Y11odrAL8HWIBEJFbsYDHB6TUnIkhetQLMnyjVMl/IqpUcoLCzPHi8wjlmZieTT22a6pFf50UoYw7vkdvBDMt5Hx79LVN8Scp6Y7qvTOo3r3hmXOCzeAZDYHaccxV+9LU4ER5PLZJDMXLrNaVXSbCmANMPJu1QvQuPUh5ePVlKTB3GqBOzwbQQxCUnzi+XlqywWjaBAd45Rk5hSDRq/IN5xdr+RwPe5eUJuN1333YaNzUd/BfPgne2ftw00Nxr2qHO+gnKd5evQBTQJyMowJqP1skmqWUMAOFJH6es8TnDMaGLWg7I8c8CLqLVKAyc9AT7sYXAsaUGDbzntqsZBMLZ3DWB/jctio/ssK7OyNLoSNq4aPpp2RAT26sndU1d/9A17QqCDdA1iKsh39+HLaahMVtBBNa8chqfgJ4yFTU5vhXwZbhJaBsKLvWAmGA+rVQXBksH93XvnQ54XorWfcOpmdEQbqBas/29oh1e+n7yr7v/shj9POkGf+gnU4m+kGRRuFJTb/GXl60tjgzAtSpPc9xZflG0dk9dZnORXIbC6vhhZEsGPV1rQzpAycyz5O7/9myjsOftETRmKzJSnJPuJs+AEz+uNNh/O5SKr0IAb2PgpdLj2wQrGaREOt3UsnoWU6ZfufZBZHZV/x/lFpTRNXWRUe4urRiAJ/igF4K8WOXxM3atRgT7Xzfc1gxW/Ot8oN63KfXzXt+MpikqssPuqQ+Hp0LbvGXdtlMJ5ONuQSd0Yc3BvNUefR31Mq1ilDpCHogzp9Nvw8Te5zIqODULs= | 1.0 | Qa5Ry66Z2x1imgtdi3colXEA5JhaxB5CI5bFOsR+qnfrqXMBN1Q5gbxm0gOziUyzRq/eKucOWPJNfwwZlmNcLwc4fQhXMIloV7vc5cmScOzbuElyXdo0mWerjlrOOw/hRz+zlyj9tZZ0KV1him5IiKXM6I3VcU7mtNUrsmIQ6QI= - 
whuvnkmiQd9jbuFenVRrmIt/utcyS6ZP/fb0w4Iin0XWuZKImbgWkA+aBylBT0/hwvocvRhRc5etTO0EDvnqmEiGwZLDED0gt1urq3Ib3YH/E2pAuHNHCRuA7CHrnZYD71L+KSzocLjL6v8VD7zW2EcwVd61ZVlK7FpiNN7cuSxf6e0amM/8QWfyUMRr0NXj8XgGK+YOMaenhUVw7QgYvLwrb1bN0ANFlrJoYxN7RU/oPp+bc5lNuxsdDiAEpHVhOXerJ82mRQsbyp37+9JsVTLJsqSKe/G95FQyexrFL3YeBWKvWhItq1+tMH84NG6c8K7GKZ+NJFAlWDNmWkBnjNE1eZxQfUim19/FVjRXzaV1YRdNEp12EydjbTlyVxhfG0gkQmUekPFHrjQkT7eM7lA57PSoIGAx9JiVvH3+exJsIJ90xarKEXYUa5v59ksrGw7F7hn6AWH1pDelWzUp6QvlqVVz6tZvAjZn8MFYuVBzYmsr2wsdC52RJ2r2DJYgRJvwWFS7QdMkgS3q51EagU0N2iOPPuvfbmWaS5N2dvwVJmdjV0TsgFrTYomgLfpkwMM8el7qf5HTQVHj5tL7Y11odrAL8HWIBEJFbsYDHB6TUnIkhetQLMnyjVMl/IqpUcoLCzPHi8wjlmZieTT22a6pFf50UoYw7vkdvBDMt5Hx79LVN8Scp6Y7qvTOo3r3hmXOCzeAZDYHaccxV+9LU4ER5PLZJDMXLrNaVXSbCmANMPJu1QvQuPUh5ePVlKTB3GqBOzwbQQxCUnzi+XlqywWjaBAd45Rk5hSDRq/IN5xdr+RwPe5eUJuN1333YaNzUd/BfPgne2ftw00Nxr2qHO+gnKd5evQBTQJyMowJqP1skmqWUMAOFJH6es8TnDMaGLWg7I8c8CLqLVKAyc9AT7sYXAsaUGDbzntqsZBMLZ3DWB/jctio/ssK7OyNLoSNq4aPpp2RAT26sndU1d/9A17QqCDdA1iKsh39+HLaahMVtBBNa8chqfgJ4yFTU5vhXwZbhJaBsKLvWAmGA+rVQXBksH93XvnQ54XorWfcOpmdEQbqBas/29oh1e+n7yr7v/shj9POkGf+gnU4m+kGRRuFJTb/GXl60tjgzAtSpPc9xZflG0dk9dZnORXIbC6vhhZEsGPV1rQzpAycyz5O7/9myjsOftETRmKzJSnJPuJs+AEz+uNNh/O5SKr0IAb2PgpdLj2wQrGaREOt3UsnoWU6ZfufZBZHZV/x/lFpTRNXWRUe4urRiAJ/igF4K8WOXxM3atRgT7Xzfc1gxW/Ot8oN63KfXzXt+MpikqssPuqQ+Hp0LbvGXdtlMJ5ONuQSd0Yc3BvNUefR31Mq1ilDpCHogzp9Nvw8Te5zIqODULs= | non_priority | hrz opp jctio kgrrufjtb aez unnh x mpikqsspuqq | 0 |
9,687 | 2,615,165,860 | IssuesEvent | 2015-03-01 06:46:13 | chrsmith/reaver-wps | https://api.github.com/repos/chrsmith/reaver-wps | opened | %90.90 issue | auto-migrated Priority-Triage Type-Defect | ```
i try
bt5 r3 - reaver 1.4
bt5 r3 - reaver 1.3
xiaopan - reaver 1.4
xiaopan - reaver 1.3
always same issue. stuck in %90,90 and repeating same pin..
3 different AP (but same modem model. Huweai)
```
Original issue reported on code.google.com by `s.wra...@gmail.com` on 12 Apr 2013 at 7:38 | 1.0 | %90.90 issue - ```
i try
bt5 r3 - reaver 1.4
bt5 r3 - reaver 1.3
xiaopan - reaver 1.4
xiaopan - reaver 1.3
always same issue. stuck in %90,90 and repeating same pin..
3 different AP (but same modem model. Huweai)
```
Original issue reported on code.google.com by `s.wra...@gmail.com` on 12 Apr 2013 at 7:38 | non_priority | issue i try reaver reaver xiaopan reaver xiaopan reaver always same issue stuck in and repating same pin different ap but same modem model huweai original issue reported on code google com by s wra gmail com on apr at | 0 |
47,045 | 2,971,671,925 | IssuesEvent | 2015-07-14 08:43:07 | clementine-player/Clementine | https://api.github.com/repos/clementine-player/Clementine | closed | the forward and backward keys support | enhancement imported Priority-Medium | _From [kmusoy](https://code.google.com/u/104377330607044402928/) on December 18, 2010 06:52:34_
Please describe the feature you're requesting in as much detail as possible. My mouse has two keys to go backward and forward, but Clementine doesn't support them.
_Original issue: http://code.google.com/p/clementine-player/issues/detail?id=1129_ | 1.0 | the forward and backward keys support - _From [kmusoy](https://code.google.com/u/104377330607044402928/) on December 18, 2010 06:52:34_
Please describe the feature you're requesting in as much detail as possible. My mouse has two keys to go backward and forward, but Clementine doesn't support them.
_Original issue: http://code.google.com/p/clementine-player/issues/detail?id=1129_ | priority | the forward and backward keys support from on december please describe the feature you re requesting in as much detail as possible my mouse has two keys to go backward and forward but clementine don t support them original issue | 1 |
642,430 | 20,887,734,564 | IssuesEvent | 2022-03-23 07:47:32 | bedita/manager | https://api.github.com/repos/bedita/manager | closed | Intercept error on upload file bigger than post_max_size or max_upload_size | bug Priority - Normal | When uploading a file bigger than post_max_size or max_upload_size, BEdita Manager raises a CSRF error, because request body is empty (globals `$_FILES` and `$_POST` are empty).
We should intercept the error and provide a proper message to the user.
Adding a `MAX_FILE_SIZE` field to the form allows the PHP server to intercept the error (as explained in https://www.php.net/manual/en/features.file-upload.post-method.php), opening a blank page with error `413 Request Entity Too Large`
i.e.:
```
{{ Form.hidden('MAX_FILE_SIZE', {'value': '3000'})|raw }}
```
Still, the error page is not friendly. We should somehow intercept the error.
Tip: use javascript file size validation, before uploading.
| 1.0 | Intercept error on upload file bigger than post_max_size or max_upload_size - When uploading a file bigger than post_max_size or max_upload_size, BEdita Manager raises a CSRF error, because request body is empty (globals `$_FILES` and `$_POST` are empty).
We should intercept the error and provide a proper message to the user.
Adding a `MAX_FILE_SIZE` field to the form allows the PHP server to intercept the error (as explained in https://www.php.net/manual/en/features.file-upload.post-method.php), opening a blank page with error `413 Request Entity Too Large`
i.e.:
```
{{ Form.hidden('MAX_FILE_SIZE', {'value': '3000'})|raw }}
```
Still, the error page is not friendly. We should somehow intercept the error.
Tip: use javascript file size validation, before uploading.
| priority | intercept error on upload file bigger than post max size or max upload size when uploading a file bigger than post max size or max upload size bedita manager raises a csrf error because request body is empty globals files and post are empty we should intercept the error and provide a proper message to the user adding a field max file size to the form allows php server to intercept the error as explained in opening a blank page with error request entity too large i e form hidden max file size value raw still the error page is not friendly we should intercept somehow the error tip use javascript file size validation before uploading | 1 |
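The closing tip in the record above (validate the file size in JavaScript before uploading) could look roughly like this sketch. The 8 MB limit and the message wording are illustrative assumptions, not values taken from BEdita Manager; in practice the limit should mirror the server's `upload_max_filesize` and `post_max_size` settings.

```javascript
// Minimal client-side size check, run before the form is submitted, so the
// user sees a clear message instead of the server-side 413 / CSRF failure.
// MAX_UPLOAD_BYTES is an assumed example limit, not a BEdita default.
const MAX_UPLOAD_BYTES = 8 * 1024 * 1024;

function checkFileSize(file, maxBytes = MAX_UPLOAD_BYTES) {
  if (file.size > maxBytes) {
    return {
      ok: false,
      message: `File is ${file.size} bytes; the maximum allowed is ${maxBytes} bytes.`,
    };
  }
  return { ok: true, message: "" };
}

// Browser wiring (sketch): block submission when the check fails.
// form.addEventListener("submit", (e) => {
//   const res = checkFileSize(fileInput.files[0]);
//   if (!res.ok) { e.preventDefault(); alert(res.message); }
// });
```

Note that this only improves the error message: the server-side limits still apply, and oversized requests that bypass the check must still be handled on the server.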
208,712 | 7,157,646,535 | IssuesEvent | 2018-01-26 20:41:39 | status-im/status-react | https://api.github.com/repos/status-im/status-react | closed | App version for release build is 0.9.14d3 instead of 0.9.13 | high-priority release |
### Description
[comment]: # (Feature or Bug? i.e Type: Bug)
*Type*: Bug
[comment]: # (Describe the feature you would like, or briefly summarise the bug and what you did, what you expected to happen, and what actually happens. Sections below)
*Summary*:
On latest release build 70:
iOS app version: 0.9.14d3
Android app version: 0.9.13-3-g848bcaea+

Note that on release build 67 versions were as expected (0.9.13)
#### Expected behavior
[comment]: # (Describe what you expected to happen.)
iOS app version: 0.9.13
Android app version: 0.9.13
#### Actual behavior
[comment]: # (Describe what actually happened.)
iOS app version: 0.9.14d3
Android app version: 0.9.13-3-g848bcaea+
### Reproduction
[comment]: # (Describe how we can replicate the bug step by step.)
- Install Status
- Check app version:
for Android: OS Settings
for iOS: General -> iPhone Storage -> Status
* Status version: release build 70
* Operating System: Android and iOS real devices
| 1.0 | App version for release build is 0.9.14d3 instead of 0.9.13 -
### Description
[comment]: # (Feature or Bug? i.e Type: Bug)
*Type*: Bug
[comment]: # (Describe the feature you would like, or briefly summarise the bug and what you did, what you expected to happen, and what actually happens. Sections below)
*Summary*:
On latest release build 70:
iOS app version: 0.9.14d3
Android app version: 0.9.13-3-g848bcaea+

Note that on release build 67 versions were as expected (0.9.13)
#### Expected behavior
[comment]: # (Describe what you expected to happen.)
iOS app version: 0.9.13
Android app version: 0.9.13
#### Actual behavior
[comment]: # (Describe what actually happened.)
iOS app version: 0.9.14d3
Android app version: 0.9.13-3-g848bcaea+
### Reproduction
[comment]: # (Describe how we can replicate the bug step by step.)
- Install Status
- Check app version:
for Android: OS Settings
for iOS: General -> iPhone Storage -> Status
* Status version: release build 70
* Operating System: Android and iOS real devices
| priority | app version for release build is instead of description feature or bug i e type bug type bug describe the feature you would like or briefly summarise the bug and what you did what you expected to happen and what actually happens sections below summary on latest release build ios app version android app version note that on release build versions were as expected expected behavior describe what you expected to happen ios app version android app version actual behavior describe what actually happened ios app version android app version reproduction describe how we can replicate the bug step by step install status check app version for android os settings for ios general iphone storage status status version release build operating system android and ios real devices | 1 |
50,415 | 13,527,846,823 | IssuesEvent | 2020-09-15 15:55:36 | jgeraigery/salt | https://api.github.com/repos/jgeraigery/salt | opened | CVE-2018-14040 (Medium) detected in bootstrap-2.3.0.js, bootstrap-2.3.0.min.js | security vulnerability | ## CVE-2018-14040 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-2.3.0.js</b>, <b>bootstrap-2.3.0.min.js</b></p></summary>
<p>
<details><summary><b>bootstrap-2.3.0.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.js</a></p>
<p>Path to vulnerable library: salt/doc/_themes/saltstack/static/js/vendor/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-2.3.0.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-2.3.0.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: salt/doc/_themes/saltstack/static/js/vendor/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-2.3.0.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/salt/commit/1362ce577176c220b7b67ea2b8560e112474f1c9">1362ce577176c220b7b67ea2b8560e112474f1c9</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040>CVE-2018-14040</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"2.3.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:2.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"2.3.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:2.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"}],"vulnerabilityIdentifier":"CVE-2018-14040","vulnerabilityDetails":"In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | True | CVE-2018-14040 (Medium) detected in bootstrap-2.3.0.js, bootstrap-2.3.0.min.js - ## CVE-2018-14040 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-2.3.0.js</b>, <b>bootstrap-2.3.0.min.js</b></p></summary>
<p>
<details><summary><b>bootstrap-2.3.0.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.js</a></p>
<p>Path to vulnerable library: salt/doc/_themes/saltstack/static/js/vendor/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-2.3.0.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-2.3.0.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.3.0/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: salt/doc/_themes/saltstack/static/js/vendor/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-2.3.0.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/salt/commit/1362ce577176c220b7b67ea2b8560e112474f1c9">1362ce577176c220b7b67ea2b8560e112474f1c9</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040>CVE-2018-14040</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"2.3.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:2.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"2.3.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:2.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"}],"vulnerabilityIdentifier":"CVE-2018-14040","vulnerabilityDetails":"In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | non_priority | cve medium detected in bootstrap js bootstrap min js cve medium severity vulnerability vulnerable libraries bootstrap js bootstrap min js bootstrap js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library salt doc themes saltstack static js vendor bootstrap js dependency hierarchy x bootstrap js vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library salt doc themes saltstack static js vendor bootstrap min js dependency hierarchy x bootstrap min js vulnerable library found in head commit a href found in base branch develop vulnerability details in bootstrap before xss is possible in the collapse data parent attribute publish date url a href cvss score 
details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org webjars npm bootstrap org webjars bootstrap isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in bootstrap before xss is possible in the collapse data parent attribute vulnerabilityurl | 0 |
411,027 | 12,013,410,856 | IssuesEvent | 2020-04-10 08:47:52 | AY1920S2-CS2103-W15-2/main | https://api.github.com/repos/AY1920S2-CS2103-W15-2/main | closed | Implement more reactive GUI components | priority.Medium type.Enhancement | Currently, many tasks, such as viewing the list of attributes/metrics/questions/interviewees are only achievable via commands on the command line interface. As these features are very frequently used, it would be very convenient to implement the above lists as tabs, similar to a web browser, so that the user can navigate between them easily. | 1.0 | Implement more reactive GUI components - Currently, many tasks, such as viewing the list of attributes/metrics/questions/interviewees are only achievable via commands on the command line interface. As these features are very frequently used, it would be very convenient to implement the above lists as tabs, similar to a web browser, so that the user can navigate between them easily. | priority | implement more reactive gui components currently many tasks such as viewing the list of attributes metrics questions interviewees are only achievable via commands on the command line interface as these features are very frequently used it would be very convenient to implement the above lists as tabs similar to a web browser so that the user can navigate between them easily | 1 |
561,005 | 16,608,957,208 | IssuesEvent | 2021-06-02 09:08:05 | arnog/mathlive | https://api.github.com/repos/arnog/mathlive | closed | Arrow key navigation gets stuck | bug high priority | ## Description
I typed something and then pressed the right arrow key. To my surprise, it didn't work. Instead, I only heard the "plonk" sound.

## Steps to Reproduce
1. Go to https://cortexjs.io/mathlive/
2. Paste in `\begin{equation*} x=\frac{-b\pm\sqrt{b^2-4ac}}{2a}\R \end{equation*}`
3. Place the caret at the beginning and repeatedly press the right arrow key, until you get stuck just in front of the `\R`
### Actual Behavior
Caret gets stuck. I believe this happens with any instance of `\R` or `\Z` or `\N`. Funnily enough, it does not happen with `\C`
### Expected Behavior
I expected the caret to move to after the `\R`
### Environment
**MathLive version** 0.68.0
**Operating System** Windows 10
**Browser** Firefox 88, Microsoft Edge
| 1.0 | Arrow key navigation gets stuck - ## Description
I typed something and then pressed the right arrow key. To my surprise, it didn't work. Instead, I only heard the "plonk" sound.

## Steps to Reproduce
1. Go to https://cortexjs.io/mathlive/
2. Paste in `\begin{equation*} x=\frac{-b\pm\sqrt{b^2-4ac}}{2a}\R \end{equation*}`
3. Place the caret at the beginning and repeatedly press the right arrow key, until you get stuck just in front of the `\R`
### Actual Behavior
Caret gets stuck. I believe this happens with any instance of `\R` or `\Z` or `\N`. Funnily enough, it does not happen with `\C`
### Expected Behavior
I expected the caret to move to after the `\R`
### Environment
**MathLive version** 0.68.0
**Operating System** Windows 10
**Browser** Firefox 88, Microsoft Edge
| priority | arrow key navigation gets stuck description i typed something and then pressed the right arrow key to my surprise it didn t work instead i only heard the plonk sound steps to reproduce go to paste in begin equation x frac b pm sqrt b r end equation place the caret at the beginning and repeatedly press the right arrow key until you get stuck just in front of the r actual behavior caret gets stuck i believe this happens with any instance of r or z or n funnily enough it does not happen with c expected behavior i expected the caret to move to after the r environment mathlive version operating system windows browser firefox microsoft edge | 1 |
279,422 | 8,665,231,727 | IssuesEvent | 2018-11-28 22:37:48 | OfficeDev/office-ui-fabric-react | https://api.github.com/repos/OfficeDev/office-ui-fabric-react | opened | Facepile: overflow button title not rendered in Firefox | Component: Facepile Priority 3: Fit and finish Type: Bug :bug: | ### Environment Information
- __Package version(s)__: latest and prior
- __Browser and OS versions__: Firefox _only_
### Please provide a reproduction of the bug in a codepen:
The `Facepile` overflow button's title is not rendered in Firefox likely due to how Firefox interprets `<button>` and its child elements (it is more strict). This can be seen by choosing the "descriptive" type of "Overflow Button Type" in the "Facepile with overflow buttons" example: https://developer.microsoft.com/en-us/fabric#/components/facepile
Discovered why reviewing #7220.
#### Actual behavior:
Overflow button title is **not** rendered in Firefox _only_.

#### Expected behavior:
Overflow button title is rendered across _all supported browsers_.
#### Priorities and help requested:
Are you willing to submit a PR to fix? Yes
Requested priority: Low
Products/sites affected: Whomever uses Facepile and supports Firefox
| 1.0 | Facepile: overflow button title not rendered in Firefox - ### Environment Information
- __Package version(s)__: latest and prior
- __Browser and OS versions__: Firefox _only_
### Please provide a reproduction of the bug in a codepen:
The `Facepile` overflow button's title is not rendered in Firefox likely due to how Firefox interprets `<button>` and its child elements (it is more strict). This can be seen by choosing the "descriptive" type of "Overflow Button Type" in the "Facepile with overflow buttons" example: https://developer.microsoft.com/en-us/fabric#/components/facepile
Discovered why reviewing #7220.
#### Actual behavior:
Overflow button title is **not** rendered in Firefox _only_.

#### Expected behavior:
Overflow button title is rendered across _all supported browsers_.
#### Priorities and help requested:
Are you willing to submit a PR to fix? Yes
Requested priority: Low
Products/sites affected: Whomever uses Facepile and supports Firefox
| priority | facepile overflow button title not rendered in firefox environment information package version s latest and prior browser and os versions firefox only please provide a reproduction of the bug in a codepen the facepile overflow button s title is not rendered in firefox likely due to how firefox interprets and its child elements it is more strict this can be seen by choosing the descriptive type of overflow button type in the facepile with overflow buttons example discovered why reviewing actual behavior overflow button title is not rendered in firefox only expected behavior overflow button title is rendered across all supported browsers priorities and help requested are you willing to submit a pr to fix yes requested priority low products sites affected whomever uses facepile and supports firefox | 1 |
60,284 | 3,122,382,159 | IssuesEvent | 2015-09-06 13:48:38 | HubTurbo/HubTurbo | https://api.github.com/repos/HubTurbo/HubTurbo | closed | New milestones cannot be used in filters unless the cache is refreshed | feature-milestones priority.high type.bug | It happened again.
`milestone:V5.51` does not capture issues assigned to it unless I refresh the cache. Note that the milestone V5.51 was created after the previous refresh of cache. | 1.0 | New milestones cannot be used in filters unless the cache is refreshed - It happened again.
`milestone:V5.51` does not capture issues assigned to it unless I refresh the cache. Note that the milestone V5.51 was created after the previous refresh of cache. | priority | new milestones cannot be used in filters unless the cache is refreshed it happened again milestone does not capture issues assigned to it unless i refresh the cache note that the milestone was created after the previous refresh of cache | 1 |
571,086 | 17,023,239,775 | IssuesEvent | 2021-07-03 01:00:38 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Osmarender residential landuse areas too light. | Component: osmarender Priority: trivial Resolution: fixed Type: enhancement | **[Submitted to the original trac issue database at 6.58pm, Wednesday, 23rd April 2008]**
They're barely perceptible. Would love it if landuse=residential were just a few bits darker.
Would also like natural=scrub to be a slightly different colour to natural=wood, but I understand many people think there are enough different shades of green in osmarender as it is. | 1.0 | Osmarender residential landuse areas too light. - **[Submitted to the original trac issue database at 6.58pm, Wednesday, 23rd April 2008]**
They're barely perceptible. Would love it if landuse=residential were just a few bits darker.
Would also like natural=scrub to be a slightly different colour to natural=wood, but I understand many people think there are enough different shades of green in osmarender as it is. | priority | osmarender residential landuse areas too light they re barely perceptible would love it if landuse residential were just a few bits darker would also like natural scrub to be a slightly different colour to natural wood but i understand many people think there are enough different shades of green in osmarender as it is | 1 |
576,444 | 17,087,092,229 | IssuesEvent | 2021-07-08 13:12:13 | bireme/fi-admin | https://api.github.com/repos/bireme/fi-admin | closed | Please generate DeCS 2021 in ISIS format | priority 1 tesauro | To allow generating the spreadsheet of annual changes to the hierarchical codes with all the information the user needs, in addition to showing the columns for the 2020 descriptors, the file shows in its last columns the 2021 descriptors corresponding to those codes.
For this reason, and also so that we have the definitive frozen version of the DeCS 2021 edition in ISIS format, I am requesting its creation again.
Note: this issue can be done before issue #1231, about loading the modified indexing notes, since the notes do not appear in the spreadsheet mentioned above. | 1.0 | Please generate DeCS 2021 in ISIS format - To allow generating the spreadsheet of annual changes to the hierarchical codes with all the information the user needs, in addition to showing the columns for the 2020 descriptors, the file shows in its last columns the 2021 descriptors corresponding to those codes.
For this reason, and also so that we have the definitive frozen version of the DeCS 2021 edition in ISIS format, I am requesting its creation again.
Note: this issue can be done before issue #1231, about loading the modified indexing notes, since the notes do not appear in the spreadsheet mentioned above. | priority | please generate decs in isis format to allow generating the spreadsheet of annual changes to the hierarchical codes with all the information the user needs in addition to showing the columns for the descriptors the file shows in its last columns the descriptors corresponding to those codes for this reason and also so that we have the definitive frozen version of the decs edition in isis format i am requesting its creation again note this issue can be done before issue about loading the modified indexing notes since the notes do not appear in the spreadsheet mentioned above | 1 |
289,931 | 32,009,875,670 | IssuesEvent | 2023-09-21 17:13:55 | hinoshiba/news | https://api.github.com/repos/hinoshiba/news | opened | [SecurityWeek] Atlassian Security Updates Patch High-Severity Vulnerabilities | SecurityWeek |
Atlassian has released patches for multiple high-severity vulnerabilities in Jira, Confluence, Bitbucket, and Bamboo products.
The post [Atlassian Security Updates Patch High-Severity Vulnerabilities](https://www.securityweek.com/atlassian-security-updates-patch-high-severity-vulnerabilities/) appeared first on [SecurityWeek](https://www.securityweek.com).
<https://www.securityweek.com/atlassian-security-updates-patch-high-severity-vulnerabilities/>
| True | [SecurityWeek] Atlassian Security Updates Patch High-Severity Vulnerabilities -
Atlassian has released patches for multiple high-severity vulnerabilities in Jira, Confluence, Bitbucket, and Bamboo products.
The post [Atlassian Security Updates Patch High-Severity Vulnerabilities](https://www.securityweek.com/atlassian-security-updates-patch-high-severity-vulnerabilities/) appeared first on [SecurityWeek](https://www.securityweek.com).
<https://www.securityweek.com/atlassian-security-updates-patch-high-severity-vulnerabilities/>
| non_priority | atlassian security updates patch high severity vulnerabilities atlassian has released patches for multiple high severity vulnerabilities in jira confluence bitbucket and bamboo products the post appeared first on | 0 |
162,629 | 20,235,413,238 | IssuesEvent | 2022-02-14 01:06:39 | tildabio/aact | https://api.github.com/repos/tildabio/aact | opened | CVE-2022-23633 (High) detected in actionpack-6.0.0.gem | security vulnerability | ## CVE-2022-23633 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>actionpack-6.0.0.gem</b></p></summary>
<p>Web apps on Rails. Simple, battle-tested conventions for building and testing MVC web applications. Works with any Rack-compatible server.</p>
<p>Library home page: <a href="https://rubygems.org/gems/actionpack-6.0.0.gem">https://rubygems.org/gems/actionpack-6.0.0.gem</a></p>
<p>
Dependency Hierarchy:
- :x: **actionpack-6.0.0.gem** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Action Pack is a framework for handling and responding to web requests. Under certain circumstances response bodies will not be closed. In the event a response is *not* notified of a `close`, `ActionDispatch::Executor` will not know to reset thread local state for the next request. This can lead to data being leaked to subsequent requests. This has been fixed in Rails 7.0.2.1, 6.1.4.5, 6.0.4.5, and 5.2.6.1. Upgrading is highly recommended, but to work around this problem a middleware described in GHSA-wh98-p28r-vrc9 can be used.
<p>Publish Date: 2022-02-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23633>CVE-2022-23633</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/rails/security/advisories/GHSA-wh98-p28r-vrc9">https://github.com/rails/rails/security/advisories/GHSA-wh98-p28r-vrc9</a></p>
<p>Release Date: 2022-02-11</p>
<p>Fix Resolution: 5.2.6.2, 6.0.4.6, 6.1.4.6, 7.0.2.2
</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-23633 (High) detected in actionpack-6.0.0.gem - ## CVE-2022-23633 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>actionpack-6.0.0.gem</b></p></summary>
<p>Web apps on Rails. Simple, battle-tested conventions for building and testing MVC web applications. Works with any Rack-compatible server.</p>
<p>Library home page: <a href="https://rubygems.org/gems/actionpack-6.0.0.gem">https://rubygems.org/gems/actionpack-6.0.0.gem</a></p>
<p>
Dependency Hierarchy:
- :x: **actionpack-6.0.0.gem** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Action Pack is a framework for handling and responding to web requests. Under certain circumstances response bodies will not be closed. In the event a response is *not* notified of a `close`, `ActionDispatch::Executor` will not know to reset thread local state for the next request. This can lead to data being leaked to subsequent requests. This has been fixed in Rails 7.0.2.1, 6.1.4.5, 6.0.4.5, and 5.2.6.1. Upgrading is highly recommended, but to work around this problem a middleware described in GHSA-wh98-p28r-vrc9 can be used.
<p>Publish Date: 2022-02-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23633>CVE-2022-23633</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/rails/security/advisories/GHSA-wh98-p28r-vrc9">https://github.com/rails/rails/security/advisories/GHSA-wh98-p28r-vrc9</a></p>
<p>Release Date: 2022-02-11</p>
<p>Fix Resolution: 5.2.6.2, 6.0.4.6, 6.1.4.6, 7.0.2.2
</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in actionpack gem cve high severity vulnerability vulnerable library actionpack gem web apps on rails simple battle tested conventions for building and testing mvc web applications works with any rack compatible server library home page a href dependency hierarchy x actionpack gem vulnerable library found in base branch master vulnerability details action pack is a framework for handling and responding to web requests under certain circumstances response bodies will not be closed in the event a response is not notified of a close actiondispatch executor will not know to reset thread local state for the next request this can lead to data being leaked to subsequent requests this has been fixed in rails and upgrading is highly recommended but to work around this problem a middleware described in ghsa can be used publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
714,764 | 24,574,081,439 | IssuesEvent | 2022-10-13 10:55:01 | joomlahenk/fabrik | https://api.github.com/repos/joomlahenk/fabrik | closed | autofill form plugin: Confirmation alert keeps popping-up | Medium Priority | Doesn't save 'Autofill confirmation' setting.
Confirmation alert is showing up twice. | 1.0 | autofill form plugin: Confirmation alert keeps popping-up - Doesn't save 'Autofill confirmation' setting.
Confirmation alert is showing up twice. | priority | autofill form plugin confirmation alert keeps popping up doesn t save autofill confirmation setting confirmation alert is showing up twice | 1 |
753,281 | 26,343,129,097 | IssuesEvent | 2023-01-10 19:27:35 | yugabyte/yugabyte-db | https://api.github.com/repos/yugabyte/yugabyte-db | closed | [DocDB] Excessive re-allocations through SubDocument::AllocateChild | kind/enhancement area/docdb priority/medium | Jira Link: [DB-4568](https://yugabyte.atlassian.net/browse/DB-4568)
### Description
Based on perf analysis with benchbase and ScanG7 workload it turned out we are spending ~25% of CPU inside `SubDocument::AllocateChild` and we clear `SubDocument DocRowwiseIterator::row_` and do reallocation on every row.
| 1.0 | [DocDB] Excessive re-allocations through SubDocument::AllocateChild - Jira Link: [DB-4568](https://yugabyte.atlassian.net/browse/DB-4568)
### Description
Based on perf analysis with benchbase and ScanG7 workload it turned out we are spending ~25% of CPU inside `SubDocument::AllocateChild` and we clear `SubDocument DocRowwiseIterator::row_` and do reallocation on every row.
| priority | excessive re allocations through subdocument allocatechild jira link description based on perf analysis with benchbase and workload it turned out we are spending of cpu inside subdocument allocatechild and we clear subdocument docrowwiseiterator row and do reallocation on every row | 1 |
34,508 | 9,383,432,860 | IssuesEvent | 2019-04-05 03:26:25 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | :gather_string failed (Exit 127) | stat:awaiting response subtype:windows type:build/install | Hello all,
I'm having issues installing tensorflow on my windows system using bazel. If there is any other alternative or fix please let me know
- **Have I written custom code (as opposed to using a stock example script provided in TensorFlow)**: No
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: Windows 10 Build 17134
- **Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device**:
- **TensorFlow installed from (source or binary)**: Yes
- **TensorFlow version (use command below)**: Latest version (I'm building it off current master, can't see which version it is)
- **Python version**: 3.6.6
- **Bazel version (if compiling from source)**: 0.23.2
- **GCC/Compiler version (if compiling from source)**: Using VS2017 with v140 toolkit
- **CUDA/cuDNN version**: CUDA; 10.1.105 cudnn:7.5.0
- **GPU model and memory**: GTX 1060 6GB
- **Exact command to reproduce**:
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
python ./configure.py
bazel build --config=opt --config=cuda --define=no_tensorflow_py_deps=true --copt=-nvcc_options=disable-warnings //tensorflow/tools/pip_package:build_pip_package
### Describe the problem
Hitting this error
ERROR: C:/users/darshan/tensorflow/tensorflow/lite/python/testdata/BUILD:35:1: Executing genrule //tensorflow/lite/python/testdata:gather_string failed (Exit 127): bash.exe failed: error executing command
### Source code / logs
INFO: From Linking tensorflow/lite/toco/toco:
LINK : warning LNK4044: unrecognized option '/Wl,-rpath,../local_config_cuda/cuda/lib64'; ignored
LINK : warning LNK4044: unrecognized option '/Wl,-rpath,../local_config_cuda/cuda/extras/CUPTI/lib64'; ignored
LINK : warning LNK4044: unrecognized option '/ldl'; ignored
LINK : warning LNK4044: unrecognized option '/ldl'; ignored
LINK : warning LNK4044: unrecognized option '/ldl'; ignored
Creating library bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco.lib and object bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco.exp
libcudnn_plugin.lo(cuda_dnn.o) : warning LNK4217: locally defined symbol ?ThenBlasGemm@Stream@stream_executor@@QEAAAEAV12@W4Transpose@blas@2@0_K11MAEBV?$DeviceMemory@M@2@H2HMPEAV52@H@Z (public: class stream_executor::Stream & __cdecl stream_executor::Stream::ThenBlasGemm(enum stream_executor::blas::Transpose,enum stream_executor::blas::Transpose,unsigned __int64,unsigned __int64,unsigned __int64,float,class stream_executor::DeviceMemory<float> const &,int,class stream_executor::DeviceMemory<float> const &,int,float,class stream_executor::DeviceMemory<float> *,int)) imported in function "public: virtual bool __cdecl stream_executor::gpu::CudnnSupport::DoMatMul(class stream_executor::Stream *,class stream_executor::DeviceMemory<float> const &,class stream_executor::DeviceMemory<float> const &,class stream_executor::dnn::BatchDescriptor const &,class stream_executor::dnn::BatchDescriptor const &,class stream_executor::DeviceMemory<float> *)" (?DoMatMul@CudnnSupport@gpu@stream_executor@@UEAA_NPEAVStream@3@AEBV?$DeviceMemory@M@3@1AEBVBatchDescriptor@dnn@3@2PEAV53@@Z)
ERROR: C:/users/darshan/tensorflow/tensorflow/lite/python/testdata/BUILD:35:1: Executing genrule //tensorflow/lite/python/testdata:gather_string failed (Exit 127): bash.exe failed: error executing command
cd C:/users/darshan/_bazel_darshan/jojkqojs/execroot/org_tensorflow
SET CUDA_TOOLKIT_PATH=C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.1
SET CUDNN_INSTALL_PATH=C:/Program Files/cuda
SET PATH=D:\Programs\msys2\usr\bin;D:\Programs\msys2\bin;D:\Programs\Python Installed\Scripts\;D:\Programs\Python Installed\;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\libnvvp;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Strawberry\c\bin;C:\Strawberry\perl\site\bin;C:\Strawberry\perl\bin;C:\Program Files (x86)\GnuWin32\bin;C:\Program Files\Git\cmd;D:\Software\MPC;C:\Program Files\dotnet\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\Java\jre1.8.0_181\bin;C:\Program Files\PuTTY\;C:\Program Files (x86)\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\;C:\Program Files\NVIDIA Corporation\NVIDIA NvDLISR;C:\Program Files\NVIDIA Corporation\Nsight Compute 2019.1\;C:\Users\Darshan\AppData\Local\Microsoft\WindowsApps;C:\Users\Darshan\AppData\Local\Programs\Microsoft VS Code\bin;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\bin\Firefox_Extension\{442718d9-475e-452a-b3e1-fb1ee16b8e9f}\components;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\ucrt;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\bin\Qt;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\ucrt;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\bin\Ssl;C:\Users\Darshan\AppData\Local\GitHubDesktop\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\extras\CUPTI\lib64;D:\Programs\Python Installed;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\tools;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\lib\x64;C:\Program 
Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\include;D:\Programs;C:\Program Files\Java\jdk1.8.0_201;D:\Programs\msys2\usr\bin;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC;
SET PYTHON_BIN_PATH=D:/Programs/Python Installed/python.exe
SET PYTHON_LIB_PATH=D:/Programs/Python Installed/lib/site-packages
SET TF_CUDA_CLANG=0
SET TF_CUDA_COMPUTE_CAPABILITIES=6.1
SET TF_CUDA_VERSION=10.1
SET TF_CUDNN_VERSION=7
SET TF_NEED_CUDA=1
SET TF_NEED_OPENCL_SYCL=0
SET TF_NEED_ROCM=0
D:/Programs/msys2/usr/bin/bash.exe -c source external/bazel_tools/tools/genrule/genrule-setup.sh; bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE --input_file=tensorflow/lite/python/testdata/gather.pbtxt --output_file=bazel-out/x64_windows-opt/genfiles/tensorflow/lite/python/testdata/gather_string.tflite --input_arrays=input,indices --output_arrays=output
Execution platform: @bazel_tools//platforms:host_platform
C:/users/darshan/_bazel_darshan/jojkqojs/execroot/org_tensorflow/bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco: error while loading shared libraries: cudnn64_7.dll: cannot open shared object file: No such file or directory
Target //tensorflow/tools/pip_package:build_pip_package failed to build
INFO: Elapsed time: 126.000s, Critical Path: 24.69s
INFO: 32 processes: 32 local.
FAILED: Build did NOT complete successfully
Previously I tried another combination of tensorflow-1.12 and bazel-0.15, with everything else the same, but I kept running into another error that I searched around for a lot; there were threads about it but no solution. So I started all over with the latest version. (My protobuf version is 5.6.0; I saw another thread where they downgraded it to 5.2.1. I don't know if that will really help me here, since I'm configuring the latest version to work on my system.)
Any and all help is really appreciated!!
--
Thanks
Best Regards | 1.0 | :gather_string failed (Exit 127) - Hello all,
I'm having issues installing tensorflow on my windows system using bazel. If there is any other alternative or fix please let me know
- **Have I written custom code (as opposed to using a stock example script provided in TensorFlow)**: No
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: Windows 10 Build 17134
- **Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device**:
- **TensorFlow installed from (source or binary)**: Yes
- **TensorFlow version (use command below)**: Latest version (I'm building it off current master, can't see which version it is)
- **Python version**: 3.6.6
- **Bazel version (if compiling from source)**: 0.23.2
- **GCC/Compiler version (if compiling from source)**: Using VS2017 with v140 toolkit
- **CUDA/cuDNN version**: CUDA; 10.1.105 cudnn:7.5.0
- **GPU model and memory**: GTX 1060 6GB
- **Exact command to reproduce**:
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
python ./configure.py
bazel build --config=opt --config=cuda --define=no_tensorflow_py_deps=true --copt=-nvcc_options=disable-warnings //tensorflow/tools/pip_package:build_pip_package
### Describe the problem
Hitting this error
ERROR: C:/users/darshan/tensorflow/tensorflow/lite/python/testdata/BUILD:35:1: Executing genrule //tensorflow/lite/python/testdata:gather_string failed (Exit 127): bash.exe failed: error executing command
### Source code / logs
INFO: From Linking tensorflow/lite/toco/toco:
LINK : warning LNK4044: unrecognized option '/Wl,-rpath,../local_config_cuda/cuda/lib64'; ignored
LINK : warning LNK4044: unrecognized option '/Wl,-rpath,../local_config_cuda/cuda/extras/CUPTI/lib64'; ignored
LINK : warning LNK4044: unrecognized option '/ldl'; ignored
LINK : warning LNK4044: unrecognized option '/ldl'; ignored
LINK : warning LNK4044: unrecognized option '/ldl'; ignored
Creating library bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco.lib and object bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco.exp
libcudnn_plugin.lo(cuda_dnn.o) : warning LNK4217: locally defined symbol ?ThenBlasGemm@Stream@stream_executor@@QEAAAEAV12@W4Transpose@blas@2@0_K11MAEBV?$DeviceMemory@M@2@H2HMPEAV52@H@Z (public: class stream_executor::Stream & __cdecl stream_executor::Stream::ThenBlasGemm(enum stream_executor::blas::Transpose,enum stream_executor::blas::Transpose,unsigned __int64,unsigned __int64,unsigned __int64,float,class stream_executor::DeviceMemory<float> const &,int,class stream_executor::DeviceMemory<float> const &,int,float,class stream_executor::DeviceMemory<float> *,int)) imported in function "public: virtual bool __cdecl stream_executor::gpu::CudnnSupport::DoMatMul(class stream_executor::Stream *,class stream_executor::DeviceMemory<float> const &,class stream_executor::DeviceMemory<float> const &,class stream_executor::dnn::BatchDescriptor const &,class stream_executor::dnn::BatchDescriptor const &,class stream_executor::DeviceMemory<float> *)" (?DoMatMul@CudnnSupport@gpu@stream_executor@@UEAA_NPEAVStream@3@AEBV?$DeviceMemory@M@3@1AEBVBatchDescriptor@dnn@3@2PEAV53@@Z)
ERROR: C:/users/darshan/tensorflow/tensorflow/lite/python/testdata/BUILD:35:1: Executing genrule //tensorflow/lite/python/testdata:gather_string failed (Exit 127): bash.exe failed: error executing command
cd C:/users/darshan/_bazel_darshan/jojkqojs/execroot/org_tensorflow
SET CUDA_TOOLKIT_PATH=C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.1
SET CUDNN_INSTALL_PATH=C:/Program Files/cuda
SET PATH=D:\Programs\msys2\usr\bin;D:\Programs\msys2\bin;D:\Programs\Python Installed\Scripts\;D:\Programs\Python Installed\;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\libnvvp;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Strawberry\c\bin;C:\Strawberry\perl\site\bin;C:\Strawberry\perl\bin;C:\Program Files (x86)\GnuWin32\bin;C:\Program Files\Git\cmd;D:\Software\MPC;C:\Program Files\dotnet\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\Java\jre1.8.0_181\bin;C:\Program Files\PuTTY\;C:\Program Files (x86)\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\;C:\Program Files\NVIDIA Corporation\NVIDIA NvDLISR;C:\Program Files\NVIDIA Corporation\Nsight Compute 2019.1\;C:\Users\Darshan\AppData\Local\Microsoft\WindowsApps;C:\Users\Darshan\AppData\Local\Programs\Microsoft VS Code\bin;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\bin\Firefox_Extension\{442718d9-475e-452a-b3e1-fb1ee16b8e9f}\components;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\ucrt;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\bin\Qt;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\ucrt;C:\Users\Darshan\AppData\Roaming\Dashlane\6.5.0.12978\bin\Ssl;C:\Users\Darshan\AppData\Local\GitHubDesktop\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\extras\CUPTI\lib64;D:\Programs\Python Installed;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\tools;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\lib\x64;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\include;D:\Programs;C:\Program Files\Java\jdk1.8.0_201;D:\Programs\msys2\usr\bin;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC;
SET PYTHON_BIN_PATH=D:/Programs/Python Installed/python.exe
SET PYTHON_LIB_PATH=D:/Programs/Python Installed/lib/site-packages
SET TF_CUDA_CLANG=0
SET TF_CUDA_COMPUTE_CAPABILITIES=6.1
SET TF_CUDA_VERSION=10.1
SET TF_CUDNN_VERSION=7
SET TF_NEED_CUDA=1
SET TF_NEED_OPENCL_SYCL=0
SET TF_NEED_ROCM=0
D:/Programs/msys2/usr/bin/bash.exe -c source external/bazel_tools/tools/genrule/genrule-setup.sh; bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE --input_file=tensorflow/lite/python/testdata/gather.pbtxt --output_file=bazel-out/x64_windows-opt/genfiles/tensorflow/lite/python/testdata/gather_string.tflite --input_arrays=input,indices --output_arrays=output
Execution platform: @bazel_tools//platforms:host_platform
C:/users/darshan/_bazel_darshan/jojkqojs/execroot/org_tensorflow/bazel-out/x64_windows-opt/bin/tensorflow/lite/toco/toco: error while loading shared libraries: cudnn64_7.dll: cannot open shared object file: No such file or directory
Target //tensorflow/tools/pip_package:build_pip_package failed to build
INFO: Elapsed time: 126.000s, Critical Path: 24.69s
INFO: 32 processes: 32 local.
FAILED: Build did NOT complete successfully
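For reference, the failure above comes down to the last log line: `toco` links successfully, but at run time it cannot load `cudnn64_7.dll`. On Windows, DLL dependencies are resolved through the `PATH` environment variable, so the directory that contains `cudnn64_7.dll` (the cuDNN install, here under `C:/Program Files/cuda`) must be on `PATH` in the environment bazel uses for the genrule. As a minimal sketch (not part of the original report, and the helper name is hypothetical), this checks whether a given DLL is actually reachable via `PATH`:

```python
import os

def find_dll_on_path(dll_name, path_env=None):
    """Return the PATH directories that actually contain dll_name.

    An empty result explains loader errors such as
    'error while loading shared libraries: cudnn64_7.dll'.
    """
    if path_env is None:
        path_env = os.environ.get("PATH", "")
    return [
        d
        for d in path_env.split(os.pathsep)
        if d and os.path.isfile(os.path.join(d, dll_name))
    ]

# If this prints an empty list, the directory holding cudnn64_7.dll
# needs to be added to PATH before re-running the build.
print(find_dll_on_path("cudnn64_7.dll"))
```

If the list is empty, adding the cuDNN directory to `PATH` (or copying `cudnn64_7.dll` next to the CUDA `bin` directory, which NVIDIA's install instructions suggest) should let the genrule's `toco` invocation load it.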
Previously I tried another combination, tensorflow-1.12 with bazel-0.15, with everything else the same. But I kept running into another error; I searched around a lot and there were threads about it but no solution, so I started over with the latest version. (My protobuf version is 5.6.0; I saw another thread where they downgraded it to 5.2.1. I don't know if that will really help me here, since I'm configuring the latest version to work on my system.)
Any and all help is really appreciated!!
--
Thanks
Best Regards | non_priority | gather string failed exit hello all i m having issues installing tensorflow on my windows system using bazel if there is any other alternative or fix please let me know have i written custom code as opposed to using a stock example script provided in tensorflow no os platform and distribution e g linux ubuntu windows build mobile device e g iphone pixel samsung galaxy if the issue happens on mobile device tensorflow installed from source or binary yes tensorflow version use command below latest version i m building it off current master can t see which version it is python version bazel version if compiling from source gcc compiler version if compiling from source using with toolkit cuda cudnn version cuda cudnn gpu model and memory gtx exact command to reproduce git clone cd tensorflow python configure py bazel build config opt config cuda define no tensorflow py deps true copt nvcc options disable warnings tensorflow tools pip package build pip package describe the problem hitting this error error c users darshan tensorflow tensorflow lite python testdata build executing genrule tensorflow lite python testdata gather string failed exit bash exe failed error executing command source code logs info from linking tensorflow lite toco toco link warning unrecognized option wl rpath local config cuda cuda ignored link warning unrecognized option wl rpath local config cuda cuda extras cupti ignored link warning unrecognized option ldl ignored link warning unrecognized option ldl ignored link warning unrecognized option ldl ignored creating library bazel out windows opt bin tensorflow lite toco toco lib and object bazel out windows opt bin tensorflow lite toco toco exp libcudnn plugin lo cuda dnn o warning locally defined symbol thenblasgemm stream stream executor blas devicememory m h z public class stream executor stream cdecl stream executor stream thenblasgemm enum stream executor blas transpose enum stream executor blas transpose 
unsigned unsigned unsigned float class stream executor devicememory const int class stream executor devicememory const int float class stream executor devicememory int imported in function public virtual bool cdecl stream executor gpu cudnnsupport domatmul class stream executor stream class stream executor devicememory const class stream executor devicememory const class stream executor dnn batchdescriptor const class stream executor dnn batchdescriptor const class stream executor devicememory domatmul cudnnsupport gpu stream executor ueaa npeavstream aebv devicememory m dnn z error c users darshan tensorflow tensorflow lite python testdata build executing genrule tensorflow lite python testdata gather string failed exit bash exe failed error executing command cd c users darshan bazel darshan jojkqojs execroot org tensorflow set cuda toolkit path c program files nvidia gpu computing toolkit cuda set cudnn install path c program files cuda set path d programs usr bin d programs bin d programs python installed scripts d programs python installed c program files nvidia gpu computing toolkit cuda bin c program files nvidia gpu computing toolkit cuda libnvvp c program files common files oracle java javapath c windows c windows c windows wbem c windows windowspowershell c windows openssh c program files nvidia corporation physx common c strawberry c bin c strawberry perl site bin c strawberry perl bin c program files bin c program files git cmd d software mpc c program files dotnet c program files microsoft sql server tools binn c program files java bin c program files putty c program files microsoft sql server client sdk odbc tools binn c program files microsoft sql server tools binn c program files microsoft sql server dts binn c program files microsoft sql server tools binn managementstudio c program files nvidia corporation nvidia nvdlisr c program files nvidia corporation nsight compute c users darshan appdata local microsoft windowsapps c users darshan appdata 
local programs microsoft vs code bin c users darshan appdata roaming dashlane bin firefox extension components c users darshan appdata roaming dashlane ucrt c users darshan appdata roaming dashlane bin qt c users darshan appdata roaming dashlane ucrt c users darshan appdata roaming dashlane bin ssl c users darshan appdata local githubdesktop bin c program files nvidia gpu computing toolkit cuda bin c program files nvidia gpu computing toolkit cuda extras cupti d programs python installed c program files nvidia gpu computing toolkit cuda tools c program files nvidia gpu computing toolkit cuda lib c program files nvidia gpu computing toolkit cuda include d programs c program files java d programs usr bin c program files microsoft visual studio vc set python bin path d programs python installed python exe set python lib path d programs python installed lib site packages set tf cuda clang set tf cuda compute capabilities set tf cuda version set tf cudnn version set tf need cuda set tf need opencl sycl set tf need rocm d programs usr bin bash exe c source external bazel tools tools genrule genrule setup sh bazel out windows opt bin tensorflow lite toco toco input format tensorflow graphdef output format tflite input file tensorflow lite python testdata gather pbtxt output file bazel out windows opt genfiles tensorflow lite python testdata gather string tflite input arrays input indices output arrays output execution platform bazel tools platforms host platform c users darshan bazel darshan jojkqojs execroot org tensorflow bazel out windows opt bin tensorflow lite toco toco error while loading shared libraries dll cannot open shared object file no such file or directory target tensorflow tools pip package build pip package failed to build info elapsed time critical path info processes local failed build did not complete successfully previously i tried another combination of tensorflow and bazel everything else was same but i kept running into another which i looked 
around a lot and there were threads about it but no solution so i started all over with latest version my protobuf version is saw in another thread where they downgraded it to i don t know if it will really help me here since i m configuring the latest version to work on my system any and all help is really appreciated thanks best regards | 0 |