Column summary:

| column | dtype | values |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 7 – 112 |
| repo_url | string | length 36 – 141 |
| action | string | 3 classes |
| title | string | length 1 – 744 |
| labels | string | length 4 – 574 |
| body | string | length 9 – 211k |
| index | string | 10 classes |
| text_combine | string | length 96 – 211k |
| label | string | 2 classes |
| text | string | length 96 – 188k |
| binary_label | int64 | 0 – 1 |
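The derived columns relate in ways the rows themselves make visible: `text_combine` is `title + " - " + body`, `text` looks like a lowercased, de-punctuated version of that string, and `binary_label` maps the two `label` classes to 1 (`process`) and 0 (`non_process`). A minimal sketch of that pipeline — the exact normalization regex is my guess from the sample rows, not a documented recipe:

```python
import re

def combine(title: str, body: str) -> str:
    # text_combine concatenates title and body with " - " (seen in every row)
    return f"{title} - {body}"

def normalize(text: str) -> str:
    # text column: lowercased, with digits/punctuation replaced by spaces
    # and whitespace collapsed (assumed normalization)
    tokens = re.sub(r"[^a-z\s]", " ", text.lower()).split()
    return " ".join(tokens)

def binary_label(label: str) -> int:
    # binary_label is 1 for "process", 0 for "non_process"
    return 1 if label == "process" else 0

row = {"title": "Community - who can view applicants",
       "body": "Who: Student internship users",
       "label": "process"}
print(normalize(combine(row["title"], row["body"])))
print(binary_label(row["label"]))
```

On the sample rows this reproduces the observed `text` values, e.g. the digits in "1.0.15" and "78 applicants" disappearing from the normalized column.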
Row 22,454
- id: 31,224,994,714
- type: IssuesEvent
- created_at: 2023-08-19 01:33:37
- repo: h4sh5/npm-auto-scanner
- repo_url: https://api.github.com/repos/h4sh5/npm-auto-scanner
- action: opened
- title: school-task-tester 1.0.15 has 2 guarddog issues
- labels: npm-install-script npm-silent-process-execution
- body:

```json
{
  "npm-install-script": [
    {
      "code": " \"postinstall\": \"node preinstall.js\",",
      "location": "package/package.json:11",
      "message": "The package.json has a script automatically running when the package is installed"
    }
  ],
  "npm-silent-process-execution": [
    {
      "code": "const child = spawn('node', ['index.js'], {\n detached: true,\n stdio: 'ignore'\n});",
      "location": "package/preinstall.js:3",
      "message": "This package is silently executing another executable"
    }
  ]
}
```

- index: 1.0
- text_combine: title + " - " + body (verbatim)
- label: process
- text: lowercased, token-normalized title + body
- binary_label: 1
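The first of the two guarddog findings above is mechanical to reproduce. The sketch below is a hypothetical, simplified check in the spirit of the `npm-install-script` rule — it is not guarddog's code, and the function and constant names are invented — flagging any `package.json` lifecycle script that runs automatically at install time:

```python
import json

# Lifecycle scripts that npm runs automatically during installation
INSTALL_HOOKS = {"preinstall", "install", "postinstall"}

def flag_install_scripts(package_json: str) -> list:
    """Return a finding per install-time script in a package.json string."""
    manifest = json.loads(package_json)
    findings = []
    for name, cmd in manifest.get("scripts", {}).items():
        if name in INSTALL_HOOKS:
            findings.append({
                "code": f'"{name}": "{cmd}"',
                "message": "The package.json has a script automatically "
                           "running when the package is installed",
            })
    return findings

sample = ('{"name": "school-task-tester", '
          '"scripts": {"postinstall": "node preinstall.js", "test": "echo ok"}}')
for finding in flag_install_scripts(sample):
    print(finding["code"], "->", finding["message"])
```

The second finding (silent process execution) needs actual JavaScript parsing of the installed files, which is out of scope for this sketch.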
Row 6,607
- id: 9,693,108,283
- type: IssuesEvent
- created_at: 2019-05-24 15:17:21
- repo: openopps/openopps-platform
- repo_url: https://api.github.com/repos/openopps/openopps-platform
- action: closed
- title: Community - who can view applicants
- labels: Apply Process Approved Requirements Ready State Dept.
- body:

Who: Student internship users
What: ability to view applicants
Why: to limit who can see who has applied to the internship
Acceptance Criteria:
1) Applicants will not see who else has applied to an opportunity (hide the right rail on the internship opportunity)
2) Bureau internship coordinators will not be able to see who has applied to an opportunity, nor move users forward to "assigned" on an internship opportunity
3) Community administrators will have a view on the admin page where they can see who has applied for each opportunity:
   - Community --> Opportunities --> Manage --> Status of Open and beyond
   - display the number of applicants, e.g. "78 applicants"
   - the hyperlink loads another page that displays each applicant's first and last name

- index: 1.0
- text_combine: title + " - " + body (verbatim)
- label: process
- text: lowercased, token-normalized title + body
- binary_label: 1
Row 279,794
- id: 21,184,367,034
- type: IssuesEvent
- created_at: 2022-04-08 11:08:28
- repo: CarmenMariaMP/UStudy
- repo_url: https://api.github.com/repos/CarmenMariaMP/UStudy
- action: closed
- title: Recopilación de situaciones adversas (Compilation of adverse situations)
- labels: documentation prio: low
- body (translated from Spanish):

ADVERSE SITUATIONS (2 OPTIONS)
1. SITUATIONS PRESENT FROM THE START OF DEVELOPMENT
Each adverse situation must record: its status, lessons learned, proposed actions, the objective to reach and whether it has been reached, which alternative actions exist in case the main one does not work, how we can tell whether an action is working, and how much time is allowed before switching actions if the first one fails.
2. NEW SITUATIONS
In addition to everything in point 1 (although lessons may not have been drawn yet, or may still be vague), state how we will react and how we can tell whether it is working. Define a metric, the metric's threshold, and the time frame.

- index: 1.0
- text_combine: title + " - " + body (verbatim)
- label: non_process
- text: lowercased, token-normalized title + body
- binary_label: 0
Row 279,920
- id: 21,189,400,565
- type: IssuesEvent
- created_at: 2022-04-08 15:43:22
- repo: borgbackup/borg
- repo_url: https://api.github.com/repos/borgbackup/borg
- action: closed
- title: borg wants to re-add everything after relocating repo
- labels: question documentation
- body:
## Have you checked borgbackup docs, FAQ, and open Github issues?
Yes
## Is this a BUG / ISSUE report or a QUESTION?
bug (I think, or maybe I did too many weird things at once, or hopefully you can point out a mistake)
## System information. For client/server mode post info for both machines.
client:
Linux clientname 5.4.0-105-generic #119-Ubuntu SMP Mon Mar 7 18:49:24 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
server:
Linux servername 5.10.0-13-686 #1 SMP Debian 5.10.106-1 (2022-03-17) i686 GNU/Linux
#### Your borg version (borg -V).
borg 1.2.0
(on both client and server)
#### Operating system (distribution) and version.
client: ubuntu 20.04
server: debian 11 bullseye
#### Hardware / network configuration, and filesystems used.
filesystems are ext4, client is a 64 bit machine but server is 32 bit! (but this appears to work fine)
#### How much data is handled by borg?
1.5 TB
#### Full borg command line that led to the problem (excludes and passwords omitted)
```
export BORG_RELOCATED_REPO_ACCESS_IS_OK=yes
export BORG_REPO=ssh://username@1.2.3.4:2222/backups/borg/repo
export BORG_RSH='ssh -i /path/to/key -o StrictHostKeyChecking=no -o BatchMode=yes'
borg --remote-path /path/to/remote/borg \
create \
--verbose \
--show-rc \
--files-cache mtime,size,inode \
--filter AMCE \
--list \
--stats \
--compression lz4 \
--exclude-caches \
::'{hostname}-{now}' \
av data
```
## Describe the problem you're observing.
First let me explain the "relocated repo": I use this borg server for offsite backups but I started by creating the first archive with this server at home on my local network. Then I moved the server to a remote location, which caused the ssh command to change in the BORG_REPO env var which borg interprets as a relocated repo. This is why I set BORG_RELOCATED_REPO_ACCESS_IS_OK=yes
I ran borg create more than once while I had the server on the local network and after the first time only a few files showed up as added or modified in the logs, as expected. So everything appeared to work fine when I had the server local.
But now that the server is moved to its remote location borg is attempting to add all files again. Note that it is not claiming the files are modified, the logs show A for add; also note that borg gave the message about relocated repo:
```
Creating archive at "ssh://username@1.2.3.4:2222/backups/borg/repo::repo-2022-04-04T08:25:33"
Warning: The repository at location ssh://username@1.2.3.4:2222/backups/borg/repo was previously located at ssh://username@localname:22/backups/borg/repo
Do you want to continue? [yN] yes (from BORG_RELOCATED_REPO_ACCESS_IS_OK)
A av/first/file
A av/second/file
etc.
```
I immediately killed borg w/ CTRL-C because the whole point was to avoid copying everything over the internet.
Also, I'm pretty sure I did this same process several years ago with borg 1.1.4 and had no such problem. Note that I'm using a completely from-scratch, newly created borg 1.2.0 repo here, not an upgrade from 1.1.4.
A note about the `--files-cache mtime,size,inode` arg: I'm using borg to backup a backup. The source data always has a new ctime so if I don't remove ctime from the check borg will think every file is modified. This would show up in the logs as:
```
M av/first/file
M av/second/file
etc.
```
#### Can you reproduce the problem? If so, describe how. If not, describe troubleshooting steps you took before opening the issue.
Yeah. If I run the same command again, borg will try again to add all files from the beginning.
One more complication: I had a checkpoint archive in the borg repo when I started the first "remote" backup. This was because I actually tried a third borg backup locally, but killed it before it finished. (Maybe the checkpoint plus the relocated repo is what caused this problem?)
After the problem I deleted the checkpoint and tried again:
```
(on the borg server)
# borg delete repo::repo-2022-04-02T11:06:06.checkpoint
(from the client)
Creating archive at "ssh://username@1.2.3.4:2222/backups/borg/repo::repo-2022-04-04T08:44:10"
Synchronizing chunks cache...
Archives: 2, w/ cached Idx: 0, w/ outdated Idx: 0, w/o cached Idx: 2.
Fetching and building archive index for repo-2022-03-31T13:01:12 ...
Merging into master chunks index ...
Fetching and building archive index for repo-2022-04-02T08:26:20 ...
Merging into master chunks index ...
Done.
```
It made no difference, it started trying to add all files again.
- index: 1.0
- text_combine: title + " - " + body (verbatim)
- label: non_process
- text: lowercased, token-normalized title + body
- binary_label: 0
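The borg behavior described in the issue hinges on the files cache: a file is re-read and re-chunked only when its cached attributes are missing or mismatch. Below is an illustrative pure-Python sketch — not borg's implementation, with all names invented — of a cache keyed on the attributes selected by `--files-cache mtime,size,inode`:

```python
import os
import tempfile
from collections import namedtuple

Entry = namedtuple("Entry", "mtime size inode")

class FilesCache:
    """Sketch of a borg-style files cache (illustrative, not borg's code)."""

    def __init__(self, attrs=("mtime", "size", "inode")):
        self.attrs = attrs   # which attributes --files-cache compares
        self.cache = {}      # path -> Entry from the previous run

    def is_unchanged(self, path):
        st = os.stat(path)
        new = Entry(st.st_mtime_ns, st.st_size, st.st_ino)
        old = self.cache.get(path)
        self.cache[path] = new
        if old is None:
            return False     # cache miss: file gets (re)added ("A" in the log)
        # only the selected attributes are compared; ctime is deliberately absent
        return all(getattr(old, a) == getattr(new, a) for a in self.attrs)

# demo: the second sighting of an unmodified file is a cache hit
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"payload")
    path = f.name
fc = FilesCache()
first = fc.is_unchanged(path)
second = fc.is_unchanged(path)
os.unlink(path)
print(first, second)
```

The symptom in the report — every file logged as `A` again after relocation — corresponds to the cache-miss branch, not to an attribute mismatch, which fits the reporter's observation that nothing was logged as modified.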
Row 15,027
- id: 18,740,545,908
- type: IssuesEvent
- created_at: 2021-11-04 13:08:51
- repo: qgis/QGIS
- repo_url: https://api.github.com/repos/qgis/QGIS
- action: reopened
- title: Computed Variables for Algorithm Outputs are NULL in Graphical Modeler
- labels: Processing Bug Modeller
- body:
### What is the bug or the crash?
I have tried to use the computed variables for an algorithm output (e.g. `@Reproject_layer_OUTPUT_minx`) in a `Pre-calculated value` field in a subsequent algorithm. All but those coming from the input layer are always NULL.
### Steps to reproduce the issue
1. Start Graphical Modeler
2. Add Input "Vector Layer"
3. Add "Reproject layer" (or other tools)
4. Add "Create grid"
5. For "Grid extent" choose "Pre-calculated value" and paste
```
to_string( @Reproject_layer_OUTPUT_minx ) || ',' ||
to_string( @Reproject_layer_OUTPUT_maxx ) || ',' ||
to_string( @Reproject_layer_OUTPUT_miny ) || ',' ||
to_string( @Reproject_layer_OUTPUT_maxy )
```
Output:
```
Input Parameters:
{ CRS: QgsCoordinateReferenceSystem('EPSG:32634'), EXTENT: None, HOVERLAY: 0, HSPACING: 24000, OUTPUT: 'TEMPORARY_OUTPUT', TYPE: 2, VOVERLAY: 0, VSPACING: 24000 }
Horizontal spacing is too large for the covered area.
Error encountered while running Create grid
Execution failed after 0.12 seconds
```
6. Trying again with the following works:
```
to_string(@CountryLayer_minx ) || ', ' ||
to_string( @CountryLayer_maxx ) || ', ' ||
to_string( @CountryLayer_miny ) || ', ' ||
to_string( @CountryLayer_maxy )
```
Output:
```
{ CRS: QgsCoordinateReferenceSystem('EPSG:32634')', EXTENT: '4744834.27221361, 5402446.1378068905, 4125728.6944347215, 5306091.750087504', HOVERLAY: 0, HSPACING: 24000, OUTPUT: 'memory:Grid', TYPE: 2, VOVERLAY: 0, VSPACING: 24000 }
```
### Versions
QGIS version | 3.22.0-Białowieża | QGIS code revision | d9022691f1
-- | -- | -- | --
Qt version | 5.12.8
Python version | 3.8.10
GDAL/OGR version | 3.0.4
PROJ version | 6.3.1
EPSG Registry database version | v9.8.6 (2020-01-22)
Compiled against GEOS | 3.8.0-CAPI-1.13.1 | Running against GEOS | 3.8.0-CAPI-1.13.1
SQLite version | 3.31.1
PDAL version | 2.0.1
PostgreSQL client version | 12.8 (Ubuntu 12.8-0ubuntu0.20.04.1)
SpatiaLite version | 4.3.0a
QWT version | 6.1.4
QScintilla2 version | 2.11.2
OS version | Ubuntu 20.04.3 LTS

Active Python plugins
valuetool | 3.0.8
quick_map_services | 0.19.11.1
zoom_level | 0.1
QuickWKT | 3.1
processing | 2.12.99
db_manager | 0.1.20
grassprovider | 2.12.99
sagaprovider | 2.12.99
MetaSearch | 0.3.5
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
Test QGZ project: [test.qgz.zip](https://github.com/qgis/QGIS/files/7473540/test.qgz.zip)
The generated Python script:
```python
"""
Model exported as python.
Name : test_out_grid
Group :
With QGIS : 31803
"""
from qgis.core import QgsProcessing
from qgis.core import QgsProcessingAlgorithm
from qgis.core import QgsProcessingMultiStepFeedback
from qgis.core import QgsProcessingParameterVectorLayer
from qgis.core import QgsProcessingParameterFeatureSink
from qgis.core import QgsProcessingParameterBoolean
from qgis.core import QgsCoordinateReferenceSystem
from qgis.core import QgsExpression
import processing
class Test_out_grid(QgsProcessingAlgorithm):

    def initAlgorithm(self, config=None):
        self.addParameter(QgsProcessingParameterVectorLayer('CountryLayer', 'Country Layer', types=[QgsProcessing.TypeVectorPolygon], defaultValue=None))
        self.addParameter(QgsProcessingParameterFeatureSink('Out_grid', 'out_grid', type=QgsProcessing.TypeVectorPolygon, createByDefault=True, defaultValue=None))
        self.addParameter(QgsProcessingParameterBoolean('VERBOSE_LOG', 'Verbose logging', optional=True, defaultValue=False))

    def processAlgorithm(self, parameters, context, model_feedback):
        # Use a multi-step feedback, so that individual child algorithm progress reports are adjusted for the
        # overall progress through the model
        feedback = QgsProcessingMultiStepFeedback(2, model_feedback)
        results = {}
        outputs = {}

        # Create grid
        alg_params = {
            'CRS': QgsCoordinateReferenceSystem('EPSG:32634'),
            'EXTENT': QgsExpression('to_string(@Reproject_layer_OUTPUT_minx ) || \', \' ||\nto_string( @Reproject_layer_OUTPUT_maxx ) || \', \' ||\nto_string( @Reproject_layer_OUTPUT_miny ) || \', \' ||\nto_string( @Reproject_layer_OUTPUT_maxy )').evaluate(),
            'HOVERLAY': 0,
            'HSPACING': 24000,
            'TYPE': 2,
            'VOVERLAY': 0,
            'VSPACING': 24000,
            'OUTPUT': parameters['Out_grid']
        }
        outputs['CreateGrid'] = processing.run('native:creategrid', alg_params, context=context, feedback=feedback, is_child_algorithm=True)
        results['Out_grid'] = outputs['CreateGrid']['OUTPUT']

        feedback.setCurrentStep(1)
        if feedback.isCanceled():
            return {}

        # Reproject layer
        alg_params = {
            'INPUT': parameters['CountryLayer'],
            'OPERATION': '',
            'TARGET_CRS': QgsCoordinateReferenceSystem('EPSG:32634'),
            'OUTPUT': QgsProcessing.TEMPORARY_OUTPUT
        }
        outputs['ReprojectLayer'] = processing.run('native:reprojectlayer', alg_params, context=context, feedback=feedback, is_child_algorithm=True)
        return results

    def name(self):
        return 'test_out_grid'

    def displayName(self):
        return 'test_out_grid'

    def group(self):
        return ''

    def groupId(self):
        return ''

    def createInstance(self):
        return Test_out_grid()
```
- index: 1.0
- text_combine: title + " - " + body (verbatim)
- label: process
- text: lowercased, token-normalized title + body
- binary_label: 1
|
14,509
| 17,605,327,901
|
IssuesEvent
|
2021-08-17 16:19:36
|
openservicemesh/osm-docs
|
https://api.github.com/repos/openservicemesh/osm-docs
|
closed
|
mirror the v0.8 release docs
|
process
|
Add a v0.8 mirror of the docs as they existed in https://github.com/openservicemesh/osm/tree/release-v0.8/docs (available in a drop-down and not updated).
|
1.0
|
mirror the v0.8 release docs - Add a v0.8 mirror of the docs as they existed in https://github.com/openservicemesh/osm/tree/release-v0.8/docs (available in a drop-down and not updated).
|
process
|
mirror the release docs add a mirror of the docs as they existed in available in a drop down and not updated
| 1
|
92,310
| 10,741,046,683
|
IssuesEvent
|
2019-10-29 19:25:06
|
eriq-augustine/test-issue-copy
|
https://api.github.com/repos/eriq-augustine/test-issue-copy
|
opened
|
[CLOSED] External Functions with a Single Argument
|
Documentation - Question Type - Bug
|
<a href="https://github.com/eriq-augustine"><img src="https://avatars0.githubusercontent.com/u/337857?v=4" align="left" width="96" height="96" hspace="10"></img></a> **Issue by [eriq-augustine](https://github.com/eriq-augustine)**
_Saturday Nov 12, 2016 at 19:28 GMT_
_Originally opened as https://github.com/eriq-augustine/psl/issues/22_
----
As I recall, there is an issue with external functions that only take a single argument.
Check to see if this is true.
|
1.0
|
[CLOSED] External Functions with a Single Argument - <a href="https://github.com/eriq-augustine"><img src="https://avatars0.githubusercontent.com/u/337857?v=4" align="left" width="96" height="96" hspace="10"></img></a> **Issue by [eriq-augustine](https://github.com/eriq-augustine)**
_Saturday Nov 12, 2016 at 19:28 GMT_
_Originally opened as https://github.com/eriq-augustine/psl/issues/22_
----
As I recall, there is an issue with external functions that only take a single argument.
Check to see if this is true.
|
non_process
|
external functions with a single argument issue by saturday nov at gmt originally opened as as i recall there is an issue with external functions that only take a single argument check to see if this is true
| 0
|
11,281
| 14,078,588,830
|
IssuesEvent
|
2020-11-04 13:46:57
|
alphagov/govuk-design-system
|
https://api.github.com/repos/alphagov/govuk-design-system
|
opened
|
Run through 'tech & ops health check'
|
process
|
## What
Run through the 'tech & ops health check' checklist - 'a handy list of good practices for a team to self evaluate against'.
The health check covers the technical and operational aspects of a service only. It does not cover user-centered design or agile ways of working.
## Why
The tech & ops health check should help us to identify possible gaps or areas to improve. This is particularly useful as we start to think about going for a live assessment.
## Who needs to know
Developers, technical writers, product manager
## Done when…
- [x] Run through health check with @louzoid-gds to work out which items on the checklist are relevant for the team
- [ ] Go through the health check as a team of developers, scoring ourselves using the criteria provided and identifying which areas we're most concerned about
- [ ] Identify and write up cards for any follow-up actions we want to take in the next few quarters
|
1.0
|
Run through 'tech & ops health check' - ## What
Run through the 'tech & ops health check' checklist - 'a handy list of good practices for a team to self evaluate against'.
The health check covers the technical and operational aspects of a service only. It does not cover user-centered design or agile ways of working.
## Why
The tech & ops health check should help us to identify possible gaps or areas to improve. This is particularly useful as we start to think about going for a live assessment.
## Who needs to know
Developers, technical writers, product manager
## Done when…
- [x] Run through health check with @louzoid-gds to work out which items on the checklist are relevant for the team
- [ ] Go through the health check as a team of developers, scoring ourselves using the criteria provided and identifying which areas we're most concerned about
- [ ] Identify and write up cards for any follow-up actions we want to take in the next few quarters
|
process
|
run through tech ops health check what run through the tech ops health check checklist a handy list of good practices for a team to self evaluate against the health check covers the technical and operational aspects of a service only it does not cover user centered design or agile ways of working why the tech ops health check should help us to identify possible gaps or areas to improve this is particularly useful as we start to think about going for a live assessment who needs to know developers technical writers product manager done when… run through health check with louzoid gds to work out which items on the checklist are relevant for the team go through the health check as a team of developers scoring ourselves using the criteria provided and identifying which areas we re most concerned about identify and write up cards for any follow up actions we want to take in the next few quarters
| 1
|
25,137
| 12,500,116,248
|
IssuesEvent
|
2020-06-01 21:31:40
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
opened
|
[cocoon] Create a handler that pushes benchmark to Metrics Center
|
severe: performance team: infra
|
Per discussion with @liyuqian we'd like to push the devicelab benchmark data to metrics center, which transfers the data to skia perf later. So that skia perf can be utilized to view the devicelab benchmark results.
We will need to integrate https://github.com/liyuqian/metrics_center with Cocoon backend. And add a cron job to push the data continuously.
This is part of the go/flutter-metrics-center effort.
|
True
|
[cocoon] Create a handler that pushes benchmark to Metrics Center - Per discussion with @liyuqian we'd like to push the devicelab benchmark data to metrics center, which transfers the data to skia perf later. So that skia perf can be utilized to view the devicelab benchmark results.
We will need to integrate https://github.com/liyuqian/metrics_center with Cocoon backend. And add a cron job to push the data continuously.
This is part of the go/flutter-metrics-center effort.
|
non_process
|
create a handler that pushes benchmark to metrics center per discussion with liyuqian we d like to push the devicelab benchmark data to metrics center which transfers the data to skia perf later so that skia perf can be utilized to view the devicelab benchmark results we will need to integrate with cocoon backend and add a cron job to push the data continuously this is part of the go flutter metrics center effort
| 0
|
11,707
| 14,545,541,156
|
IssuesEvent
|
2020-12-15 19:48:36
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Define artifact from feed in resources
|
Pri2 devops-cicd-process/tech devops/prod support-request
|
Hello,
I am switching from using builds artifacts to publishing them as nuget packages in artifacts feed. I have read in various places that it is also a recommendation and good practice.
However, I wanted to define it in YAML pipeline as `resources packages`, but I guess it is only relevant to GitHub artifacts.
In classic pipelines it is quite easy to set, but I have some troubles when using YAML. Is it only possible to download using `DownloadPackage` task or is there a way to define it in `resources`?
Many thanks!
Rafal
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: ee4ec9d0-e0d5-4fb4-7c3e-b84abfa290c2
* Version Independent ID: 3e2b80d9-30e5-0c48-49f0-4fcdfedf5eee
* Content: [Resources - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema)
* Content Source: [docs/pipelines/process/resources.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/resources.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Define artifact from feed in resources - Hello,
I am switching from using builds artifacts to publishing them as nuget packages in artifacts feed. I have read in various places that it is also a recommendation and good practice.
However, I wanted to define it in YAML pipeline as `resources packages`, but I guess it is only relevant to GitHub artifacts.
In classic pipelines it is quite easy to set, but I have some troubles when using YAML. Is it only possible to download using `DownloadPackage` task or is there a way to define it in `resources`?
Many thanks!
Rafal
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: ee4ec9d0-e0d5-4fb4-7c3e-b84abfa290c2
* Version Independent ID: 3e2b80d9-30e5-0c48-49f0-4fcdfedf5eee
* Content: [Resources - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema)
* Content Source: [docs/pipelines/process/resources.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/resources.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
define artifact from feed in resources hello i am switching from using builds artifacts to publishing them as nuget packages in artifacts feed i have read in various places that it is also a recommendation and good practice however i wanted to define it in yaml pipeline as resources packages but i guess it is only relevant to github artifacts in classic pipelines it is quite easy to set but i have some troubles when using yaml is it only possible to download using downloadpackage task or is there a way to define it in resources many thanks rafal document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
17,162
| 22,739,148,923
|
IssuesEvent
|
2022-07-07 00:44:42
|
MPMG-DCC-UFMG/C01
|
https://api.github.com/repos/MPMG-DCC-UFMG/C01
|
closed
|
Error code 1006 when collecting some pages
|
[1] Bug [2] Baixa Prioridade [0] Desenvolvimento [3] Processamento Dinâmico
|
# Expected Behavior
Perform the collections without any error.
# Current Behavior
The collections show the error code = 1006. Most of the time this error does not prevent the collector from working.
Example log for this problem:
`2021-08-03 19:55:04 [websockets.protocol] DEBUG: client - event = eof_received()
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client ! failing CLOSING WebSocket connection with code 1006
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client - event = connection_lost(None)
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client - state = CLOSED
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client x code = 1006, reason = [no reason]
2021-08-03 19:55:04 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'elapsed_time_seconds': 0.0047,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2021, 8, 3, 19, 54, 57, 215496),
'log_count/DEBUG': 116,
'log_count/ERROR': 1,
'log_count/INFO': 10,
'log_count/WARNING': 3,
'memusage/max': 302215168,
'memusage/startup': 302215168,
'start_time': datetime.datetime(2021, 8, 3, 19, 54, 57, 210796)}
2021-08-03 19:55:04 [scrapy.core.engine] INFO: Spider closed (finished)`
# Steps to reproduce the error
To recreate the problem, just create a collector for the source "http://www.congonhas.mg.gov.br/index.php/licitacao-publica-prefeitura/", with a dynamic wait step and a save-page step.
It may be related to the following issue: https://github.com/miyakogi/pyppeteer/issues/62
|
1.0
|
Error code 1006 when collecting some pages - # Expected Behavior
Perform the collections without any error.
# Current Behavior
The collections show the error code = 1006. Most of the time this error does not prevent the collector from working.
Example log for this problem:
`2021-08-03 19:55:04 [websockets.protocol] DEBUG: client - event = eof_received()
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client ! failing CLOSING WebSocket connection with code 1006
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client - event = connection_lost(None)
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client - state = CLOSED
2021-08-03 19:55:04 [websockets.protocol] DEBUG: client x code = 1006, reason = [no reason]
2021-08-03 19:55:04 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'elapsed_time_seconds': 0.0047,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2021, 8, 3, 19, 54, 57, 215496),
'log_count/DEBUG': 116,
'log_count/ERROR': 1,
'log_count/INFO': 10,
'log_count/WARNING': 3,
'memusage/max': 302215168,
'memusage/startup': 302215168,
'start_time': datetime.datetime(2021, 8, 3, 19, 54, 57, 210796)}
2021-08-03 19:55:04 [scrapy.core.engine] INFO: Spider closed (finished)`
# Steps to reproduce the error
To recreate the problem, just create a collector for the source "http://www.congonhas.mg.gov.br/index.php/licitacao-publica-prefeitura/", with a dynamic wait step and a save-page step.
It may be related to the following issue: https://github.com/miyakogi/pyppeteer/issues/62
|
process
|
error code when collecting some pages expected behavior perform the collections without any error current behavior the collections show the error code most of the time this error does not prevent the collector from working example log for this problem debug client event eof received debug client failing closing websocket connection with code debug client event connection lost none debug client state closed debug client x code reason info dumping scrapy stats elapsed time seconds finish reason finished finish time datetime datetime log count debug log count error log count info log count warning memusage max memusage startup start time datetime datetime info spider closed finished steps to reproduce the error to recreate the problem just create a collector for the source with a dynamic wait step and a save page step it may be related to the following issue
| 1
|
138,434
| 20,518,376,687
|
IssuesEvent
|
2022-03-01 14:09:34
|
sul-dlss/argo
|
https://api.github.com/repos/sul-dlss/argo
|
closed
|
(new Argo interface) Show the default object rights on the APO page
|
design needed
|
The default object rights should be displayed on the APO page, per Astrid's design.
<img width="635" alt="Screen Shot 2022-01-28 at 12 01 09 AM" src="https://user-images.githubusercontent.com/8975583/151509501-e164fbed-e6d2-4c5d-b0e3-43414cb00977.png">
This may come in a later phase of the Argo UI, depending on prioritization.
|
1.0
|
(new Argo interface) Show the default object rights on the APO page - The default object rights should be displayed on the APO page, per Astrid's design.
<img width="635" alt="Screen Shot 2022-01-28 at 12 01 09 AM" src="https://user-images.githubusercontent.com/8975583/151509501-e164fbed-e6d2-4c5d-b0e3-43414cb00977.png">
This may come in a later phase of the Argo UI, depending on prioritization.
|
non_process
|
new argo interface show the default object rights on the apo page the default object rights should be displayed on the apo page per astrid s design img width alt screen shot at am src this may come in a later phase of the argo ui depending on prioritization
| 0
|
17,375
| 23,199,252,116
|
IssuesEvent
|
2022-08-01 19:37:39
|
googleapis/google-cloud-go
|
https://api.github.com/repos/googleapis/google-cloud-go
|
closed
|
all: add static analysis presubmit
|
type: process
|
When I recently re-imported the storage module into third_party, static analysis presubmits there caught a number of minor bugs such as unused err values, use of deprecated types, and nil dereferences.
Not all of these checks are available in OSS world, but we should at least consider running [staticcheck](https://staticcheck.io/) as a presubmit on all PRs, using a github action potentially.
In the future, we could also add additional checks via writing our own analyzer using https://pkg.go.dev/golang.org/x/tools/go/analysis as well, if we feel it's worthwhile.
|
1.0
|
all: add static analysis presubmit - When I recently re-imported the storage module into third_party, static analysis presubmits there caught a number of minor bugs such as unused err values, use of deprecated types, and nil dereferences.
Not all of these checks are available in OSS world, but we should at least consider running [staticcheck](https://staticcheck.io/) as a presubmit on all PRs, using a github action potentially.
In the future, we could also add additional checks via writing our own analyzer using https://pkg.go.dev/golang.org/x/tools/go/analysis as well, if we feel it's worthwhile.
|
process
|
all add static analysis presubmit when i recently re imported the storage module into third party static analysis presubmits there caught a number of minor bugs such as unused err values use of deprecated types and nil dereferences not all of these checks are available in oss world but we should at least consider running as a presubmit on all prs using a github action potentially in the future we could also add additional checks via writing our own analyzer using as well if we feel it s worthwhile
| 1
|
278,833
| 30,702,409,869
|
IssuesEvent
|
2023-07-27 01:27:46
|
yunexMasters/jdk
|
https://api.github.com/repos/yunexMasters/jdk
|
closed
|
CVE-2023-2004 (Medium) detected in freetypefreetype-2.10.4 - autoclosed
|
Mend: dependency security vulnerability
|
## CVE-2023-2004 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>freetypefreetype-2.10.4</b></p></summary>
<p>
<p>A free, high-quality, and portable font engine</p>
<p>Library home page: <a href=https://sourceforge.net/projects/freetype/>https://sourceforge.net/projects/freetype/</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/src/java.desktop/share/native/libfreetype/src/truetype/ttgxvar.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
An integer overflow vulnerability was discovered in Freetype in tt_hvadvance_adjust() function in src/truetype/ttgxvar.c.
<p>Publish Date: 2023-04-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-2004>CVE-2023-2004</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href=" https://www.cve.org/CVERecord?id=CVE-2023-2004"> https://www.cve.org/CVERecord?id=CVE-2023-2004</a></p>
<p>Release Date: 2023-04-14</p>
<p>Fix Resolution: VER-2-13-0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2023-2004 (Medium) detected in freetypefreetype-2.10.4 - autoclosed - ## CVE-2023-2004 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>freetypefreetype-2.10.4</b></p></summary>
<p>
<p>A free, high-quality, and portable font engine</p>
<p>Library home page: <a href=https://sourceforge.net/projects/freetype/>https://sourceforge.net/projects/freetype/</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/src/java.desktop/share/native/libfreetype/src/truetype/ttgxvar.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
An integer overflow vulnerability was discovered in Freetype in tt_hvadvance_adjust() function in src/truetype/ttgxvar.c.
<p>Publish Date: 2023-04-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-2004>CVE-2023-2004</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href=" https://www.cve.org/CVERecord?id=CVE-2023-2004"> https://www.cve.org/CVERecord?id=CVE-2023-2004</a></p>
<p>Release Date: 2023-04-14</p>
<p>Fix Resolution: VER-2-13-0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in freetypefreetype autoclosed cve medium severity vulnerability vulnerable library freetypefreetype a free high quality and portable font engine library home page a href found in base branch master vulnerable source files src java desktop share native libfreetype src truetype ttgxvar c vulnerability details an integer overflow vulnerability was discovered in freetype in tt hvadvance adjust function in src truetype ttgxvar c publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ver step up your open source security game with mend
| 0
|
19,241
| 25,402,217,654
|
IssuesEvent
|
2022-11-22 12:58:28
|
FOLIO-FSE/folio_migration_tools
|
https://api.github.com/repos/FOLIO-FSE/folio_migration_tools
|
closed
|
Rewrite the extra data process to not rely on logging
|
simplify_migration_process
|
We rely on logging in order to create the additional objects that need to be created as part of a transformation. Since people using the tools as a library instead of running them from the command line will use other types of logging, these objects will not be created as expected in these cases.
Examples of objects created as these "Extra data" objects:
* preceding/succeeding titles
* almost everything for the courses
* Bound-with-related objects
|
1.0
|
Rewrite the extra data process to not rely on logging - We rely on logging in order to create the additional objects that need to be created as part of a transformation. Since people using the tools as a library instead of running them from the command line will use other types of logging, these objects will not be created as expected in these cases.
Examples of objects created as these "Extra data" objects:
* preceding/succeeding titles
* almost everything for the courses
* Bound-with-related objects
|
process
|
rewrite the extra data process to not rely on logging we rely on logging in order to create the additional objects that need to be created as part of a transformation since people using the tools as a library instead of running them from the command line will use other types of logging these objects will not be created as expected in these cases examples of objects created as these extra data objects preceding succeeding titles almost everything for the courses bound with related objects
| 1
|
315,654
| 27,092,800,908
|
IssuesEvent
|
2023-02-14 22:43:50
|
eclipse-openj9/openj9
|
https://api.github.com/repos/eclipse-openj9/openj9
|
closed
|
jdk20 OpenJDK jdk/internal/misc/ThreadFlock/WithScopedValue UnsatisfiedLinkError Thread.scopedValueCache
|
comp:vm test failure project:loom jdk20
|
https://openj9-jenkins.osuosl.org/job/Test_openjdk20_j9_sanity.openjdk_aarch64_linux_OpenJDK20/1/
All platforms, not platform specific.
jdk_lang
jdk/internal/misc/ThreadFlock/WithScopedValue.java
```
10:30:29 STDOUT:
10:30:29 test WithScopedValue.testInheritsScopedValue(java.util.concurrent.Executors$DefaultThreadFactory@1442703a): failure
10:30:29 java.lang.UnsatisfiedLinkError: java/lang/Thread.scopedValueCache()[Ljava/lang/Object;
10:30:29 at java.base/java.lang.Access.scopedValueCache(Access.java:455)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue.scopedValueCache(ScopedValue.java:645)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue$Cache.invalidate(ScopedValue.java:829)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue$Carrier.call(ScopedValue.java:367)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue.where(ScopedValue.java:494)
10:30:29 at WithScopedValue.testInheritsScopedValue(WithScopedValue.java:63)
```
|
1.0
|
jdk20 OpenJDK jdk/internal/misc/ThreadFlock/WithScopedValue UnsatisfiedLinkError Thread.scopedValueCache - https://openj9-jenkins.osuosl.org/job/Test_openjdk20_j9_sanity.openjdk_aarch64_linux_OpenJDK20/1/
All platforms, not platform specific.
jdk_lang
jdk/internal/misc/ThreadFlock/WithScopedValue.java
```
10:30:29 STDOUT:
10:30:29 test WithScopedValue.testInheritsScopedValue(java.util.concurrent.Executors$DefaultThreadFactory@1442703a): failure
10:30:29 java.lang.UnsatisfiedLinkError: java/lang/Thread.scopedValueCache()[Ljava/lang/Object;
10:30:29 at java.base/java.lang.Access.scopedValueCache(Access.java:455)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue.scopedValueCache(ScopedValue.java:645)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue$Cache.invalidate(ScopedValue.java:829)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue$Carrier.call(ScopedValue.java:367)
10:30:29 at jdk.incubator.concurrent/jdk.incubator.concurrent.ScopedValue.where(ScopedValue.java:494)
10:30:29 at WithScopedValue.testInheritsScopedValue(WithScopedValue.java:63)
```
|
non_process
|
openjdk jdk internal misc threadflock withscopedvalue unsatisfiedlinkerror thread scopedvaluecache all platforms not platform specific jdk lang jdk internal misc threadflock withscopedvalue java stdout test withscopedvalue testinheritsscopedvalue java util concurrent executors defaultthreadfactory failure java lang unsatisfiedlinkerror java lang thread scopedvaluecache ljava lang object at java base java lang access scopedvaluecache access java at jdk incubator concurrent jdk incubator concurrent scopedvalue scopedvaluecache scopedvalue java at jdk incubator concurrent jdk incubator concurrent scopedvalue cache invalidate scopedvalue java at jdk incubator concurrent jdk incubator concurrent scopedvalue carrier call scopedvalue java at jdk incubator concurrent jdk incubator concurrent scopedvalue where scopedvalue java at withscopedvalue testinheritsscopedvalue withscopedvalue java
| 0
|
253,398
| 19,100,267,713
|
IssuesEvent
|
2021-11-29 21:34:29
|
SandraScherer/EntertainmentInfothek
|
https://api.github.com/repos/SandraScherer/EntertainmentInfothek
|
opened
|
Add runtime information to series
|
documentation enhancement database program
|
- [ ] Add table Series_Runtime to database
- [ ] Add/adapt class Runtime in EntertainmentDB.dll
- [ ] Add tests to EntertainmentDB.Tests
- [ ] Add/adapt ContentCreator classes in WikiPageCreator
- [ ] Add tests to WikiPageCreator.Tests
- [ ] Update documentation
- [ ] EntertainmentInfothek_Database.vpp
- [ ] EntertainmentInfothek_EntertainmentDB.dll.vpp
- [ ] EntertainmentInfothek_WikiPageCreator.vpp
- [ ] Doxygen
|
1.0
|
Add runtime information to series - - [ ] Add table Series_Runtime to database
- [ ] Add/adapt class Runtime in EntertainmentDB.dll
- [ ] Add tests to EntertainmentDB.Tests
- [ ] Add/adapt ContentCreator classes in WikiPageCreator
- [ ] Add tests to WikiPageCreator.Tests
- [ ] Update documentation
- [ ] EntertainmentInfothek_Database.vpp
- [ ] EntertainmentInfothek_EntertainmentDB.dll.vpp
- [ ] EntertainmentInfothek_WikiPageCreator.vpp
- [ ] Doxygen
|
non_process
|
add runtime information to series add table series runtime to database add adapt class runtime in entertainmentdb dll add tests to entertainmentdb tests add adapt contentcreator classes in wikipagecreator add tests to wikipagecreator tests update documentation entertainmentinfothek database vpp entertainmentinfothek entertainmentdb dll vpp entertainmentinfothek wikipagecreator vpp doxygen
| 0
|
17,687
| 24,382,651,517
|
IssuesEvent
|
2022-10-04 09:08:06
|
Noaaan/MythicMetals
|
https://api.github.com/repos/Noaaan/MythicMetals
|
closed
|
[1.19.2] (0.16.1) Possible incompatibility with Terralith (Feature order cycle)
|
bug compatibility
|
Hello o/
I'm struggling to add Mythic Metals to my modlist because it crashes on world creation. I'm using the latest 1.19.2 versions of this mod and Alloy Forgery. The modlist works fine without Mythic Metals.
<details><summary>Here's the crash report</summary>
<p>
```
---- Minecraft Crash Report ----
// Surprise! Haha. Well, this is awkward.
Time: 2022-10-03 17:06:40
Description: Exception generating new chunk
java.lang.IllegalStateException: Feature order cycle found, involved sources: [Reference{ResourceKey[minecraft:worldgen/biome / terralith:haze_mountain]=net.minecraft.class_1959@20761e8b}]
at net.minecraft.world.gen.feature.util.PlacedFeatureIndexer.collectIndexedFeatures(PlacedFeatureIndexer:100)
at net.minecraft.world.gen.chunk.ChunkGenerator.md9a64ed$lambda$updateFeaturesPerStep$1$0(ChunkGenerator:3354)
at com.google.common.base.Suppliers$NonSerializableMemoizingSupplier.get(Suppliers.java:183)
at net.minecraft.world.gen.chunk.ChunkGenerator.generateFeatures(ChunkGenerator:397)
at net.minecraft.world.chunk.ChunkStatus.method_20613(ChunkStatus:145)
at net.minecraft.world.chunk.ChunkStatus.runGenerationTask(ChunkStatus:292)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17225(ThreadedAnvilChunkStorage:679)
at com.mojang.datafixers.util.Either$Left.map(Either.java:38)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17224(ThreadedAnvilChunkStorage:673)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1150)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:482)
at net.minecraft.server.world.ChunkTaskPrioritySystem.method_17634(ChunkTaskPrioritySystem:62)
at net.minecraft.util.thread.TaskExecutor.runNext(TaskExecutor:91)
at net.minecraft.util.thread.TaskExecutor.runWhile(TaskExecutor:146)
at net.minecraft.util.thread.TaskExecutor.run(TaskExecutor:102)
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1395)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165)
A detailed walkthrough of the error, its code path and all known details is as follows:
---------------------------------------------------------------------------------------
-- Head --
Thread: Render thread
Stacktrace:
at net.minecraft.world.gen.feature.util.PlacedFeatureIndexer.collectIndexedFeatures(PlacedFeatureIndexer:100)
at net.minecraft.world.gen.chunk.ChunkGenerator.md9a64ed$lambda$updateFeaturesPerStep$1$0(ChunkGenerator:3354)
at com.google.common.base.Suppliers$NonSerializableMemoizingSupplier.get(Suppliers.java:183)
at net.minecraft.world.gen.chunk.ChunkGenerator.generateFeatures(ChunkGenerator:397)
at net.minecraft.world.chunk.ChunkStatus.method_20613(ChunkStatus:145)
at net.minecraft.world.chunk.ChunkStatus.runGenerationTask(ChunkStatus:292)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17225(ThreadedAnvilChunkStorage:679)
at com.mojang.datafixers.util.Either$Left.map(Either.java:38)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17224(ThreadedAnvilChunkStorage:673)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1150)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:482)
at net.minecraft.server.world.ChunkTaskPrioritySystem.method_17634(ChunkTaskPrioritySystem:62)
at net.minecraft.util.thread.TaskExecutor.runNext(TaskExecutor:91)
at net.minecraft.util.thread.TaskExecutor.runWhile(TaskExecutor:146)
at net.minecraft.util.thread.TaskExecutor.run(TaskExecutor:102)
-- Chunk to be generated --
Details:
Location: -1,-1
Position hash: -1
Generator: net.minecraft.class_3754@51371e57
Stacktrace:
at net.minecraft.class_3898.method_17225(class_3898.java:679)
at com.mojang.datafixers.util.Either$Left.map(Either.java:38)
at net.minecraft.class_3898.method_17224(class_3898.java:673)
at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1150)
at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:482)
at net.minecraft.class_3900.method_17634(class_3900.java:62)
at net.minecraft.class_3846.method_16907(class_3846.java:91)
at net.minecraft.class_3846.method_16900(class_3846.java:146)
at net.minecraft.class_3846.run(class_3846.java:102)
at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1395)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165)
-- Affected level --
Details:
All players: 0 total; []
Chunk stats: 625
Level dimension: minecraft:overworld
Level spawn location: World: (8,64,8), Section: (at 8,0,8 in 0,4,0; chunk contains blocks 0,-64,0 to 15,319,15), Region: (0,0; contains chunks 0,0 to 31,31, blocks 0,-64,0 to 511,319,511)
Level time: 0 game time, 0 day time
Level name: New World
Level game mode: Game mode: survival (ID 0). Hardcore: false. Cheats: true
Level weather: Rain time: 0 (now: false), thunder time: 0 (now: false)
Known server brands: fabric
Level was modded: true
Level storage version: 0x04ABD - Anvil
Stacktrace:
at net.minecraft.server.MinecraftServer.method_3786(MinecraftServer.java:367)
at net.minecraft.server.MinecraftServer.method_3735(MinecraftServer.java:315)
at net.minecraft.class_1132.method_3823(class_1132.java:68)
at net.minecraft.server.MinecraftServer.method_29741(MinecraftServer.java:636)
at net.minecraft.server.MinecraftServer.method_29739(MinecraftServer.java:257)
at java.base/java.lang.Thread.run(Thread.java:833)
-- System Details --
Details:
Minecraft Version: 1.19.2
Minecraft Version ID: 1.19.2
Operating System: Windows 10 (amd64) version 10.0
Java Version: 17.0.3, Microsoft
Java VM Version: OpenJDK 64-Bit Server VM (mixed mode), Microsoft
Memory: 1045958808 bytes (997 MiB) / 4320133120 bytes (4120 MiB) up to 10066329600 bytes (9600 MiB)
CPUs: 4
Processor Vendor: GenuineIntel
Processor Name: Intel(R) Core(TM) i5-4590 CPU @ 3.30GHz
Identifier: Intel64 Family 6 Model 60 Stepping 3
Microarchitecture: unknown
Frequency (GHz): 3.29
Number of physical packages: 1
Number of physical CPUs: 4
Number of logical CPUs: 4
Graphics card #0 name: NVIDIA GeForce GTX 1660
Graphics card #0 vendor: NVIDIA (0x10de)
Graphics card #0 VRAM (MB): 4095.00
Graphics card #0 deviceId: 0x2184
Graphics card #0 versionInfo: DriverVersion=31.0.15.1748
Memory slot #0 capacity (MB): 8192.00
Memory slot #0 clockSpeed (GHz): 1.60
Memory slot #0 type: DDR3
Memory slot #1 capacity (MB): 8192.00
Memory slot #1 clockSpeed (GHz): 1.60
Memory slot #1 type: DDR3
Virtual memory max (MB): 18749.70
Virtual memory used (MB): 14348.30
Swap memory total (MB): 2432.00
Swap memory used (MB): 0.00
JVM Flags: 4 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xss1M -Xmx9600m -Xms256m
Fabric Mods:
additionalbanners: AdditionalBanners 10.1.2
additionalstructures: Additional Structures 4.1.0
adorn: Adorn 3.6.1+1.19
adventurez: AdventureZ 1.4.16
airhop: Air Hop 4.2.1
alloy_forgery: Alloy Forgery 2.0.16+1.19
almost_bedtime: Almost Bedtime 1.0-1.19.2
alternate-current: Alternate Current 1.4.0
ambientenvironment: Ambient Environment 8.0+3
animal_feeding_trough: Animal Feeding Trough 1.0.3+1.19.2
appleskin: AppleSkin 2.4.1+mc1.19
aqupdcaracal: Caracal mob 1.19.2-2.0.2
config2brigadier: Config to Brigadier 1.2.1
archeology: CapsLock Archeology Mod 0.2.3
architectury: Architectury 6.2.46
artifacts: Artifacts 7.1.1+fabric
expandability: ExpandAbility 6.0.0
step-height-entity-attribute: Step Height Entity Attribute 1.0.0
attributefix: AttributeFix 17.1.2
auditory: Auditory 0.0.3-1.19.x
axolotlitemfix: Axolotl Item Fix 1.1.3
basicshields: Basic Shields [Fabric] 1.4.0-pre2-1.19.2
crowdin-translate: CrowdinTranslate 1.4+1.19
bcc: BetterCompatibilityChecker 2.0.2-build.16+mc1.19.1
bclib: BCLib 2.1.0
beenfo: Beenfo 1.19.1-fabric0.58.5-1.3.3
gbfabrictools: GBfabrictools 1.3.4+1.19
better_bad_omen: Better Bad Omen 1.6.0
betteradvancements: Better Advancements 0.2.2.142
betterbiomeblend: Better Biome Blend 1.19-1.3.6-fabric
bettercombat: Better Combat 1.4.4+1.19
betterend: Better End 2.1.0
betterf3: BetterF3 1.4.0
bettermounthud: Better Mount HUD 1.1.4
betternether: Better Nether 7.1.0
bettersleeping: BetterSleeping 0.6.1+1.19
bettersleeping-core: BetterSleeping Core 0.6.1
org_apache_commons_commons-text: commons-text 1.9
betterstats: Better Statistics Screen 1.4.3
bettertridents: Better Tridents 4.0.1
blockrunner: Block Runner 4.2.0
bookshelf: Bookshelf 16.1.5
bosses_of_mass_destruction: Bosses of Mass Destruction (Beta) 1.4.2-1.19
maelstrom_library: Maelstrom Library 1.3-1.19-pre1
multipart_entities: MultipartEntities 1.2-1.19-pre1
botanypots: BotanyPots 9.0.3
botanytrees: BotanyTrees 5.0.1
bountiful: Bountiful 3.0.0
bowinfinityfix: Bow Infinity Fix 2.5.0
brazier: Brazier 5.0.0
byg: Oh The Biomes You'll Go 2.0.0.11
cameraoverhaul: Camera Overhaul 1.3.1-fabric-universal
capybara: Capybara 1.0.8
cardinal-components: Cardinal Components API 5.0.2
cardinal-components-base: Cardinal Components API (base) 5.0.2
cardinal-components-block: Cardinal Components API (blocks) 5.0.2
cardinal-components-chunk: Cardinal Components API (chunks) 5.0.2
cardinal-components-entity: Cardinal Components API (entities) 5.0.2
cardinal-components-item: Cardinal Components API (items) 5.0.2
cardinal-components-level: Cardinal Components API (world saves) 5.0.2
cardinal-components-scoreboard: Cardinal Components API (scoreboard) 5.0.2
cardinal-components-world: Cardinal Components API (worlds) 5.0.2
chalk: Chalk 2.1.0+1.19
charm: Charm 4.4.4
com_moandjiezana_toml_toml4j: toml4j 0.7.2
charmofundying: Charm of Undying 6.0.0+1.19.2
charmonium: Charmonium 4.2.1
chas: Craftable Horse Armour & Saddle 1.19-1.9-Fabric
cleardespawn: Clear Despawn 1.1.9
cloth-config: Cloth Config v8 8.2.88
cloth-basic-math: cloth-basic-math 0.6.1
clumps: Clumps 9.0.0+11
companion: Companion 3.0.0
connectiblechains: Connectible Chains 2.1.3+1.19.2
consistency_plus: Consistency Plus 0.5.1+1.19.2
stonecutter_recipe_tags: Stonecutter Recipe Tags 4.0.0+1.19.9b8d04c.fabric
continuity: Continuity 2.0.1+1.19
creativecore: CreativeCore (Fabric) 2.8.5
creeperoverhaul: Creeper Overhaul 2.0.3
croptopia: Croptopia 2.1.0
com_typesafe_config: config 1.4.1
io_leangen_geantyref_geantyref: geantyref 1.3.11
org_spongepowered_configurate-core: configurate-core 4.1.2
org_spongepowered_configurate-hocon: configurate-hocon 4.1.2
cull-less-leaves: Cull Less Leaves 1.0.6
conditional-mixin: conditional mixin 0.3.0
damagetilt: Damage Tilt 1.19-fabric-0.1.2
dark-loading-screen: Dark Loading Screen 1.6.12
darkpaintings: DarkPaintings 13.1.2
debugify: Debugify 2.7.1
deeperdarker: Deeper And Darker 1.0.2b-fabric
customportalapi: Custom Portal Api 0.0.1-beta52-1.19
deepslatecutting: Deepslate Cutting 1.5.0
maybe-data: Maybe data 1.3.2-1.19.2
detailab: Detail Armor Bar 2.6.2+1.19-fabric
dragonloot: DragonLoot 1.1.2
dummmmmmy: MmmMmmMmmMmm 1.19-1.5.3
easyanvils: Easy Anvils 4.0.1
easymagic: Easy Magic 4.3.2
ecologics: Ecologics 2.1.10
edenring: EdenRing 0.6.4
effective: Effective 1.3
elytraslot: Elytra Slot 6.0.0+1.19.2
enchdesc: EnchantmentDescriptions 13.0.3
endgoblintraders: End Goblin Traders 1.5.1
enhanced_attack_indicator: Enhanced Attack Indicator 1.0.4+1.19
enhancedcelestials: Enhanced Celestials 2.1.0.1
entity_texture_features: Entity Texture Features 4.1.1
org_apache_httpcomponents_httpmime: httpmime 4.5.10
entityculling: EntityCulling-Fabric 1.5.2-mc1.19
com_logisticscraft_occlusionculling: occlusionculling 0.0.6-SNAPSHOT
everycomp: Every Compat 1.19.2-2.0.4
fabric-api: Fabric API 0.62.0+1.19.2
fabric-api-base: Fabric API Base 0.4.12+93d8cb8290
fabric-api-lookup-api-v1: Fabric API Lookup API (v1) 1.6.10+93d8cb8290
fabric-biome-api-v1: Fabric Biome API (v1) 9.0.18+c6af733c90
fabric-blockrenderlayer-v1: Fabric BlockRenderLayer Registration (v1) 1.1.21+c6af733c90
fabric-client-tags-api-v1: Fabric Client Tags 1.0.2+b35fea8390
fabric-command-api-v1: Fabric Command API (v1) 1.2.12+f71b366f90
fabric-command-api-v2: Fabric Command API (v2) 2.1.8+93d8cb8290
fabric-commands-v0: Fabric Commands (v0) 0.2.29+df3654b390
fabric-containers-v0: Fabric Containers (v0) 0.1.35+df3654b390
fabric-content-registries-v0: Fabric Content Registries (v0) 3.3.1+624e468e90
fabric-convention-tags-v1: Fabric Convention Tags 1.1.2+93d8cb8290
fabric-crash-report-info-v1: Fabric Crash Report Info (v1) 0.2.6+aeb40ebe90
fabric-data-generation-api-v1: Fabric Data Generation API (v1) 5.2.0+b598f4ac90
fabric-dimensions-v1: Fabric Dimensions API (v1) 2.1.32+0dd10df690
fabric-entity-events-v1: Fabric Entity Events (v1) 1.4.19+9ff28f4090
fabric-events-interaction-v0: Fabric Events Interaction (v0) 0.4.29+c6af733c90
fabric-events-lifecycle-v0: Fabric Events Lifecycle (v0) 0.2.29+df3654b390
fabric-game-rule-api-v1: Fabric Game Rule API (v1) 1.0.22+c6af733c90
fabric-item-api-v1: Fabric Item API (v1) 1.5.8+93d8cb8290
fabric-item-groups-v0: Fabric Item Groups (v0) 0.3.30+93d8cb8290
fabric-key-binding-api-v1: Fabric Key Binding API (v1) 1.0.21+93d8cb8290
fabric-keybindings-v0: Fabric Key Bindings (v0) 0.2.19+df3654b390
fabric-lifecycle-events-v1: Fabric Lifecycle Events (v1) 2.2.0+33ffe9ec90
fabric-loot-api-v2: Fabric Loot API (v2) 1.1.4+83a8659290
fabric-loot-tables-v1: Fabric Loot Tables (v1) 1.1.7+9e7660c690
fabric-message-api-v1: Fabric Message API (v1) 5.0.4+93d8cb8290
fabric-mining-level-api-v1: Fabric Mining Level API (v1) 2.1.15+33fbc73890
fabric-models-v0: Fabric Models (v0) 0.3.18+c6af733c90
fabric-networking-api-v1: Fabric Networking API (v1) 1.2.5+c6af733c90
fabric-networking-v0: Fabric Networking (v0) 0.3.22+df3654b390
fabric-object-builder-api-v1: Fabric Object Builder API (v1) 4.0.12+93d8cb8290
fabric-particles-v1: Fabric Particles (v1) 1.0.11+79adfe0a90
fabric-registry-sync-v0: Fabric Registry Sync (v0) 0.9.26+c6af733c90
fabric-renderer-api-v1: Fabric Renderer API (v1) 1.0.11+c6af733c90
fabric-renderer-indigo: Fabric Renderer - Indigo 0.6.13+aeb40ebe90
fabric-renderer-registries-v1: Fabric Renderer Registries (v1) 3.2.21+df3654b390
fabric-rendering-data-attachment-v1: Fabric Rendering Data Attachment (v1) 0.3.15+aeb40ebe90
fabric-rendering-fluids-v1: Fabric Rendering Fluids (v1) 3.0.8+c6af733c90
fabric-rendering-v0: Fabric Rendering (v0) 1.1.23+df3654b390
fabric-rendering-v1: Fabric Rendering (v1) 1.11.0+73145abb90
fabric-resource-conditions-api-v1: Fabric Resource Conditions API (v1) 2.0.12+a29562c890
fabric-resource-loader-v0: Fabric Resource Loader (v0) 0.7.0+93d8cb8290
fabric-screen-api-v1: Fabric Screen API (v1) 1.0.27+93d8cb8290
fabric-screen-handler-api-v1: Fabric Screen Handler API (v1) 1.3.1+1cc24b1b90
fabric-textures-v0: Fabric Textures (v0) 1.0.21+aeb40ebe90
fabric-transfer-api-v1: Fabric Transfer API (v1) 2.1.1+93d8cb8290
fabric-transitive-access-wideners-v1: Fabric Transitive Access Wideners (v1) 1.3.1+42d99c3290
fabric-language-kotlin: Fabric Language Kotlin 1.8.4+kotlin.1.7.20
org_jetbrains_kotlin_kotlin-reflect: kotlin-reflect 1.7.20
org_jetbrains_kotlin_kotlin-stdlib: kotlin-stdlib 1.7.20
org_jetbrains_kotlin_kotlin-stdlib-jdk7: kotlin-stdlib-jdk7 1.7.20
org_jetbrains_kotlin_kotlin-stdlib-jdk8: kotlin-stdlib-jdk8 1.7.20
org_jetbrains_kotlinx_atomicfu-jvm: atomicfu-jvm 0.18.3
org_jetbrains_kotlinx_kotlinx-coroutines-core-jvm: kotlinx-coroutines-core-jvm 1.6.4
org_jetbrains_kotlinx_kotlinx-coroutines-jdk8: kotlinx-coroutines-jdk8 1.6.4
org_jetbrains_kotlinx_kotlinx-datetime-jvm: kotlinx-datetime-jvm 0.4.0
org_jetbrains_kotlinx_kotlinx-serialization-cbor-jvm: kotlinx-serialization-cbor-jvm 1.4.0
org_jetbrains_kotlinx_kotlinx-serialization-core-jvm: kotlinx-serialization-core-jvm 1.4.0
org_jetbrains_kotlinx_kotlinx-serialization-json-jvm: kotlinx-serialization-json-jvm 1.4.0
fabricenchantments: Fabric Enchantments 0.8.1
fabricloader: Fabric Loader 0.14.9
fabricshieldlib: Fabric Shield Lib 1.6.0-1.19
mm: Manningham Mills 2.3
fallingleaves: Falling Leaves 1.12.2+1.19.2
ferritecore: FerriteCore 4.2.1
foodplusid: Food+ 1.4.1
forgeconfigapiport: Forge Config API Port 4.2.6
com_electronwill_night-config_core: core 3.6.5
com_electronwill_night-config_toml: toml 3.6.5
forgottenrecipes: ForgottenRecipes 1.19-1.0.0
fwaystones: Fabric Waystones 3.0.4+mc1.19.2
geckolib3: Geckolib 3.1.21
com_eliotlash_mclib_mclib: mclib 19
com_eliotlash_molang_molang: molang 19
go-fish: Go Fish 1.6.0-1.19.1
goblintraders: Goblin Traders 1.5.2
healthoverlay: Health Overlay 7.1.1
fiber: fiber 0.23.0-2
highlighter: Highlighter 1.1.4
iceberg: Iceberg 1.0.46
illuminations: Illuminations 1.10.8
indium: Indium 1.0.9+mc1.19.2
inmis: Inmis 2.7.0-1.19
omega-config: OmegaConfig 1.2.3-1.18.1
inventorysorter: Inventory Sorter 1.8.10-1.19
kyrptconfig: Kyrpt Config 1.4.14-1.19
iris: Iris 1.3.1
org_anarres_jcpp: jcpp 1.4.14
itemmodelfix: Item Model Fix 1.0.3+1.19
jade: Jade 8.2.1
java: OpenJDK 64-Bit Server VM 17
jumpoverfences: Jump Over Fences 1.1.0
kambrik: Kambrik 4.0-1.19.2
krypton: Krypton 0.2.1
com_velocitypowered_velocity-native: velocity-native 3.1.2-SNAPSHOT
lazydfu: LazyDFU 0.1.3
letmedespawn: Let Me Despawn fabric-1.0.2
letsleepingdogslie: Let Sleeping Dogs Lie 1.1.1
lithium: Lithium 0.8.3
mcda: MC Dungeons Armors 2.2.0
mcdar: MC Dungeons Artifacts 1.5.2-1.19
mcdw: MC Dungeons Weapons 5.0.4-1.19
enchant_giver: Enchant Giver 1.3.0
reach-entity-attributes: Reach Entity Attributes 2.3.0
mcwbridges: Macaw's Bridges 2.0.4
mcwdoors: Macaw's Doors 1.0.7
mcwfences: Macaw's Fences and Walls 1.0.6
mcwlights: Macaw's Lights and Lamps 1.0.4
mcwpaintings: Macaw's Paintings 1.0.4
mcwpaths: Macaw's Paths and Pavings 1.0.1
mcwroofs: Macaw's Roofs 2.2.0
mcwtrpdoors: Macaw's Trapdoors 1.0.7
midnightlib: MidnightLib 0.6.1
mimic: Mimic 1.31
minecraft: Minecraft 1.19.2
modmenu: Mod Menu 4.0.6
moonlight: Moonlight 1.19.2-2.0.31
more-axolotls: More Axolotls 1.0.2
moreculling: More Culling 1.19.1-0.10.0
mousetweaks: Mouse Tweaks 2.22
mythicmetals: Mythic Metals 0.16.1+1.19.2
naturalist: Naturalist 2.1.1
notenoughcrashes: Not Enough Crashes 4.1.8+1.19.2
onsoulfire: On Soul Fire 1.19-2
owo: oωo 0.8.3+1.19
blue_endless_jankson: jankson 1.2.1
oxidized: Oxidized 1.7.2
patchouli: Patchouli 1.19.2-76-FABRIC
pickupnotifier: Pick Up Notifier 4.2.0
player-animator: Player Animator 0.3.5
polymorph: Polymorph 0.45.0+1.19.2
puzzleslib: Puzzles Lib 4.3.8
reeses-sodium-options: Reese's Sodium Options 1.4.7+mc1.19.2-build.59
repurposed_structures: Repurposed Structures 6.1.1+1.19
ringsofascension: Rings of Ascension 1.0
roughly_enough_trades: Roughly Enough Trades 1.19-1.0
roughlyenoughitems: Roughly Enough Items 9.1.550
error_notifier: Error Notifier 1.0.9
roughlyenoughprofessions: Roughly Enough Professions 1.1.2
roughlyenoughresources: Roughly Enough Resources 2.6.0
runelic: Runelic 14.1.2
simplyswords: Simply Swords 1.30-1.19.x
sodium: Sodium 0.4.4+build.18
org_joml_joml: joml 1.10.4
sodium-extra: Sodium Extra 0.4.11+mc1.19.2-build.68
caffeineconfig: CaffeineConfig 1.0.0+1.17
sound_physics_remastered: Sound Physics Remastered 1.19.2-1.0.15
splatus_ultris: Ultris: Boss Expansion 5.6.5
starlight: Starlight 1.1.1+fabric.ae22326
supplementaries: Supplementaries 1.19.2-2.2.3
terrablender: TerraBlender 2.0.1.127
terralith: Terralith 2.3
the_bumblezone: The Bumblezone - Fabric 6.1.13+1.19.2
fake-player-api: Fake Player API 0.4.0
things: Things 0.2.20+1.19
thonkutil: ThonkUtil 2.15.4+1.19
thonkutil-base: ThonkUtil Base 1.13.2+4a8c408a57
thonkutil-capes-v1: ThonkUtil Capes (v1) 1.4.2+3eb2749857
thonkutil-coords-v1: ThonkUtil Coords (v1) 1.1.2+8ff533c957
thonkutil-customization-v1: ThonkUtil Customization (v1) 1.1.2+8ff533c957
thonkutil-legacy: ThonkUtil Legacy 1.1.2+5d4263f557
thonkutil-modchecker-v1: ThonkUtil ModChecker (v1) 1.1.3+bd4b387957
thonkutil-potions-v0: ThonkUtil Potions (v0) 1.5.2+8ff533c957
thonkutil-titlescreen-v1: ThonkUtil TitleScreen (v1) 1.2.2+8ff533c957
thonkutil-trades-v1: ThonkUtil Trades (v1) 1.2.2+8ff533c957
tipsmod: Tips 8.0.3
toolstats: ToolStats 12.0.2
trinkets: Trinkets 3.4.0
twilightforest: The Twilight Forest 4.2.299
com_google_code_findbugs_jsr305: jsr305 3.0.2
here-be-no-dragons: Here be no Dragons! 1.0.0
javax_annotation_javax_annotation-api: javax.annotation-api 1.3.2
serialization_hooks: Serialization Hooks 0.3.22
template: TEMPLATE 2.0.547+1.19.2
porting_lib_base: Porting Lib Base 2.0.547+1.19.2
com_github_llamalad7_mixinextras: MixinExtras 0.1.0-rc5
porting_lib_entity: Porting Lib Entity 2.0.547+1.19.2
porting_lib_model_generators: Porting Lib Model Generators 2.0.547+1.19.2
porting_lib_models: Porting Lib Models 2.0.547+1.19.2
porting_lib_networking: Porting Lib Networking 2.0.547+1.19.2
porting_lib_obj_loader: Porting Lib Obj Loader 2.0.547+1.19.2
porting_lib_model_loader: Porting Lib Model Loader 2.0.547+1.19.2
porting_lib_extensions: Porting Lib Extensions 2.0.547+1.19.2
porting_lib_accessors: Porting Lib Accessors 2.0.547+1.19.2
porting_lib_attributes: Porting Lib Attributes 2.0.547+1.19.2
porting_lib_tags: Porting Lib Tags 2.0.547+1.19.2
porting_lib_transfer: Porting Lib Transfer 2.0.547+1.19.2
porting_lib_common: Porting Lib Common 2.0.547+1.19.2
porting_lib_constants: Porting Lib Constants 2.0.547+1.19.2
vanilla_degus: Vanilla Degus 1.3.2
visualworkbench: Visual Workbench 4.2.0
voidz: VoidZ 1.0.9
waterdripsound: Drip Sounds 1.19-0.3.0
windchimes: Windchimes 1.2.1
xp_storage: XP Storage 1.4.3+1.19
xp_storage_trinkets: XP Storage - Trinkets 0.1+1.19
yet-another-config-lib: YetAnotherConfigLib 1.5.0
zoomify: Zoomify 2.7.0
dev_isxander_settxi_settxi-core: settxi-core 2.10.3
dev_isxander_settxi_settxi-gui: settxi-gui 2.10.3
dev_isxander_settxi_settxi-kotlinx-serialization: settxi-kotlinx-serialization 2.10.3
settxi-gui-yacl: Settxi Gui (YetAnotherConfigLib) 2.10.3
Loaded Shaderpack: BSL
Profile: Custom (+10 options changed by user)
Server Running: true
Player Count: 0 / 8; []
Data Packs: vanilla, Fabric Mods, Everycomp Generated Pack, Supplementaries Generated Pack
World Generation: Stable
Type: Integrated Server (map_client.txt)
Is Modded: Definitely; Client brand changed to 'fabric'; Server brand changed to 'fabric'
Launched Version: fabric-loader-0.14.9-1.19.2
Client Crashes Since Restart: 0
Integrated Server Crashes Since Restart: 1
Suspected Mods: None
```
</p>
</details>
[Here is the crash report with Cyanide installed](https://pastebin.com/k2Xzz56m)
[Here is the log](https://pastebin.com/WZKBhDtU)
|
True
|
[1.19.2] (0.16.1) Possible incompatibility with Terralith (Feature order cycle) - Hello o/
I'm struggling to add Mythic Metals to my modlist because it crashes on world creation. I'm using the latest 1.19.2 versions of this mod and Alloy Forgery. The modlist works fine without Mythic Metals.
<details><summary>Here's the crash report</summary>
<p>
```
---- Minecraft Crash Report ----
// Surprise! Haha. Well, this is awkward.
Time: 2022-10-03 17:06:40
Description: Exception generating new chunk
java.lang.IllegalStateException: Feature order cycle found, involved sources: [Reference{ResourceKey[minecraft:worldgen/biome / terralith:haze_mountain]=net.minecraft.class_1959@20761e8b}]
at net.minecraft.world.gen.feature.util.PlacedFeatureIndexer.collectIndexedFeatures(PlacedFeatureIndexer:100)
at net.minecraft.world.gen.chunk.ChunkGenerator.md9a64ed$lambda$updateFeaturesPerStep$1$0(ChunkGenerator:3354)
at com.google.common.base.Suppliers$NonSerializableMemoizingSupplier.get(Suppliers.java:183)
at net.minecraft.world.gen.chunk.ChunkGenerator.generateFeatures(ChunkGenerator:397)
at net.minecraft.world.chunk.ChunkStatus.method_20613(ChunkStatus:145)
at net.minecraft.world.chunk.ChunkStatus.runGenerationTask(ChunkStatus:292)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17225(ThreadedAnvilChunkStorage:679)
at com.mojang.datafixers.util.Either$Left.map(Either.java:38)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17224(ThreadedAnvilChunkStorage:673)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1150)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:482)
at net.minecraft.server.world.ChunkTaskPrioritySystem.method_17634(ChunkTaskPrioritySystem:62)
at net.minecraft.util.thread.TaskExecutor.runNext(TaskExecutor:91)
at net.minecraft.util.thread.TaskExecutor.runWhile(TaskExecutor:146)
at net.minecraft.util.thread.TaskExecutor.run(TaskExecutor:102)
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1395)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165)
A detailed walkthrough of the error, its code path and all known details is as follows:
---------------------------------------------------------------------------------------
-- Head --
Thread: Render thread
Stacktrace:
at net.minecraft.world.gen.feature.util.PlacedFeatureIndexer.collectIndexedFeatures(PlacedFeatureIndexer:100)
at net.minecraft.world.gen.chunk.ChunkGenerator.md9a64ed$lambda$updateFeaturesPerStep$1$0(ChunkGenerator:3354)
at com.google.common.base.Suppliers$NonSerializableMemoizingSupplier.get(Suppliers.java:183)
at net.minecraft.world.gen.chunk.ChunkGenerator.generateFeatures(ChunkGenerator:397)
at net.minecraft.world.chunk.ChunkStatus.method_20613(ChunkStatus:145)
at net.minecraft.world.chunk.ChunkStatus.runGenerationTask(ChunkStatus:292)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17225(ThreadedAnvilChunkStorage:679)
at com.mojang.datafixers.util.Either$Left.map(Either.java:38)
at net.minecraft.server.world.ThreadedAnvilChunkStorage.method_17224(ThreadedAnvilChunkStorage:673)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1150)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:482)
at net.minecraft.server.world.ChunkTaskPrioritySystem.method_17634(ChunkTaskPrioritySystem:62)
at net.minecraft.util.thread.TaskExecutor.runNext(TaskExecutor:91)
at net.minecraft.util.thread.TaskExecutor.runWhile(TaskExecutor:146)
at net.minecraft.util.thread.TaskExecutor.run(TaskExecutor:102)
-- Chunk to be generated --
Details:
Location: -1,-1
Position hash: -1
Generator: net.minecraft.class_3754@51371e57
Stacktrace:
at net.minecraft.class_3898.method_17225(class_3898.java:679)
at com.mojang.datafixers.util.Either$Left.map(Either.java:38)
at net.minecraft.class_3898.method_17224(class_3898.java:673)
at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1150)
at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:482)
at net.minecraft.class_3900.method_17634(class_3900.java:62)
at net.minecraft.class_3846.method_16907(class_3846.java:91)
at net.minecraft.class_3846.method_16900(class_3846.java:146)
at net.minecraft.class_3846.run(class_3846.java:102)
at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1395)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165)
-- Affected level --
Details:
All players: 0 total; []
Chunk stats: 625
Level dimension: minecraft:overworld
Level spawn location: World: (8,64,8), Section: (at 8,0,8 in 0,4,0; chunk contains blocks 0,-64,0 to 15,319,15), Region: (0,0; contains chunks 0,0 to 31,31, blocks 0,-64,0 to 511,319,511)
Level time: 0 game time, 0 day time
Level name: New World
Level game mode: Game mode: survival (ID 0). Hardcore: false. Cheats: true
Level weather: Rain time: 0 (now: false), thunder time: 0 (now: false)
Known server brands: fabric
Level was modded: true
Level storage version: 0x04ABD - Anvil
Stacktrace:
at net.minecraft.server.MinecraftServer.method_3786(MinecraftServer.java:367)
at net.minecraft.server.MinecraftServer.method_3735(MinecraftServer.java:315)
at net.minecraft.class_1132.method_3823(class_1132.java:68)
at net.minecraft.server.MinecraftServer.method_29741(MinecraftServer.java:636)
at net.minecraft.server.MinecraftServer.method_29739(MinecraftServer.java:257)
at java.base/java.lang.Thread.run(Thread.java:833)
-- System Details --
Details:
Minecraft Version: 1.19.2
Minecraft Version ID: 1.19.2
Operating System: Windows 10 (amd64) version 10.0
Java Version: 17.0.3, Microsoft
Java VM Version: OpenJDK 64-Bit Server VM (mixed mode), Microsoft
Memory: 1045958808 bytes (997 MiB) / 4320133120 bytes (4120 MiB) up to 10066329600 bytes (9600 MiB)
CPUs: 4
Processor Vendor: GenuineIntel
Processor Name: Intel(R) Core(TM) i5-4590 CPU @ 3.30GHz
Identifier: Intel64 Family 6 Model 60 Stepping 3
Microarchitecture: unknown
Frequency (GHz): 3.29
Number of physical packages: 1
Number of physical CPUs: 4
Number of logical CPUs: 4
Graphics card #0 name: NVIDIA GeForce GTX 1660
Graphics card #0 vendor: NVIDIA (0x10de)
Graphics card #0 VRAM (MB): 4095.00
Graphics card #0 deviceId: 0x2184
Graphics card #0 versionInfo: DriverVersion=31.0.15.1748
Memory slot #0 capacity (MB): 8192.00
Memory slot #0 clockSpeed (GHz): 1.60
Memory slot #0 type: DDR3
Memory slot #1 capacity (MB): 8192.00
Memory slot #1 clockSpeed (GHz): 1.60
Memory slot #1 type: DDR3
Virtual memory max (MB): 18749.70
Virtual memory used (MB): 14348.30
Swap memory total (MB): 2432.00
Swap memory used (MB): 0.00
JVM Flags: 4 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xss1M -Xmx9600m -Xms256m
Fabric Mods:
additionalbanners: AdditionalBanners 10.1.2
additionalstructures: Additional Structures 4.1.0
adorn: Adorn 3.6.1+1.19
adventurez: AdventureZ 1.4.16
airhop: Air Hop 4.2.1
alloy_forgery: Alloy Forgery 2.0.16+1.19
almost_bedtime: Almost Bedtime 1.0-1.19.2
alternate-current: Alternate Current 1.4.0
ambientenvironment: Ambient Environment 8.0+3
animal_feeding_trough: Animal Feeding Trough 1.0.3+1.19.2
appleskin: AppleSkin 2.4.1+mc1.19
aqupdcaracal: Caracal mob 1.19.2-2.0.2
config2brigadier: Config to Brigadier 1.2.1
archeology: CapsLock Archeology Mod 0.2.3
architectury: Architectury 6.2.46
artifacts: Artifacts 7.1.1+fabric
expandability: ExpandAbility 6.0.0
step-height-entity-attribute: Step Height Entity Attribute 1.0.0
attributefix: AttributeFix 17.1.2
auditory: Auditory 0.0.3-1.19.x
axolotlitemfix: Axolotl Item Fix 1.1.3
basicshields: Basic Shields [Fabric] 1.4.0-pre2-1.19.2
crowdin-translate: CrowdinTranslate 1.4+1.19
bcc: BetterCompatibilityChecker 2.0.2-build.16+mc1.19.1
bclib: BCLib 2.1.0
beenfo: Beenfo 1.19.1-fabric0.58.5-1.3.3
gbfabrictools: GBfabrictools 1.3.4+1.19
better_bad_omen: Better Bad Omen 1.6.0
betteradvancements: Better Advancements 0.2.2.142
betterbiomeblend: Better Biome Blend 1.19-1.3.6-fabric
bettercombat: Better Combat 1.4.4+1.19
betterend: Better End 2.1.0
betterf3: BetterF3 1.4.0
bettermounthud: Better Mount HUD 1.1.4
betternether: Better Nether 7.1.0
bettersleeping: BetterSleeping 0.6.1+1.19
bettersleeping-core: BetterSleeping Core 0.6.1
org_apache_commons_commons-text: commons-text 1.9
betterstats: Better Statistics Screen 1.4.3
bettertridents: Better Tridents 4.0.1
blockrunner: Block Runner 4.2.0
bookshelf: Bookshelf 16.1.5
bosses_of_mass_destruction: Bosses of Mass Destruction (Beta) 1.4.2-1.19
maelstrom_library: Maelstrom Library 1.3-1.19-pre1
multipart_entities: MultipartEntities 1.2-1.19-pre1
botanypots: BotanyPots 9.0.3
botanytrees: BotanyTrees 5.0.1
bountiful: Bountiful 3.0.0
bowinfinityfix: Bow Infinity Fix 2.5.0
brazier: Brazier 5.0.0
byg: Oh The Biomes You'll Go 2.0.0.11
cameraoverhaul: Camera Overhaul 1.3.1-fabric-universal
capybara: Capybara 1.0.8
cardinal-components: Cardinal Components API 5.0.2
cardinal-components-base: Cardinal Components API (base) 5.0.2
cardinal-components-block: Cardinal Components API (blocks) 5.0.2
cardinal-components-chunk: Cardinal Components API (chunks) 5.0.2
cardinal-components-entity: Cardinal Components API (entities) 5.0.2
cardinal-components-item: Cardinal Components API (items) 5.0.2
cardinal-components-level: Cardinal Components API (world saves) 5.0.2
cardinal-components-scoreboard: Cardinal Components API (scoreboard) 5.0.2
cardinal-components-world: Cardinal Components API (worlds) 5.0.2
chalk: Chalk 2.1.0+1.19
charm: Charm 4.4.4
com_moandjiezana_toml_toml4j: toml4j 0.7.2
charmofundying: Charm of Undying 6.0.0+1.19.2
charmonium: Charmonium 4.2.1
chas: Craftable Horse Armour & Saddle 1.19-1.9-Fabric
cleardespawn: Clear Despawn 1.1.9
cloth-config: Cloth Config v8 8.2.88
cloth-basic-math: cloth-basic-math 0.6.1
clumps: Clumps 9.0.0+11
companion: Companion 3.0.0
connectiblechains: Connectible Chains 2.1.3+1.19.2
consistency_plus: Consistency Plus 0.5.1+1.19.2
stonecutter_recipe_tags: Stonecutter Recipe Tags 4.0.0+1.19.9b8d04c.fabric
continuity: Continuity 2.0.1+1.19
creativecore: CreativeCore (Fabric) 2.8.5
creeperoverhaul: Creeper Overhaul 2.0.3
croptopia: Croptopia 2.1.0
com_typesafe_config: config 1.4.1
io_leangen_geantyref_geantyref: geantyref 1.3.11
org_spongepowered_configurate-core: configurate-core 4.1.2
org_spongepowered_configurate-hocon: configurate-hocon 4.1.2
cull-less-leaves: Cull Less Leaves 1.0.6
conditional-mixin: conditional mixin 0.3.0
damagetilt: Damage Tilt 1.19-fabric-0.1.2
dark-loading-screen: Dark Loading Screen 1.6.12
darkpaintings: DarkPaintings 13.1.2
debugify: Debugify 2.7.1
deeperdarker: Deeper And Darker 1.0.2b-fabric
customportalapi: Custom Portal Api 0.0.1-beta52-1.19
deepslatecutting: Deepslate Cutting 1.5.0
maybe-data: Maybe data 1.3.2-1.19.2
detailab: Detail Armor Bar 2.6.2+1.19-fabric
dragonloot: DragonLoot 1.1.2
dummmmmmy: MmmMmmMmmMmm 1.19-1.5.3
easyanvils: Easy Anvils 4.0.1
easymagic: Easy Magic 4.3.2
ecologics: Ecologics 2.1.10
edenring: EdenRing 0.6.4
effective: Effective 1.3
elytraslot: Elytra Slot 6.0.0+1.19.2
enchdesc: EnchantmentDescriptions 13.0.3
endgoblintraders: End Goblin Traders 1.5.1
enhanced_attack_indicator: Enhanced Attack Indicator 1.0.4+1.19
enhancedcelestials: Enhanced Celestials 2.1.0.1
entity_texture_features: Entity Texture Features 4.1.1
org_apache_httpcomponents_httpmime: httpmime 4.5.10
entityculling: EntityCulling-Fabric 1.5.2-mc1.19
com_logisticscraft_occlusionculling: occlusionculling 0.0.6-SNAPSHOT
everycomp: Every Compat 1.19.2-2.0.4
fabric-api: Fabric API 0.62.0+1.19.2
fabric-api-base: Fabric API Base 0.4.12+93d8cb8290
fabric-api-lookup-api-v1: Fabric API Lookup API (v1) 1.6.10+93d8cb8290
fabric-biome-api-v1: Fabric Biome API (v1) 9.0.18+c6af733c90
fabric-blockrenderlayer-v1: Fabric BlockRenderLayer Registration (v1) 1.1.21+c6af733c90
fabric-client-tags-api-v1: Fabric Client Tags 1.0.2+b35fea8390
fabric-command-api-v1: Fabric Command API (v1) 1.2.12+f71b366f90
fabric-command-api-v2: Fabric Command API (v2) 2.1.8+93d8cb8290
fabric-commands-v0: Fabric Commands (v0) 0.2.29+df3654b390
fabric-containers-v0: Fabric Containers (v0) 0.1.35+df3654b390
fabric-content-registries-v0: Fabric Content Registries (v0) 3.3.1+624e468e90
fabric-convention-tags-v1: Fabric Convention Tags 1.1.2+93d8cb8290
fabric-crash-report-info-v1: Fabric Crash Report Info (v1) 0.2.6+aeb40ebe90
fabric-data-generation-api-v1: Fabric Data Generation API (v1) 5.2.0+b598f4ac90
fabric-dimensions-v1: Fabric Dimensions API (v1) 2.1.32+0dd10df690
fabric-entity-events-v1: Fabric Entity Events (v1) 1.4.19+9ff28f4090
fabric-events-interaction-v0: Fabric Events Interaction (v0) 0.4.29+c6af733c90
fabric-events-lifecycle-v0: Fabric Events Lifecycle (v0) 0.2.29+df3654b390
fabric-game-rule-api-v1: Fabric Game Rule API (v1) 1.0.22+c6af733c90
fabric-item-api-v1: Fabric Item API (v1) 1.5.8+93d8cb8290
fabric-item-groups-v0: Fabric Item Groups (v0) 0.3.30+93d8cb8290
fabric-key-binding-api-v1: Fabric Key Binding API (v1) 1.0.21+93d8cb8290
fabric-keybindings-v0: Fabric Key Bindings (v0) 0.2.19+df3654b390
fabric-lifecycle-events-v1: Fabric Lifecycle Events (v1) 2.2.0+33ffe9ec90
fabric-loot-api-v2: Fabric Loot API (v2) 1.1.4+83a8659290
fabric-loot-tables-v1: Fabric Loot Tables (v1) 1.1.7+9e7660c690
fabric-message-api-v1: Fabric Message API (v1) 5.0.4+93d8cb8290
fabric-mining-level-api-v1: Fabric Mining Level API (v1) 2.1.15+33fbc73890
fabric-models-v0: Fabric Models (v0) 0.3.18+c6af733c90
fabric-networking-api-v1: Fabric Networking API (v1) 1.2.5+c6af733c90
fabric-networking-v0: Fabric Networking (v0) 0.3.22+df3654b390
fabric-object-builder-api-v1: Fabric Object Builder API (v1) 4.0.12+93d8cb8290
fabric-particles-v1: Fabric Particles (v1) 1.0.11+79adfe0a90
fabric-registry-sync-v0: Fabric Registry Sync (v0) 0.9.26+c6af733c90
fabric-renderer-api-v1: Fabric Renderer API (v1) 1.0.11+c6af733c90
fabric-renderer-indigo: Fabric Renderer - Indigo 0.6.13+aeb40ebe90
fabric-renderer-registries-v1: Fabric Renderer Registries (v1) 3.2.21+df3654b390
fabric-rendering-data-attachment-v1: Fabric Rendering Data Attachment (v1) 0.3.15+aeb40ebe90
fabric-rendering-fluids-v1: Fabric Rendering Fluids (v1) 3.0.8+c6af733c90
fabric-rendering-v0: Fabric Rendering (v0) 1.1.23+df3654b390
fabric-rendering-v1: Fabric Rendering (v1) 1.11.0+73145abb90
fabric-resource-conditions-api-v1: Fabric Resource Conditions API (v1) 2.0.12+a29562c890
fabric-resource-loader-v0: Fabric Resource Loader (v0) 0.7.0+93d8cb8290
fabric-screen-api-v1: Fabric Screen API (v1) 1.0.27+93d8cb8290
fabric-screen-handler-api-v1: Fabric Screen Handler API (v1) 1.3.1+1cc24b1b90
fabric-textures-v0: Fabric Textures (v0) 1.0.21+aeb40ebe90
fabric-transfer-api-v1: Fabric Transfer API (v1) 2.1.1+93d8cb8290
fabric-transitive-access-wideners-v1: Fabric Transitive Access Wideners (v1) 1.3.1+42d99c3290
fabric-language-kotlin: Fabric Language Kotlin 1.8.4+kotlin.1.7.20
org_jetbrains_kotlin_kotlin-reflect: kotlin-reflect 1.7.20
org_jetbrains_kotlin_kotlin-stdlib: kotlin-stdlib 1.7.20
org_jetbrains_kotlin_kotlin-stdlib-jdk7: kotlin-stdlib-jdk7 1.7.20
org_jetbrains_kotlin_kotlin-stdlib-jdk8: kotlin-stdlib-jdk8 1.7.20
org_jetbrains_kotlinx_atomicfu-jvm: atomicfu-jvm 0.18.3
org_jetbrains_kotlinx_kotlinx-coroutines-core-jvm: kotlinx-coroutines-core-jvm 1.6.4
org_jetbrains_kotlinx_kotlinx-coroutines-jdk8: kotlinx-coroutines-jdk8 1.6.4
org_jetbrains_kotlinx_kotlinx-datetime-jvm: kotlinx-datetime-jvm 0.4.0
org_jetbrains_kotlinx_kotlinx-serialization-cbor-jvm: kotlinx-serialization-cbor-jvm 1.4.0
org_jetbrains_kotlinx_kotlinx-serialization-core-jvm: kotlinx-serialization-core-jvm 1.4.0
org_jetbrains_kotlinx_kotlinx-serialization-json-jvm: kotlinx-serialization-json-jvm 1.4.0
fabricenchantments: Fabric Enchantments 0.8.1
fabricloader: Fabric Loader 0.14.9
fabricshieldlib: Fabric Shield Lib 1.6.0-1.19
mm: Manningham Mills 2.3
fallingleaves: Falling Leaves 1.12.2+1.19.2
ferritecore: FerriteCore 4.2.1
foodplusid: Food+ 1.4.1
forgeconfigapiport: Forge Config API Port 4.2.6
com_electronwill_night-config_core: core 3.6.5
com_electronwill_night-config_toml: toml 3.6.5
forgottenrecipes: ForgottenRecipes 1.19-1.0.0
fwaystones: Fabric Waystones 3.0.4+mc1.19.2
geckolib3: Geckolib 3.1.21
com_eliotlash_mclib_mclib: mclib 19
com_eliotlash_molang_molang: molang 19
go-fish: Go Fish 1.6.0-1.19.1
goblintraders: Goblin Traders 1.5.2
healthoverlay: Health Overlay 7.1.1
fiber: fiber 0.23.0-2
highlighter: Highlighter 1.1.4
iceberg: Iceberg 1.0.46
illuminations: Illuminations 1.10.8
indium: Indium 1.0.9+mc1.19.2
inmis: Inmis 2.7.0-1.19
omega-config: OmegaConfig 1.2.3-1.18.1
inventorysorter: Inventory Sorter 1.8.10-1.19
kyrptconfig: Kyrpt Config 1.4.14-1.19
iris: Iris 1.3.1
org_anarres_jcpp: jcpp 1.4.14
itemmodelfix: Item Model Fix 1.0.3+1.19
jade: Jade 8.2.1
java: OpenJDK 64-Bit Server VM 17
jumpoverfences: Jump Over Fences 1.1.0
kambrik: Kambrik 4.0-1.19.2
krypton: Krypton 0.2.1
com_velocitypowered_velocity-native: velocity-native 3.1.2-SNAPSHOT
lazydfu: LazyDFU 0.1.3
letmedespawn: Let Me Despawn fabric-1.0.2
letsleepingdogslie: Let Sleeping Dogs Lie 1.1.1
lithium: Lithium 0.8.3
mcda: MC Dungeons Armors 2.2.0
mcdar: MC Dungeons Artifacts 1.5.2-1.19
mcdw: MC Dungeons Weapons 5.0.4-1.19
enchant_giver: Enchant Giver 1.3.0
reach-entity-attributes: Reach Entity Attributes 2.3.0
mcwbridges: Macaw's Bridges 2.0.4
mcwdoors: Macaw's Doors 1.0.7
mcwfences: Macaw's Fences and Walls 1.0.6
mcwlights: Macaw's Lights and Lamps 1.0.4
mcwpaintings: Macaw's Paintings 1.0.4
mcwpaths: Macaw's Paths and Pavings 1.0.1
mcwroofs: Macaw's Roofs 2.2.0
mcwtrpdoors: Macaw's Trapdoors 1.0.7
midnightlib: MidnightLib 0.6.1
mimic: Mimic 1.31
minecraft: Minecraft 1.19.2
modmenu: Mod Menu 4.0.6
moonlight: Moonlight 1.19.2-2.0.31
more-axolotls: More Axolotls 1.0.2
moreculling: More Culling 1.19.1-0.10.0
mousetweaks: Mouse Tweaks 2.22
mythicmetals: Mythic Metals 0.16.1+1.19.2
naturalist: Naturalist 2.1.1
notenoughcrashes: Not Enough Crashes 4.1.8+1.19.2
onsoulfire: On Soul Fire 1.19-2
owo: oωo 0.8.3+1.19
blue_endless_jankson: jankson 1.2.1
oxidized: Oxidized 1.7.2
patchouli: Patchouli 1.19.2-76-FABRIC
pickupnotifier: Pick Up Notifier 4.2.0
player-animator: Player Animator 0.3.5
polymorph: Polymorph 0.45.0+1.19.2
puzzleslib: Puzzles Lib 4.3.8
reeses-sodium-options: Reese's Sodium Options 1.4.7+mc1.19.2-build.59
repurposed_structures: Repurposed Structures 6.1.1+1.19
ringsofascension: Rings of Ascension 1.0
roughly_enough_trades: Roughly Enough Trades 1.19-1.0
roughlyenoughitems: Roughly Enough Items 9.1.550
error_notifier: Error Notifier 1.0.9
roughlyenoughprofessions: Roughly Enough Professions 1.1.2
roughlyenoughresources: Roughly Enough Resources 2.6.0
runelic: Runelic 14.1.2
simplyswords: Simply Swords 1.30-1.19.x
sodium: Sodium 0.4.4+build.18
org_joml_joml: joml 1.10.4
sodium-extra: Sodium Extra 0.4.11+mc1.19.2-build.68
caffeineconfig: CaffeineConfig 1.0.0+1.17
sound_physics_remastered: Sound Physics Remastered 1.19.2-1.0.15
splatus_ultris: Ultris: Boss Expansion 5.6.5
starlight: Starlight 1.1.1+fabric.ae22326
supplementaries: Supplementaries 1.19.2-2.2.3
terrablender: TerraBlender 2.0.1.127
terralith: Terralith 2.3
the_bumblezone: The Bumblezone - Fabric 6.1.13+1.19.2
fake-player-api: Fake Player API 0.4.0
things: Things 0.2.20+1.19
thonkutil: ThonkUtil 2.15.4+1.19
thonkutil-base: ThonkUtil Base 1.13.2+4a8c408a57
thonkutil-capes-v1: ThonkUtil Capes (v1) 1.4.2+3eb2749857
thonkutil-coords-v1: ThonkUtil Coords (v1) 1.1.2+8ff533c957
thonkutil-customization-v1: ThonkUtil Customization (v1) 1.1.2+8ff533c957
thonkutil-legacy: ThonkUtil Legacy 1.1.2+5d4263f557
thonkutil-modchecker-v1: ThonkUtil ModChecker (v1) 1.1.3+bd4b387957
thonkutil-potions-v0: ThonkUtil Potions (v0) 1.5.2+8ff533c957
thonkutil-titlescreen-v1: ThonkUtil TitleScreen (v1) 1.2.2+8ff533c957
thonkutil-trades-v1: ThonkUtil Trades (v1) 1.2.2+8ff533c957
tipsmod: Tips 8.0.3
toolstats: ToolStats 12.0.2
trinkets: Trinkets 3.4.0
twilightforest: The Twilight Forest 4.2.299
com_google_code_findbugs_jsr305: jsr305 3.0.2
here-be-no-dragons: Here be no Dragons! 1.0.0
javax_annotation_javax_annotation-api: javax.annotation-api 1.3.2
serialization_hooks: Serialization Hooks 0.3.22
template: TEMPLATE 2.0.547+1.19.2
porting_lib_base: Porting Lib Base 2.0.547+1.19.2
com_github_llamalad7_mixinextras: MixinExtras 0.1.0-rc5
porting_lib_entity: Porting Lib Entity 2.0.547+1.19.2
porting_lib_model_generators: Porting Lib Model Generators 2.0.547+1.19.2
porting_lib_models: Porting Lib Models 2.0.547+1.19.2
porting_lib_networking: Porting Lib Networking 2.0.547+1.19.2
porting_lib_obj_loader: Porting Lib Obj Loader 2.0.547+1.19.2
porting_lib_model_loader: Porting Lib Model Loader 2.0.547+1.19.2
porting_lib_extensions: Porting Lib Extensions 2.0.547+1.19.2
porting_lib_accessors: Porting Lib Accessors 2.0.547+1.19.2
porting_lib_attributes: Porting Lib Attributes 2.0.547+1.19.2
porting_lib_tags: Porting Lib Tags 2.0.547+1.19.2
porting_lib_transfer: Porting Lib Transfer 2.0.547+1.19.2
porting_lib_common: Porting Lib Common 2.0.547+1.19.2
porting_lib_constants: Porting Lib Constants 2.0.547+1.19.2
vanilla_degus: Vanilla Degus 1.3.2
visualworkbench: Visual Workbench 4.2.0
voidz: VoidZ 1.0.9
waterdripsound: Drip Sounds 1.19-0.3.0
windchimes: Windchimes 1.2.1
xp_storage: XP Storage 1.4.3+1.19
xp_storage_trinkets: XP Storage - Trinkets 0.1+1.19
yet-another-config-lib: YetAnotherConfigLib 1.5.0
zoomify: Zoomify 2.7.0
dev_isxander_settxi_settxi-core: settxi-core 2.10.3
dev_isxander_settxi_settxi-gui: settxi-gui 2.10.3
dev_isxander_settxi_settxi-kotlinx-serialization: settxi-kotlinx-serialization 2.10.3
settxi-gui-yacl: Settxi Gui (YetAnotherConfigLib) 2.10.3
Loaded Shaderpack: BSL
Profile: Custom (+10 options changed by user)
Server Running: true
Player Count: 0 / 8; []
Data Packs: vanilla, Fabric Mods, Everycomp Generated Pack, Supplementaries Generated Pack
World Generation: Stable
Type: Integrated Server (map_client.txt)
Is Modded: Definitely; Client brand changed to 'fabric'; Server brand changed to 'fabric'
Launched Version: fabric-loader-0.14.9-1.19.2
Client Crashes Since Restart: 0
Integrated Server Crashes Since Restart: 1
Suspected Mods: None
```
</p>
</details>
[Here is the crash report with Cyanide installed](https://pastebin.com/k2Xzz56m)
[Here is the log](https://pastebin.com/WZKBhDtU)
|
non_process
|
possible incompatibility with terralith feature order cycle hello o im struggling to add mythic metals to my modlist because it crashes on world creation using the latest versions of this mod and alloy forgery modlist works fine without mythic metals here s the crash report minecraft crash report surprise haha well this is awkward time description exception generating new chunk java lang illegalstateexception feature order cycle found involved sources net minecraft class at net minecraft world gen feature util placedfeatureindexer collectindexedfeatures placedfeatureindexer at net minecraft world gen chunk chunkgenerator lambda updatefeaturesperstep chunkgenerator at com google common base suppliers nonserializablememoizingsupplier get suppliers java at net minecraft world gen chunk chunkgenerator generatefeatures chunkgenerator at net minecraft world chunk chunkstatus method chunkstatus at net minecraft world chunk chunkstatus rungenerationtask chunkstatus at net minecraft server world threadedanvilchunkstorage method threadedanvilchunkstorage at com mojang datafixers util either left map either java at net minecraft server world threadedanvilchunkstorage method threadedanvilchunkstorage at java util concurrent completablefuture unicompose tryfire completablefuture java at java util concurrent completablefuture completion run completablefuture java at net minecraft server world chunktaskprioritysystem method chunktaskprioritysystem at net minecraft util thread taskexecutor runnext taskexecutor at net minecraft util thread taskexecutor runwhile taskexecutor at net minecraft util thread taskexecutor run taskexecutor at java util concurrent forkjointask runnableexecuteaction exec forkjointask java at java util concurrent forkjointask doexec forkjointask java at java util concurrent forkjoinpool workqueue toplevelexec forkjoinpool java at java util concurrent forkjoinpool scan forkjoinpool java at java util concurrent forkjoinpool runworker forkjoinpool java at java 
util concurrent forkjoinworkerthread run forkjoinworkerthread java a detailed walkthrough of the error its code path and all known details is as follows head thread render thread stacktrace at net minecraft world gen feature util placedfeatureindexer collectindexedfeatures placedfeatureindexer at net minecraft world gen chunk chunkgenerator lambda updatefeaturesperstep chunkgenerator at com google common base suppliers nonserializablememoizingsupplier get suppliers java at net minecraft world gen chunk chunkgenerator generatefeatures chunkgenerator at net minecraft world chunk chunkstatus method chunkstatus at net minecraft world chunk chunkstatus rungenerationtask chunkstatus at net minecraft server world threadedanvilchunkstorage method threadedanvilchunkstorage at com mojang datafixers util either left map either java at net minecraft server world threadedanvilchunkstorage method threadedanvilchunkstorage at java util concurrent completablefuture unicompose tryfire completablefuture java at java util concurrent completablefuture completion run completablefuture java at net minecraft server world chunktaskprioritysystem method chunktaskprioritysystem at net minecraft util thread taskexecutor runnext taskexecutor at net minecraft util thread taskexecutor runwhile taskexecutor at net minecraft util thread taskexecutor run taskexecutor chunk to be generated details location position hash generator net minecraft class stacktrace at net minecraft class method class java at com mojang datafixers util either left map either java at net minecraft class method class java at java base java util concurrent completablefuture unicompose tryfire completablefuture java at java base java util concurrent completablefuture completion run completablefuture java at net minecraft class method class java at net minecraft class method class java at net minecraft class method class java at net minecraft class run class java at java base java util concurrent forkjointask 
runnableexecuteaction exec forkjointask java at java base java util concurrent forkjointask doexec forkjointask java at java base java util concurrent forkjoinpool workqueue toplevelexec forkjoinpool java at java base java util concurrent forkjoinpool scan forkjoinpool java at java base java util concurrent forkjoinpool runworker forkjoinpool java at java base java util concurrent forkjoinworkerthread run forkjoinworkerthread java affected level details all players total chunk stats level dimension minecraft overworld level spawn location world section at in chunk contains blocks to region contains chunks to blocks to level time game time day time level name new world level game mode game mode survival id hardcore false cheats true level weather rain time now false thunder time now false known server brands fabric level was modded true level storage version anvil stacktrace at net minecraft server minecraftserver method minecraftserver java at net minecraft server minecraftserver method minecraftserver java at net minecraft class method class java at net minecraft server minecraftserver method minecraftserver java at net minecraft server minecraftserver method minecraftserver java at java base java lang thread run thread java system details details minecraft version minecraft version id operating system windows version java version microsoft java vm version openjdk bit server vm mixed mode microsoft memory bytes mib bytes mib up to bytes mib cpus processor vendor genuineintel processor name intel r core tm cpu identifier family model stepping microarchitecture unknown frequency ghz number of physical packages number of physical cpus number of logical cpus graphics card name nvidia geforce gtx graphics card vendor nvidia graphics card vram mb graphics card deviceid graphics card versioninfo driverversion memory slot capacity mb memory slot clockspeed ghz memory slot type memory slot capacity mb memory slot clockspeed ghz memory slot type virtual memory max mb 
virtual memory used mb swap memory total mb swap memory used mb jvm flags total xx heapdumppath mojangtricksinteldriversforperformance javaw exe minecraft exe heapdump fabric mods additionalbanners additionalbanners additionalstructures additional structures adorn adorn adventurez adventurez airhop air hop alloy forgery alloy forgery almost bedtime almost bedtime alternate current alternate current ambientenvironment ambient environment animal feeding trough animal feeding trough appleskin appleskin aqupdcaracal caracal mob config to brigadier archeology capslock archeology mod architectury architectury artifacts artifacts fabric expandability expandability step height entity attribute step height entity attribute attributefix attributefix auditory auditory x axolotlitemfix axolotl item fix basicshields basic shields crowdin translate crowdintranslate bcc bettercompatibilitychecker build bclib bclib beenfo beenfo gbfabrictools gbfabrictools better bad omen better bad omen betteradvancements better advancements betterbiomeblend better biome blend fabric bettercombat better combat betterend better end bettermounthud better mount hud betternether better nether bettersleeping bettersleeping bettersleeping core bettersleeping core org apache commons commons text commons text betterstats better statistics screen bettertridents better tridents blockrunner block runner bookshelf bookshelf bosses of mass destruction bosses of mass destruction beta maelstrom library maelstrom library multipart entities multipartentities botanypots botanypots botanytrees botanytrees bountiful bountiful bowinfinityfix bow infinity fix brazier brazier byg oh the biomes you ll go cameraoverhaul camera overhaul fabric universal capybara capybara cardinal components cardinal components api cardinal components base cardinal components api base cardinal components block cardinal components api blocks cardinal components chunk cardinal components api chunks cardinal components entity cardinal 
components api entities cardinal components item cardinal components api items cardinal components level cardinal components api world saves cardinal components scoreboard cardinal components api scoreboard cardinal components world cardinal components api worlds chalk chalk charm charm com moandjiezana toml charmofundying charm of undying charmonium charmonium chas craftable horse armour saddle fabric cleardespawn clear despawn cloth config cloth config cloth basic math cloth basic math clumps clumps companion companion connectiblechains connectible chains consistency plus consistency plus stonecutter recipe tags stonecutter recipe tags fabric continuity continuity creativecore creativecore fabric creeperoverhaul creeper overhaul croptopia croptopia com typesafe config config io leangen geantyref geantyref geantyref org spongepowered configurate core configurate core org spongepowered configurate hocon configurate hocon cull less leaves cull less leaves conditional mixin conditional mixin damagetilt damage tilt fabric dark loading screen dark loading screen darkpaintings darkpaintings debugify debugify deeperdarker deeper and darker fabric customportalapi custom portal api deepslatecutting deepslate cutting maybe data maybe data detailab detail armor bar fabric dragonloot dragonloot dummmmmmy mmmmmmmmmmmm easyanvils easy anvils easymagic easy magic ecologics ecologics edenring edenring effective effective elytraslot elytra slot enchdesc enchantmentdescriptions endgoblintraders end goblin traders enhanced attack indicator enhanced attack indicator enhancedcelestials enhanced celestials entity texture features entity texture features org apache httpcomponents httpmime httpmime entityculling entityculling fabric com logisticscraft occlusionculling occlusionculling snapshot everycomp every compat fabric api fabric api fabric api base fabric api base fabric api lookup api fabric api lookup api fabric biome api fabric biome api fabric blockrenderlayer fabric 
blockrenderlayer registration fabric client tags api fabric client tags fabric command api fabric command api fabric command api fabric command api fabric commands fabric commands fabric containers fabric containers fabric content registries fabric content registries fabric convention tags fabric convention tags fabric crash report info fabric crash report info fabric data generation api fabric data generation api fabric dimensions fabric dimensions api fabric entity events fabric entity events fabric events interaction fabric events interaction fabric events lifecycle fabric events lifecycle fabric game rule api fabric game rule api fabric item api fabric item api fabric item groups fabric item groups fabric key binding api fabric key binding api fabric keybindings fabric key bindings fabric lifecycle events fabric lifecycle events fabric loot api fabric loot api fabric loot tables fabric loot tables fabric message api fabric message api fabric mining level api fabric mining level api fabric models fabric models fabric networking api fabric networking api fabric networking fabric networking fabric object builder api fabric object builder api fabric particles fabric particles fabric registry sync fabric registry sync fabric renderer api fabric renderer api fabric renderer indigo fabric renderer indigo fabric renderer registries fabric renderer registries fabric rendering data attachment fabric rendering data attachment fabric rendering fluids fabric rendering fluids fabric rendering fabric rendering fabric rendering fabric rendering fabric resource conditions api fabric resource conditions api fabric resource loader fabric resource loader fabric screen api fabric screen api fabric screen handler api fabric screen handler api fabric textures fabric textures fabric transfer api fabric transfer api fabric transitive access wideners fabric transitive access wideners fabric language kotlin fabric language kotlin kotlin org jetbrains kotlin kotlin reflect kotlin reflect 
org jetbrains kotlin kotlin stdlib kotlin stdlib org jetbrains kotlin kotlin stdlib kotlin stdlib org jetbrains kotlin kotlin stdlib kotlin stdlib org jetbrains kotlinx atomicfu jvm atomicfu jvm org jetbrains kotlinx kotlinx coroutines core jvm kotlinx coroutines core jvm org jetbrains kotlinx kotlinx coroutines kotlinx coroutines org jetbrains kotlinx kotlinx datetime jvm kotlinx datetime jvm org jetbrains kotlinx kotlinx serialization cbor jvm kotlinx serialization cbor jvm org jetbrains kotlinx kotlinx serialization core jvm kotlinx serialization core jvm org jetbrains kotlinx kotlinx serialization json jvm kotlinx serialization json jvm fabricenchantments fabric enchantments fabricloader fabric loader fabricshieldlib fabric shield lib mm manningham mills fallingleaves falling leaves ferritecore ferritecore foodplusid food forgeconfigapiport forge config api port com electronwill night config core core com electronwill night config toml toml forgottenrecipes forgottenrecipes fwaystones fabric waystones geckolib com eliotlash mclib mclib mclib com eliotlash molang molang molang go fish go fish goblintraders goblin traders healthoverlay health overlay fiber fiber highlighter highlighter iceberg iceberg illuminations illuminations indium indium inmis inmis omega config omegaconfig inventorysorter inventory sorter kyrptconfig kyrpt config iris iris org anarres jcpp jcpp itemmodelfix item model fix jade jade java openjdk bit server vm jumpoverfences jump over fences kambrik kambrik krypton krypton com velocitypowered velocity native velocity native snapshot lazydfu lazydfu letmedespawn let me despawn fabric letsleepingdogslie let sleeping dogs lie lithium lithium mcda mc dungeons armors mcdar mc dungeons artifacts mcdw mc dungeons weapons enchant giver enchant giver reach entity attributes reach entity attributes mcwbridges macaw s bridges mcwdoors macaw s doors mcwfences macaw s fences and walls mcwlights macaw s lights and lamps mcwpaintings macaw s paintings 
mcwpaths macaw s paths and pavings mcwroofs macaw s roofs mcwtrpdoors macaw s trapdoors midnightlib midnightlib mimic mimic minecraft minecraft modmenu mod menu moonlight moonlight more axolotls more axolotls moreculling more culling mousetweaks mouse tweaks mythicmetals mythic metals naturalist naturalist notenoughcrashes not enough crashes onsoulfire on soul fire owo oωo blue endless jankson jankson oxidized oxidized patchouli patchouli fabric pickupnotifier pick up notifier player animator player animator polymorph polymorph puzzleslib puzzles lib reeses sodium options reese s sodium options build repurposed structures repurposed structures ringsofascension rings of ascension roughly enough trades roughly enough trades roughlyenoughitems roughly enough items error notifier error notifier roughlyenoughprofessions roughly enough professions roughlyenoughresources roughly enough resources runelic runelic simplyswords simply swords x sodium sodium build org joml joml joml sodium extra sodium extra build caffeineconfig caffeineconfig sound physics remastered sound physics remastered splatus ultris ultris boss expansion starlight starlight fabric supplementaries supplementaries terrablender terrablender terralith terralith the bumblezone the bumblezone fabric fake player api fake player api things things thonkutil thonkutil thonkutil base thonkutil base thonkutil capes thonkutil capes thonkutil coords thonkutil coords thonkutil customization thonkutil customization thonkutil legacy thonkutil legacy thonkutil modchecker thonkutil modchecker thonkutil potions thonkutil potions thonkutil titlescreen thonkutil titlescreen thonkutil trades thonkutil trades tipsmod tips toolstats toolstats trinkets trinkets twilightforest the twilight forest com google code findbugs here be no dragons here be no dragons javax annotation javax annotation api javax annotation api serialization hooks serialization hooks template template porting lib base porting lib base com github mixinextras 
mixinextras porting lib entity porting lib entity porting lib model generators porting lib model generators porting lib models porting lib models porting lib networking porting lib networking porting lib obj loader porting lib obj loader porting lib model loader porting lib model loader porting lib extensions porting lib extensions porting lib accessors porting lib accessors porting lib attributes porting lib attributes porting lib tags porting lib tags porting lib transfer porting lib transfer porting lib common porting lib common porting lib constants porting lib constants vanilla degus vanilla degus visualworkbench visual workbench voidz voidz waterdripsound drip sounds windchimes windchimes xp storage xp storage xp storage trinkets xp storage trinkets yet another config lib yetanotherconfiglib zoomify zoomify dev isxander settxi settxi core settxi core dev isxander settxi settxi gui settxi gui dev isxander settxi settxi kotlinx serialization settxi kotlinx serialization settxi gui yacl settxi gui yetanotherconfiglib loaded shaderpack bsl profile custom options changed by user server running true player count data packs vanilla fabric mods everycomp generated pack supplementaries generated pack world generation stable type integrated server map client txt is modded definitely client brand changed to fabric server brand changed to fabric launched version fabric loader client crashes since restart integrated server crashes since restart suspected mods none
| 0
|
5,582
| 8,441,959,213
|
IssuesEvent
|
2018-10-18 11:53:24
|
kiwicom/orbit-components
|
https://api.github.com/repos/kiwicom/orbit-components
|
closed
|
Label in Checkbox component should be optional
|
enhancement processing
|
**Is your feature request related to a problem? Please describe.**
We have a checkbox with a link to our terms and conditions.
**Describe the solution you'd like**
Label for Checkbox component should be optional.
**Describe alternatives you've considered**
- Empty string in label :]
**Additional context**
I can't do this - https://monosnap.com/file/sWBtG6ksyCJYuLhaYHeuG2V32GeuuC
|
1.0
|
Label in Checkbox component should be optional - **Is your feature request related to a problem? Please describe.**
We have a checkbox with a link to our terms and conditions.
**Describe the solution you'd like**
Label for Checkbox component should be optional.
**Describe alternatives you've considered**
- Empty string in label :]
**Additional context**
I can't do this - https://monosnap.com/file/sWBtG6ksyCJYuLhaYHeuG2V32GeuuC
|
process
|
label in checkbox component should be optional is your feature request related to a problem please describe we have a checkbox with a link to our terms and conditions describe the solution you d like label for checkbox component should be optional describe alternatives you ve considered empty string in label additional context i can t do this
| 1
|
301,811
| 22,776,260,146
|
IssuesEvent
|
2022-07-08 14:44:04
|
towhee-io/towhee
|
https://api.github.com/repos/towhee-io/towhee
|
closed
|
[Documentation]:
|
kind/documentation
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### What kind of documentation would you like added or changed?
Hi, can you please provide an example on how to run this:
```
with towhee.api['file']() as api:
app_insert = (
api.image_load['file', 'img']()
.save_image['img', 'path'](dir='tmp/images')
.get_path_id['path', 'id']()
.image_text_embedding.clip['img', 'vec'](model_name='clip_vit_b32',modality='image')
.tensor_normalize['vec', 'vec']()
.milvus_insert[('id', 'vec'), 'res'](collection=milvus_collection)
.select['id', 'path']()
.serve('/insert', app)
)
```
but using a URL instead of the local file? i think clip supports urls but can't find the example
### Why is this needed?
_No response_
### Anything else?
_No response_
|
1.0
|
[Documentation]: - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### What kind of documentation would you like added or changed?
Hi, can you please provide an example on how to run this:
```
with towhee.api['file']() as api:
app_insert = (
api.image_load['file', 'img']()
.save_image['img', 'path'](dir='tmp/images')
.get_path_id['path', 'id']()
.image_text_embedding.clip['img', 'vec'](model_name='clip_vit_b32',modality='image')
.tensor_normalize['vec', 'vec']()
.milvus_insert[('id', 'vec'), 'res'](collection=milvus_collection)
.select['id', 'path']()
.serve('/insert', app)
)
```
but using a URL instead of the local file? i think clip supports urls but can't find the example
### Why is this needed?
_No response_
### Anything else?
_No response_
|
non_process
|
is there an existing issue for this i have searched the existing issues what kind of documentation would you like added or changed hi can you please provide an example on how to run this with towhee api as api app insert api image load save image dir tmp images get path id image text embedding clip model name clip vit modality image tensor normalize milvus insert collection milvus collection select serve insert app but using a url instead of the local file i think clip supports urls but can t find the example why is this needed no response anything else no response
| 0
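The question in the towhee record above — feeding a URL instead of a local file into the pipeline — can be worked around outside the pipeline entirely: download the remote image to a local cache first, then pass that path to the existing `image_load` stage. This is a hedged sketch; `fetch_image` and `url_to_local_path` are illustrative helper names, not part of towhee's API.

```python
import hashlib
import os
import urllib.request


def url_to_local_path(url: str, cache_dir: str = "tmp/images") -> str:
    """Map a remote image URL to a stable local file path."""
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()[:16]
    # Fall back to .jpg when the URL carries no file extension.
    ext = os.path.splitext(url)[1] or ".jpg"
    return os.path.join(cache_dir, digest + ext)


def fetch_image(url: str, cache_dir: str = "tmp/images") -> str:
    """Download the image once and return its local path.

    The returned path can then be fed to the pipeline exactly like a
    local file name, so the pipeline itself needs no changes.
    """
    path = url_to_local_path(url, cache_dir)
    if not os.path.exists(path):
        os.makedirs(cache_dir, exist_ok=True)
        urllib.request.urlretrieve(url, path)
    return path
```

Because the filename is derived from a hash of the URL, repeated calls for the same URL reuse the cached file instead of downloading again.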
|
180,349
| 13,930,114,114
|
IssuesEvent
|
2020-10-22 01:32:47
|
OpenMined/PySyft
|
https://api.github.com/repos/OpenMined/PySyft
|
closed
|
Add torch.Tensor.copy_ to allowlist and test suite
|
Priority: 2 - High :cold_sweat: Severity: 3 - Medium :unamused: Status: Available :wave: Type: New Feature :heavy_plus_sign: Type: Testing :test_tube:
|
# Description
This issue is a part of Syft 0.3.0 Epic 2: https://github.com/OpenMined/PySyft/issues/3696
In this issue, you will be adding support for remote execution of the torch.Tensor.copy_
method or property. This might be a really small project (literally a one-liner) or
it might require adding significant functionality to PySyft OR to the testing suite
in order to make sure the feature is both functional and tested.
## Step 0: Run tests and ./scripts/pre_commit.sh
Before you get started with this project, let's make sure you have everything building and testing
correctly. Clone the codebase and run:
```pip uninstall syft```
followed by
```pip install -e .```
Then run the pre-commit file (which will also run the tests)
```./scripts/pre_commit.sh```
If all of these tests pass, continue on. If not, make sure you have all the
dependencies in requirements.txt installed, etc.
## Step 1: Uncomment your method in the allowlist.py file
Inside [allowlist.py](https://github.com/OpenMined/PySyft/blob/syft_0.3.0/src/syft/lib/torch/allowlist.py) you will find a huge dictionary of methods. Find your method and uncomment the line its on. At the time
of writing this Issue (WARNING: THIS MAY HAVE CHANGED) the dictionary maps from the
string name of the method (in your case 'torch.Tensor.copy_') to the string representation
of the type the method returns.
## Step 2: Run Unit Tests
Run the following:
```python setup.py test```
And wait to see if some of the tests fail. Why might the tests fail now? I'm so glad you asked!
https://github.com/OpenMined/PySyft/blob/syft_0.3.0/tests/syft/lib/torch/tensor/tensor_remote_method_api_suite_test.py
In this file you'll find the torch method test suite. It AUTOMATICALLY loads all methods
from the allowlist.py file you modified in the previous step. It attempts to test them.
# Step 3: If you get a Failing Test
If you get a failing test, this could be for one of a few reasons:
### Reason 1 - The testing suite passed in non-compatible arguments
The testing suite is pretty dumb. It literally just has a permutation of possible
arguments to pass into every method on torch tensors. So, if one of those permutations
doesn't work for your method (aka... perhaps it tries to call your method without
any arguments but torch.Tensor.copy_ actually requires some) then the test will
fail if the error hasn't been seen before.
If this happens - don't worry! Just look inside the only test in that file and look
for the huge lists of error types to ignore. Add your error to the list and keep
going!!!
*WARNING:* make sure that the testing suite actually tests your method via remote
execution once you've gotten all the tests passing. Aka - if the testing suite
doesn't have ANY matching argument permutations for your method, then you're
literally creating a bunch of unit tests that do absolutely nothing. If this is the
case, then ADD MORE ARGUMENT TYPES TO THE TESTING SUITE so that your argument
gets run via remote execution. DO NOT CLOSE THIS ISSUE until you can verify that
torch.Tensor.copy_ is actually executed remotely inside of a unit test (and not
skipped). Aka - at least one of the test_all_allowlisted_tensor_methods_work_remotely_on_all_types
unit tests with your method should run ALL THE WAY TO THE END (instead of skipping
the last part.)
*Note:* adding another argument type might require some serialization work if
we don't support arguments of that type yet. If so, this is your job to add it
to the protobuf files in order to close this issue!
### Reason 2 - torch.Tensor.copy_ returns a non-supported type
If this happens, you've got a little bit of work in front of you. We don't have
pointer objects to very many remote object types. So, if your method returns anything
other than a single tensor, you probably need to add support for the type it returns
(Such as a bool, None, int, or other types).
*IMPORTANT:* do NOT return the value itself to the end user!!! Return a pointer object
to that type!
*NOTE:* at the time of writing - there are several core pieces of Syft not yet working
to allow you to return any type other than a torch tensor. If you're not comfortable
investigating what those might be - skip this issue and try again later once
someone else has solved these issues.
### Reason 3 - There's something else broken
Chase those stack traces! Talk to friends in Slack. Look at how other methods are supported.
This is a challenging project in a fast moving codebase!
And don't forget - if this project seems too complex - there are plenty of others that
might be easier.
|
2.0
|
Add torch.Tensor.copy_ to allowlist and test suite -
# Description
This issue is a part of Syft 0.3.0 Epic 2: https://github.com/OpenMined/PySyft/issues/3696
In this issue, you will be adding support for remote execution of the torch.Tensor.copy_
method or property. This might be a really small project (literally a one-liner) or
it might require adding significant functionality to PySyft OR to the testing suite
in order to make sure the feature is both functional and tested.
## Step 0: Run tests and ./scripts/pre_commit.sh
Before you get started with this project, let's make sure you have everything building and testing
correctly. Clone the codebase and run:
```pip uninstall syft```
followed by
```pip install -e .```
Then run the pre-commit file (which will also run the tests)
```./scripts/pre_commit.sh```
If all of these tests pass, continue on. If not, make sure you have all the
dependencies in requirements.txt installed, etc.
## Step 1: Uncomment your method in the allowlist.py file
Inside [allowlist.py](https://github.com/OpenMined/PySyft/blob/syft_0.3.0/src/syft/lib/torch/allowlist.py) you will find a huge dictionary of methods. Find your method and uncomment the line its on. At the time
of writing this Issue (WARNING: THIS MAY HAVE CHANGED) the dictionary maps from the
string name of the method (in your case 'torch.Tensor.copy_') to the string representation
of the type the method returns.
## Step 2: Run Unit Tests
Run the following:
```python setup.py test```
And wait to see if some of the tests fail. Why might the tests fail now? I'm so glad you asked!
https://github.com/OpenMined/PySyft/blob/syft_0.3.0/tests/syft/lib/torch/tensor/tensor_remote_method_api_suite_test.py
In this file you'll find the torch method test suite. It AUTOMATICALLY loads all methods
from the allowlist.py file you modified in the previous step. It attempts to test them.
# Step 3: If you get a Failing Test
If you get a failing test, this could be for one of a few reasons:
### Reason 1 - The testing suite passed in non-compatible arguments
The testing suite is pretty dumb. It literally just has a permutation of possible
arguments to pass into every method on torch tensors. So, if one of those permutations
doesn't work for your method (aka... perhaps it tries to call your method without
any arguments but torch.Tensor.copy_ actually requires some) then the test will
fail if the error hasn't been seen before.
If this happens - don't worry! Just look inside the only test in that file and look
for the huge lists of error types to ignore. Add your error to the list and keep
going!!!
*WARNING:* make sure that the testing suite actually tests your method via remote
execution once you've gotten all the tests passing. Aka - if the testing suite
doesn't have ANY matching argument permutations for your method, then you're
literally creating a bunch of unit tests that do absolutely nothing. If this is the
case, then ADD MORE ARGUMENT TYPES TO THE TESTING SUITE so that your argument
gets run via remote execution. DO NOT CLOSE THIS ISSUE until you can verify that
torch.Tensor.copy_ is actually executed remotely inside of a unit test (and not
skipped). Aka - at least one of the test_all_allowlisted_tensor_methods_work_remotely_on_all_types
unit tests with your method should run ALL THE WAY TO THE END (instead of skipping
the last part.)
*Note:* adding another argument type might require some serialization work if
we don't support arguments of that type yet. If so, this is your job to add it
to the protobuf files in order to close this issue!
### Reason 2 - torch.Tensor.copy_ returns a non-supported type
If this happens, you've got a little bit of work in front of you. We don't have
pointer objects to very many remote object types. So, if your method returns anything
other than a single tensor, you probably need to add support for the type it returns
(Such as a bool, None, int, or other types).
*IMPORTANT:* do NOT return the value itself to the end user!!! Return a pointer object
to that type!
*NOTE:* at the time of writing - there are several core pieces of Syft not yet working
to allow you to return any type other than a torch tensor. If you're not comfortable
investigating what those might be - skip this issue and try again later once
someone else has solved these issues.
### Reason 3 - There's something else broken
Chase those stack traces! Talk to friends in Slack. Look at how other methods are supported.
This is a challenging project in a fast moving codebase!
And don't forget - if this project seems too complex - there are plenty of others that
might be easier.
|
non_process
|
add torch tensor copy to allowlist and test suite description this issue is a part of syft epic in this issue you will be adding support for remote execution of the torch tensor copy method or property this might be a really small project literally a one liner or it might require adding significant functionality to pysyft or to the testing suite in order to make sure the feature is both functional and tested step run tests and scripts pre commit sh before you get started with this project let s make sure you have everything building and testing correctly clone the codebase and run pip uninstall syft followed by pip install e then run the pre commit file which will also run the tests scripts pre commit sh if all of these tests pass continue on if not make sure you have all the dependencies in requirements txt installed etc step uncomment your method in the allowlist py file inside you will find a huge dictionary of methods find your method and uncomment the line its on at the time of writing this issue warning this may have changed the dictionary maps from the string name of the method in your case torch tensor copy to the string representation of the type the method returns step run unit tests run the following python setup py test and wait to see if some of the tests fail why might the tests fail now i m so glad you asked in this file you ll find the torch method test suite it automatically loads all methods from the allowlist py file you modified in the previous step it attempts to test them step if you get a failing test if you get a failing test this could be for one of a few reasons reason the testing suite passed in non compatible arguments the testing suite is pretty dumb it literally just has a permutation of possible arguments to pass into every method on torch tensors so if one of those permutations doesn t work for your method aka perhaps it tries to call your method without any arguments but torch tensor copy actually requires some then the test will 
fail if the error hasn t been seen before if this happens don t worry just look inside the only test in that file and look for the huge lists of error types to ignore add your error to the list and keep going warning make sure that the testing suite actually tests your method via remote execution once you ve gotten all the tests passing aka if the testing suite doesn t have any matching argument permutations for your method then you re literally creating a bunch of unit tests that do absolutely nothing if this is the case then add more argument types to the testing suite so that your argument gets run via remote execution do not close this issue until you can verify that torch tensor copy is actually executed remotely inside of a unit tests and not skipped aka at least one of the test all allowlisted tensor methods work remotely on all types unit tests with your method should run all the way to the end instead of skipping the last part note adding another argument type might require some serialization work if we don t support arguments of that type yet if so this is your job to add it to the protobuf files in order to close this issue reason torch tensor copy returns a non supported type if this happens you ve got a little bit of work in front of you we don t have pointer objects to very many remote object types so if your method returns anything other than a single tensor you probably need to add support for the type it returns such as a bool none int or other types important do not return the value itself to the end user return a pointer object to that type note at the time of writing there are several core pieces of syft not yet working to allow you to return any type other than a torch tensor if you re not comfortable investigating what those might be skip this issue and try again later once someone else has solved these issues reason there s something else broken chase those stack traces talk to friends in slack look at how other methods are supported this is 
a challenging project in a fast moving codebase and don t forget if this project seems too complex there are plenty of others that might be easier
| 0
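The allowlist step in the PySyft record above boils down to a dictionary mapping a method's fully qualified string name to the string name of its return type. A minimal sketch of that shape (illustrative only — the real mapping lives in `src/syft/lib/torch/allowlist.py` and is far larger):

```python
# Illustrative shape of the torch-method allowlist described above.
# Keys are fully qualified method names; values name the return type
# that the remote-execution machinery should wrap in a pointer.
allowlist = {
    "torch.Tensor.add": "torch.Tensor",
    "torch.Tensor.copy_": "torch.Tensor",  # the line the issue asks you to uncomment
}


def return_type(method_name: str) -> str:
    """Look up the declared remote return type for an allowlisted method."""
    try:
        return allowlist[method_name]
    except KeyError:
        raise ValueError(f"{method_name} is not allowlisted for remote execution")
```

Uncommenting an entry is what makes the auto-generated test suite pick the method up, which is why step 2 of the issue re-runs the tests immediately afterwards.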
|
16,368
| 21,056,725,977
|
IssuesEvent
|
2022-04-01 04:38:39
|
fei-protocol/fei-protocol-core
|
https://api.github.com/repos/fei-protocol/fei-protocol-core
|
closed
|
Git Flow
|
devops/process
|
The Problem: Our git flow is not very analogous to what actually happens, especially around dao proposals, and often causes 3-4 day periods where we expect integration tests to fail. Also, we often merge code to our develop/working branch that never hits mainnet.
Let's discuss potential git flows that are more suited for our use-case.
Statements that seem applicable/helpful to this situation, mostly subjective:
- we should have a branch that represents "the canonical, current state of the protocol"
- we should have a branch that represents "the expected near-future state of the protocol" so that developers can rebase off of others' work for parallel, but dependent features
- we should not merge code to the branch that represents "production" until it actually hits production (ie, the dao transaction executes)
- the branch that represents "production" should almost never have failing tests - unit or integration
Right now our git flow involves merging code to develop when it is complete & ready to go to the dao. Thanks to some awesome work by @thomas-waite , the relevant integration tests fail less often, and dao proposals get simulated when needed more often, but this hasn't solved everything. We have a theoretical "release" process, but no metric for when to make a release or what master represents other than "more absolute than develop".
|
1.0
|
Git Flow - The Problem: Our git flow is not very analogous to what actually happens, especially around dao proposals, and often causes 3-4 day periods where we expect integration tests to fail. Also, we often merge code to our develop/working branch that never hits mainnet.
Let's discuss potential git flows that are more suited for our use-case.
Statements that seem applicable/helpful to this situation, mostly subjective:
- we should have a branch that represents "the canonical, current state of the protocol"
- we should have a branch that represents "the expected near-future state of the protocol" so that developers can rebase off of others' work for parallel, but dependent features
- we should not merge code to the branch that represents "production" until it actually hits production (ie, the dao transaction executes)
- the branch that represents "production" should almost never have failing tests - unit or integration
Right now our git flow involves merging code to develop when it is complete & ready to go to the dao. Thanks to some awesome work by @thomas-waite , the relevant integration tests fail less often, and dao proposals get simulated when needed more often, but this hasn't solved everything. We have a theoretical "release" process, but no metric for when to make a release or what master represents other than "more absolute than develop".
|
process
|
git flow the problem our git flow is not very analogous to what actually happens especially around dao proposals and often causes day periods where we expect integration tests to fail also we often merge code to our develop working branch that never hits mainnet let s discuss potential git flows that are more suited for our use case statements that seem applicable helpful to this situation mostly subjective we should have a branch that represents the canonical current state of the protocol we should have a branch that represents the expected near future state of the protocol so that developers can rebase off of others work for parallel but dependent features we should not merge code to the branch that represents production until it actually hits production ie the dao transaction executes the branch that represents production should almost never have failing tests unit or integration right now our git flow involves merging code to develop when it is complete ready to go to the dao thanks to some awesome work by thomas waite the relevant integration tests fail less often and dao proposals get simulated when needed more often but this hasn t solved everything we have a theoretical release process but no metric for when to make a release or what master represents other than more absolute than develop
| 1
|
18,983
| 24,975,152,912
|
IssuesEvent
|
2022-11-02 07:00:25
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
MF in BP ontology
|
multi-species process
|
GO:0052054
negative regulation by symbiont of host peptidase activity
Any process in which an organism stops, prevents, or reduces the frequency, rate or extent of host protease activity, the catalysis of the hydrolysis of peptide bonds in a protein. The host is defined as the larger of the organisms involved in a symbiotic interaction.
this should be GO:0004866 endopeptidase inhibitor activity
has one annotation
https://www.uniprot.org/uniprot/Q6PQH2
(this IDA annotation could just be moved to the appropriate function term, currently provided by a KW mapping)
|
1.0
|
MF in BP ontology - GO:0052054
negative regulation by symbiont of host peptidase activity
Any process in which an organism stops, prevents, or reduces the frequency, rate or extent of host protease activity, the catalysis of the hydrolysis of peptide bonds in a protein. The host is defined as the larger of the organisms involved in a symbiotic interaction.
this should be GO:0004866 endopeptidase inhibitor activity
has one annotation
https://www.uniprot.org/uniprot/Q6PQH2
(this IDA annotation could just be moved to the appropriate function term, currently provided by a KW mapping)
|
process
|
mf in bp ontology go negative regulation by symbiont of host peptidase activity any process in which an organism stops prevents or reduces the frequency rate or extent of host protease activity the catalysis of the hydrolysis of peptide bonds in a protein the host is defined as the larger of the organisms involved in a symbiotic interaction this should be go endopeptidase inhibitor activity has one annotation this ida annotation could just be moved to the appropriate function term currently provided by a kw mapping
| 1
|
111,711
| 14,138,883,750
|
IssuesEvent
|
2020-11-10 09:05:09
|
milanbargiel/kulturgenerator
|
https://api.github.com/repos/milanbargiel/kulturgenerator
|
closed
|
[App] Finalize content of info page
|
wait for design
|
### Requirements
- Text on info & submission page https://staging.kulturgenerator.org/about is final
- It has been decided, which statistics are to be presented (e.g. Time since existence, number of artworks, number of sold artworks
- Statistics are based on real values from backend
- The layout of the info page looks like defined in @ludwiglederer's design manual
[designmanual2.pdf](https://github.com/milanbargiel/kulturgenerator/files/5500048/designmanual2.pdf)
### Scope
There are separate issues for the art-submission form and the footer.
closes #141
|
1.0
|
[App] Finalize content of info page - ### Requirements
- Text on info & submission page https://staging.kulturgenerator.org/about is final
- It has been decided, which statistics are to be presented (e.g. Time since existence, number of artworks, number of sold artworks
- Statistics are based on real values from backend
- The layout of the info page looks like defined in @ludwiglederer's design manual
[designmanual2.pdf](https://github.com/milanbargiel/kulturgenerator/files/5500048/designmanual2.pdf)
### Scope
There are separate issues for the art-submission form and the footer.
closes #141
|
non_process
|
finalize content of info page requirements text on info submission page is final it has been decided which statistics are to be presented e g time since existence number of artworks number of sold artworks statistics are based on real values from backend the layout of the info page looks like defined in ludwiglederer s design manual scope there are separate issues for the art submission form and the footer closes
| 0
|
14,285
| 17,260,784,111
|
IssuesEvent
|
2021-07-22 07:17:29
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Android] User should be redirected to Password help screen for the following scenario
|
Android Bug P2 Process: Fixed Process: Tested QA Process: Tested dev
|
**Steps**
1. Install mobile app in the device
2. Open the app
3. Click on Sign up
4. Enter all the required field and click on submit
5. After navigating to the verification step, without entering the verification code kill the app
6. Open the app
7. Click on forgot password
8. Enter a valid email and click on submit button
9. Enter verification code and click on done
10. Observe
**AR:** User is redirected to the Sign-in screen
**ER:** User should be redirected to Password help screen
[**Note:** This is a scenario where a user attempts 'forgot password' without verifying the account]
|
3.0
|
[Android] User should be redirected to Password help screen for the following scenario - **Steps**
1. Install mobile app in the device
2. Open the app
3. Click on Sign up
4. Enter all the required field and click on submit
5. After navigating to the verification step, without entering the verification code kill the app
6. Open the app
7. Click on forgot password
8. Enter a valid email and click on submit button
9. Enter verification code and click on done
10. Observe
**AR:** User is redirected to the Sign-in screen
**ER:** User should be redirected to Password help screen
[**Note:** This is a scenario where a user attempts 'forgot password' without verifying the account]
|
process
|
user should be redirected to password help screen for the following scenario steps install mobile app in the device open the app click on sign up enter all the required field and click on submit after navigating to the verification step without entering the verification code kill the app open the app click on forgot password enter a valid email and click on submit button enter verification code and click on done observe ar user is redirected to the sign in screen er user should be redirected to password help screen
| 1
|
11,770
| 14,600,141,990
|
IssuesEvent
|
2020-12-21 06:10:27
|
cypress-io/cypress-documentation
|
https://api.github.com/repos/cypress-io/cypress-documentation
|
closed
|
Speed up Cypress tests by splitting the dominant spec
|
process: ci
|
We have one spec that dominates the duration of the run.
<img width="1091" alt="Screen Shot 2020-12-15 at 10 39 42 AM" src="https://user-images.githubusercontent.com/2212006/102236742-da7a7200-3ec1-11eb-8ddd-3fcc4f4fa7ff.png">
We should split this spec to speed things up.
|
1.0
|
Speed up Cypress tests by splitting the dominant spec - We have one spec that dominates the duration of the run.
<img width="1091" alt="Screen Shot 2020-12-15 at 10 39 42 AM" src="https://user-images.githubusercontent.com/2212006/102236742-da7a7200-3ec1-11eb-8ddd-3fcc4f4fa7ff.png">
We should split this spec to speed things up.
|
process
|
speed up cypress tests by splitting the dominant spec we have one spec that dominates the duration of the run img width alt screen shot at am src we should split this spec to speed things up
| 1
|
6,470
| 9,546,700,656
|
IssuesEvent
|
2019-05-01 20:43:31
|
openopps/openopps-platform
|
https://api.github.com/repos/openopps/openopps-platform
|
closed
|
Department of State: Experience
|
Apply Process Approved Requirements Ready State Dept.
|
Who: Student Applicant
What: Experience and references page
Why: As a student applicant I want to provide my experience in my application
A/C
- The information will be populated from the users USAJOBS profile
- "All fields are required unless otherwise noted" (right margin)
- There will be a header: Experience & References (Bold)
- Header: Experience
- There will be a card for each experience entry with the following information:
- There will be a + that will open the card to expose the information
- Experience Title
- The agency or company name (Bold)
- The date range of the experience
- The "+Add experience" link will take the user to a blank Experience & Reference page
- https://opm.invisionapp.com/d/main/#/console/15360465/319289332/preview
- When the user clicks the "save experience" button the user will return to the "Education & References" page and the new experience will be listed
- When the user clicks the "cancel and return" button the user will return to the "Education & References" page and any information they entered will be discarded
- Adding work experience in Open Opportunities during the application process will not update the user's USAJOBS profile with that work experience.
Invision Mock: https://opm.invisionapp.com/d/main/#/console/15360465/319289315/preview
Public Link : https://opm.invisionapp.com/share/ZEPNZR09Q54
|
1.0
|
Department of State: Experience - Who: Student Applicant
What: Experience and references page
Why: As a student applicant I want to provide my experience in my application
A/C
- The information will be populated from the users USAJOBS profile
- "All fields are required unless otherwise noted" (right margin)
- There will be a header: Experience & References (Bold)
- Header: Experience
- There will be a card for each experience entry with the following information:
- There will be a + that will open the card to expose the information
- Experience Title
- The agency or company name (Bold)
- The date range of the experience
- The "+Add experience" link will take the user to a blank Experience & Reference page
- https://opm.invisionapp.com/d/main/#/console/15360465/319289332/preview
- When the user clicks the "save experience" button the user will return to the "Education & References" page and the new experience will be listed
- When the user clicks the "cancel and return" button the user will return to the "Education & References" page and any information they entered will be discarded
- Adding work experience in Open Opportunities during the application process will not update the user's USAJOBS profile with that work experience.
Invision Mock: https://opm.invisionapp.com/d/main/#/console/15360465/319289315/preview
Public Link : https://opm.invisionapp.com/share/ZEPNZR09Q54
|
process
|
department of state experience who student applicant what experience and references page why as a student applicant i want to provide my experience in my application a c the information will be populated from the users usajobs profile all fields are required unless otherwise noted right margin there will be a header experience references bold header experience there will be a card for each experience entry with the following information there will be a that will open the card to expose the information experience title the agency or company name bold the date range of the experience the add experience link will take the user to a blank experience reference page when the user clicks the save experience button the user will return to the education references page and the new experience will be listed when the user clicks the cancel and return button the user will return to the education references page and any information they entered will be discarded adding work experience in open opportunities during the application process will not update the user s usajobs profile with that work experience invision mock public link
| 1
|
229,079
| 25,288,921,589
|
IssuesEvent
|
2022-11-16 21:55:21
|
github/roadmap
|
https://api.github.com/repos/github/roadmap
|
closed
|
Enterprise-level security overview (server beta)
|
beta github advanced security security & compliance server GHES 3.5
|
### Summary
Security overview aggregates security and compliance results. This data is already visible at the organization level - the next step is to make it available at the [enterprise level](https://docs.github.com/en/enterprise-server@3.3/admin/overview/about-enterprise-accounts).
In its initial (beta) release the enterprise-level security overview will include a repo-centric view of alert aggregates and an alert-centric view of all secret scanning alerts across an enterprise. We will add alert-centric views for Dependabot alerts and code scanning alerts before declaring the view generally available.
### Intended Outcome
Security teams working across many organizations within an enterprise will have a single user interface that they can see all alerts in. In addition, teams that wish to automate actions at the enterprise level will be able to use enterprise-level security APIs for each alert type (e.g., the API already released for [secret scanning](https://docs.github.com/en/rest/reference/secret-scanning#list-secret-scanning-alerts-for-an-enterprise)).
|
True
|
Enterprise-level security overview (server beta) - ### Summary
Security overview aggregates security and compliance results. This data is already visible at the organization level - the next step is to make it available at the [enterprise level](https://docs.github.com/en/enterprise-server@3.3/admin/overview/about-enterprise-accounts).
In its initial (beta) release the enterprise-level security overview will include a repo-centric view of alert aggregates and an alert-centric view of all secret scanning alerts across an enterprise. We will add alert-centric views for Dependabot alerts and code scanning alerts before declaring the view generally available.
### Intended Outcome
Security teams working across many organizations within an enterprise will have a single user interface that they can see all alerts in. In addition, teams that wish to automate actions at the enterprise level will be able to use enterprise-level security APIs for each alert type (e.g., the API already released for [secret scanning](https://docs.github.com/en/rest/reference/secret-scanning#list-secret-scanning-alerts-for-an-enterprise)).
|
non_process
|
enterprise level security overview server beta summary security overview aggregates security and compliance results this data is already visible at the organization level the next step is to make it available at the in its initial beta release the enterprise level security overview will include a repo centric view of alert aggregates and an alert centric view of all secret scanning alerts across an enterprise we will add alert centric views for dependabot alerts and code scanning alerts before declaring the view generally available intended outcome security teams working across many organizations within an enterprise will have a single user interface that they can see all alerts in in addition teams that wish to automate actions at the enterprise level will be able to use enterprise level security enterprise level apis for each alert type e g the api already released for
| 0
|
654,032
| 21,634,599,523
|
IssuesEvent
|
2022-05-05 13:12:04
|
gama-platform/gama
|
https://api.github.com/repos/gama-platform/gama
|
closed
|
No display on Wayland backend (like on Ubuntu 22.04)
|
😱 Bug OS Linux 🖥 Display OpenGL Priority High 📺 Display Java2D V. 1.8.2
|
**Describe the bug**
The new release of Ubuntu (22.04) did switch default window manager from X11 to Wayland.
Unfortunately, it completely changes all the low-level graphical logic:
- Java2D displays don't work

- It has huge conflicts with OpenGL, making the whole software crash.
**To Reproduce**
Steps to reproduce the behavior:
0. From fresh Ubuntu 22.04
1. Install GAMA with any JDK 17
2. Open GAMA
3. Start an experiment with an opengl display
4. See instant GAMA crash
Here are the gathered logs:
```
/opt/gama-platform/Gama
Reading flag use_global_preference_store with value true
Reading flag read_only with value false
Reading flag use_old_tabs with value true
Reading flag use_legacy_drawers with value false
Reading flag use_delayed_resize with value false
System property swt.autoScale = null
System property sun.java2d.uiScale.enabled = false -- changed to = false
> GAMA: version 1.8.2 loading on____ linux 5.15.0-27-generic, x86_64, JDK 17.0.3
> JAI : ImageIO extensions loaded for____ jpg||tiff|bmp|gif|arx|tf8|TF8|png|ppm|jp2|tif|TIF|asc|TIFF|btf|BTF|pgm|wbmp|jpeg|pbm
> GAMA: msi.gama.core loaded in_____ 1664ms
> GAMA: simtools.gaml.extensions.traffic loaded in_____ 67ms
> GAMA: irit.gaml.extensions.database loaded in_____ 19ms
> GAMA: msi.gaml.extensions.fipa loaded in_____ 26ms
> GAMA: msi.gaml.architecture.simplebdi loaded in_____ 94ms
> GAMA: ummisco.gaml.extensions.maths loaded in_____ 40ms
> GAMA: ummisco.gaml.extensions.stats loaded in_____ 34ms
> GAMA: miat.gaml.extension.pedestrian loaded in_____ 23ms
> GAMA: msi.gama.lang.gaml loaded in_____ 8ms
> GAMA: ummisco.gama.network loaded in_____ 11ms
> GAMA: simtools.gaml.extensions.physics loaded in_____ 20ms
> GAMA: ummisco.gama.opengl loaded in_____ 9ms
> GAMA: ummisco.gama.java2d loaded in_____ 1ms
> GAMA: ummisco.gama.serialize loaded in_____ 24ms
> GAMA: msi.gama.headless loaded in_____ 5ms
> GAMA: espacedev.gaml.extensions.genstar loaded in_____ 21ms
> GAMA: all plugins loaded in_____ 2231ms
Value of clearWorkspace pref: false
WaylandCompositor requires eglBindWaylandDisplayWL, eglUnbindWaylandDisplayWL and eglQueryWaylandBuffer.
Nested Wayland compositor could not initialize EGL
Part Opened:Interactive console
Arguments received by GAMA : []
> GAMA: GAML artefacts built in______ 45ms
> GAMA: preferences loaded in_____ 8325ms
Part Opened:Parameters
Part Opened:Console
Part Opened:Parameters
Loading redefined SWTAccessor
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007f6677f01497, pid=10188, tid=10189
#
# JRE version: OpenJDK Runtime Environment (17.0.3+7) (build 17.0.3+7-Ubuntu-0ubuntu0.22.04.1)
# Java VM: OpenJDK 64-Bit Server VM (17.0.3+7-Ubuntu-0ubuntu0.22.04.1, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# C [libX11.so.6+0x41497] _XSend+0x37
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport %p %s %c %d %P %E" (or dumping to /home/ubuntu/Downloads/core.10188)
#
# An error report file with more information is saved as:
# /home/ubuntu/Downloads/hs_err_pid10188.log
[thread 10345 also had an error]
#
# If you would like to submit a bug report, please visit:
# Unknown
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
```
**Expected behavior**
No crash, and displays working 🙃
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: Ubuntu 22.04
- GAMA version: 1.8.2
- Java version: 17
- Graphics cards / Display system: VM system / Wayland
**Additional context**
There's a workaround by setting the environment variable back to X11 (i.e. forcing to use this backend) : `GDK_BACKEND=x11`
This way, launching GAMA like this : `GDK_BACKEND=x11 ./Gama` doesn't have any of the issue listed above
|
1.0
|
No display on Wayland backend (like on Ubuntu 22.04) - **Describe the bug**
The new release of Ubuntu (22.04) did switch default window manager from X11 to Wayland.
Unfortunately, it completely changes all the low-level graphical logic:
- Java2D displays don't work

- It has huge conflicts with OpenGL, making the whole software crash.
**To Reproduce**
Steps to reproduce the behavior:
0. From fresh Ubuntu 22.04
1. Install GAMA with any JDK 17
2. Open GAMA
3. Start an experiment with an opengl display
4. See instant GAMA crash
Here are the gathered logs:
```
/opt/gama-platform/Gama
Reading flag use_global_preference_store with value true
Reading flag read_only with value false
Reading flag use_old_tabs with value true
Reading flag use_legacy_drawers with value false
Reading flag use_delayed_resize with value false
System property swt.autoScale = null
System property sun.java2d.uiScale.enabled = false -- changed to = false
> GAMA: version 1.8.2 loading on____ linux 5.15.0-27-generic, x86_64, JDK 17.0.3
> JAI : ImageIO extensions loaded for____ jpg||tiff|bmp|gif|arx|tf8|TF8|png|ppm|jp2|tif|TIF|asc|TIFF|btf|BTF|pgm|wbmp|jpeg|pbm
> GAMA: msi.gama.core loaded in_____ 1664ms
> GAMA: simtools.gaml.extensions.traffic loaded in_____ 67ms
> GAMA: irit.gaml.extensions.database loaded in_____ 19ms
> GAMA: msi.gaml.extensions.fipa loaded in_____ 26ms
> GAMA: msi.gaml.architecture.simplebdi loaded in_____ 94ms
> GAMA: ummisco.gaml.extensions.maths loaded in_____ 40ms
> GAMA: ummisco.gaml.extensions.stats loaded in_____ 34ms
> GAMA: miat.gaml.extension.pedestrian loaded in_____ 23ms
> GAMA: msi.gama.lang.gaml loaded in_____ 8ms
> GAMA: ummisco.gama.network loaded in_____ 11ms
> GAMA: simtools.gaml.extensions.physics loaded in_____ 20ms
> GAMA: ummisco.gama.opengl loaded in_____ 9ms
> GAMA: ummisco.gama.java2d loaded in_____ 1ms
> GAMA: ummisco.gama.serialize loaded in_____ 24ms
> GAMA: msi.gama.headless loaded in_____ 5ms
> GAMA: espacedev.gaml.extensions.genstar loaded in_____ 21ms
> GAMA: all plugins loaded in_____ 2231ms
Value of clearWorkspace pref: false
WaylandCompositor requires eglBindWaylandDisplayWL, eglUnbindWaylandDisplayWL and eglQueryWaylandBuffer.
Nested Wayland compositor could not initialize EGL
Part Opened:Interactive console
Arguments received by GAMA : []
> GAMA: GAML artefacts built in______ 45ms
> GAMA: preferences loaded in_____ 8325ms
Part Opened:Parameters
Part Opened:Console
Part Opened:Parameters
Loading redefined SWTAccessor
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007f6677f01497, pid=10188, tid=10189
#
# JRE version: OpenJDK Runtime Environment (17.0.3+7) (build 17.0.3+7-Ubuntu-0ubuntu0.22.04.1)
# Java VM: OpenJDK 64-Bit Server VM (17.0.3+7-Ubuntu-0ubuntu0.22.04.1, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# C [libX11.so.6+0x41497] _XSend+0x37
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport %p %s %c %d %P %E" (or dumping to /home/ubuntu/Downloads/core.10188)
#
# An error report file with more information is saved as:
# /home/ubuntu/Downloads/hs_err_pid10188.log
[thread 10345 also had an error]
#
# If you would like to submit a bug report, please visit:
# Unknown
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
```
**Expected behavior**
No crash, and displays working 🙃
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: Ubuntu 22.04
- GAMA version: 1.8.2
- Java version: 17
- Graphics cards / Display system: VM system / Wayland
**Additional context**
There's a workaround by setting the environment variable back to X11 (i.e. forcing to use this backend) : `GDK_BACKEND=x11`
This way, launching GAMA like this : `GDK_BACKEND=x11 ./Gama` doesn't have any of the issue listed above
|
non_process
|
no display on wayland backend like on ubuntu describe the bug the new release of ubuntu did switch default window manager from to wayland unfortunatly it completly change every low level graphical logic and displays don t work it have huge conflict with opengl making the full software to crash to reproduce steps to reproduce the behavior from fresh ubuntu install gama with any jdk open gama start an experiment with an opengl display see instant gama crash here s gathered logs opt gama platform gama reading flag use global preference store with value true reading flag read only with value false reading flag use old tabs with value true reading flag use legacy drawers with value false reading flag use delayed resize with value false system property swt autoscale null system property sun uiscale enabled false changed to false gama version loading on linux generic jdk jai imageio extensions loaded for jpg tiff bmp gif arx png ppm tif tif asc tiff btf btf pgm wbmp jpeg pbm gama msi gama core loaded in gama simtools gaml extensions traffic loaded in gama irit gaml extensions database loaded in gama msi gaml extensions fipa loaded in gama msi gaml architecture simplebdi loaded in gama ummisco gaml extensions maths loaded in gama ummisco gaml extensions stats loaded in gama miat gaml extension pedestrian loaded in gama msi gama lang gaml loaded in gama ummisco gama network loaded in gama simtools gaml extensions physics loaded in gama ummisco gama opengl loaded in gama ummisco gama loaded in gama ummisco gama serialize loaded in gama msi gama headless loaded in gama espacedev gaml extensions genstar loaded in gama all plugins loaded in value of clearworkspace pref false waylandcompositor requires eglbindwaylanddisplaywl eglunbindwaylanddisplaywl and eglquerywaylandbuffer nested wayland compositor could not initialize egl part opened interactive console arguments received by gama gama gaml artefacts built in gama preferences loaded in part opened parameters part opened 
console part opened parameters loading redefined swtaccessor a fatal error has been detected by the java runtime environment sigsegv at pc pid tid jre version openjdk runtime environment build ubuntu java vm openjdk bit server vm ubuntu mixed mode sharing tiered compressed oops compressed class ptrs gc linux problematic frame c xsend core dump will be written default location core dumps may be processed with usr share apport apport p s c d p e or dumping to home ubuntu downloads core an error report file with more information is saved as home ubuntu downloads hs err log if you would like to submit a bug report please visit unknown the crash happened outside the java virtual machine in native code see problematic frame for where to report the bug expected behavior not crash and displays working 🙃 screenshots if applicable add screenshots to help explain your problem desktop please complete the following information os ubuntu gama version java version graphics cards display system vm system wayland additional context there s a workaround by setting the environment variable back to i e forcing to use this backend gdk backend this way launching gama like this gdk backend gama doesn t have any of the issue listed above
| 0
|
329,652
| 24,230,518,737
|
IssuesEvent
|
2022-09-26 17:50:55
|
sjteresi/TE_Density
|
https://api.github.com/repos/sjteresi/TE_Density
|
opened
|
Pytables Pickle warning for gene data
|
documentation
|
Copied from PR #117
I ran the Arabidopsis genome set (the same one you did above) and got the following warning:
```/mnt/ufs18/rs-004/edgerpat_lab/Scotty/TE_Density/transposon/gene_data.py:112: PerformanceWarning: your performance may suffer as PyTables will pickle object types that it cannot map directly to c-types [inferred_type->mixed,key->block0_values] [items->Index(['Chromosome', 'Feature', 'Start', 'Stop', 'Strand', 'Length', 'Genome_ID'],dtype='object')]```
Basically the warning comes from `import_filtered_genes.py` and how it reads in some of the above columns. For the things that are essentially strings, it reads them as the `object` dtype in pandas, and apparently pytables/h5py doesn't like that when writing the h5 files during gene_data.write(). I got the above warning for each chromosome of data. So I tried making the import filtered genes code even more explicit and had it read the string columns with [pd.StringDtype()](https://pandas.pydata.org/docs/user_guide/text.html), and that caused the code to crash with: TypeError: objects of type ``StringArray`` are not supported in this context, sorry; supported objects are: NumPy array, record or scalar; homogeneous list or tuple, integer, float, complex or bytes. I am not quite sure how to fix this; my intuition tells me that the solution must have something to do with how we declare the data types in the pandas.DataFrame before we try to write to hdf5.
TLDR: Pytables gives a performance warning when writing the cleaned gene data to HDF5 because it doesn't like the pandas `object` dtype. I tried setting alternative string dtypes in the pandas dataframe and that didn't work either. This may not be a big issue because it is merely a performance warning and we are only doing this once for each chromosome.
|
1.0
|
Pytables Pickle warning for gene data - Copied from PR #117
I ran the Arabidopsis genome set (the same one you did above) and got the following warning:
```/mnt/ufs18/rs-004/edgerpat_lab/Scotty/TE_Density/transposon/gene_data.py:112: PerformanceWarning: your performance may suffer as PyTables will pickle object types that it cannot map directly to c-types [inferred_type->mixed,key->block0_values] [items->Index(['Chromosome', 'Feature', 'Start', 'Stop', 'Strand', 'Length', 'Genome_ID'],dtype='object')]```
Basically the warning comes from `import_filtered_genes.py` and how it reads in some of the above columns. For the things that are essentially strings, it reads them as the `object` dtype in pandas, and apparently pytables/h5py doesn't like that when writing the h5 files during gene_data.write(). I got the above warning for each chromosome of data. So I tried making the import filtered genes code even more explicit and had it read the string columns with [pd.StringDtype()](https://pandas.pydata.org/docs/user_guide/text.html), and that caused the code to crash with: TypeError: objects of type ``StringArray`` are not supported in this context, sorry; supported objects are: NumPy array, record or scalar; homogeneous list or tuple, integer, float, complex or bytes. I am not quite sure how to fix this; my intuition tells me that the solution must have something to do with how we declare the data types in the pandas.DataFrame before we try to write to hdf5.
TLDR: Pytables gives a performance warning when writing the cleaned gene data to HDF5 because it doesn't like the pandas `object` dtype. I tried setting alternative string dtypes in the pandas dataframe and that didn't work either. This may not be a big issue because it is merely a performance warning and we are only doing this once for each chromosome.
|
non_process
|
pytables pickle warning for gene data copied from pr i ran the arabidopsis genome set the same one you did above and got the following warning mnt rs edgerpat lab scotty te density transposon gene data py performancewarning your performance may suffer as pytables will pickle object types that it cannot map directly to c types dtype object basically the warning comes from import filtered genes py and how it reads in some of the above columns for the tings that are essentially strings it reads them as the object dtype in pandas and apparently pytables doesn t like that for writing the files during gene data write i got the above warning for each chromosome of data so i tried making the import filtered genes code even more explicit and have it read the string columns with and that caused the code to crash with typeerror objects of type stringarray are not supported in this context sorry supported objects are numpy array record or scalar homogeneous list or tuple integer float complex or bytes i am not quite sure how to fix this my intuition tells me that the solution must have something to do with how we declare the data types in the pandas dataframe before we try to write to tldr pytables gives a performance warning when writing the cleaned gene data to because it doesn t like the pandas object dtype i tried setting alternative string dtypes in the pandas dataframe and that didn t work either this may not be a big issue because it is merely a performance warning and we are only doing this once for each chromosome
| 0
|
406,184
| 27,555,002,288
|
IssuesEvent
|
2023-03-07 17:16:30
|
cweagans/composer-patches
|
https://api.github.com/repos/cweagans/composer-patches
|
closed
|
New documentation website is missing the ignore-patches
|
documentation
|
### Verification
- [X] I have searched existing issues _and_ discussions for my problem.
### Which file do you have feedback about? (leave blank if not applicable)
_No response_
### Feedback
I was searching for the documentation about ignoring patches but was presented with the great-looking new docs: https://docs.cweagans.net/composer-patches/
I searched for the ignoring patches documentation, but this seems to be missing? I could find it eventually by looking into the history of `README.md`:
https://github.com/cweagans/composer-patches/blob/267a32b8105464686cdade12c34d32c54c77baa8/README.md#ignoring-patches
|
1.0
|
New documentation website is missing the ignore-patches - ### Verification
- [X] I have searched existing issues _and_ discussions for my problem.
### Which file do you have feedback about? (leave blank if not applicable)
_No response_
### Feedback
I was searching for the documentation about ignoring patches but was presented with the great-looking new docs: https://docs.cweagans.net/composer-patches/
I searched for the ignoring patches documentation, but this seems to be missing? I could find it eventually by looking into the history of `README.md`:
https://github.com/cweagans/composer-patches/blob/267a32b8105464686cdade12c34d32c54c77baa8/README.md#ignoring-patches
|
non_process
|
new documentation website is missing the ignore patches verification i have searched existing issues and discussions for my problem which file do you have feedback about leave blank if not applicable no response feedback i was searching for the documentation about ignoring patches but was presented with the new and great looking new docs i searched for the ignoring patches documentation but this seems to be missing i could find it eventually by looking into the history of readme md
| 0
|
18,753
| 24,656,578,554
|
IssuesEvent
|
2022-10-18 00:30:20
|
openxla/stablehlo
|
https://api.github.com/repos/openxla/stablehlo
|
opened
|
Automatically generate status.md
|
Process
|
As discussed in #331, it would be useful to be able to automatically generate status.md to avoid it diverging from the actual status of StableHLO. The comments on the linked pull request summarize the challenges and propose some ideas for how to go about this.
|
1.0
|
Automatically generate status.md - As discussed in #331, it would be useful to be able to automatically generate status.md to avoid it diverging from the actual status of StableHLO. The comments on the linked pull request summarize the challenges and propose some ideas for how to go about this.
|
process
|
automatically generate status md as discussed in it would be useful to be able to automatically generate status md to avoid it diverging from the actual status of stablehlo the comments on the linked pull request summarize the challenges and propose some ideas for how to go about this
| 1
|
286,361
| 31,561,149,682
|
IssuesEvent
|
2023-09-03 09:14:51
|
SovereignCloudStack/issues
|
https://api.github.com/repos/SovereignCloudStack/issues
|
opened
|
IaaS Internal Pentesting (Gray-Box)
|
IaaS security
|
Assessing the security of internal resources and components hosted within a SCS testbed deployment. This will be done from a Gray-box perspective, which involves a mix of knowledge about the environment, similar to internal personnel, combined with external information that an attacker might gather.
(Related to #391)
## Tasks
- [ ] Planning and Preparation
- [ ] Reconnaissance & Information Gathering
- [ ] Enumeration & Service Discovery
- [ ] Vulnerability scanning and Analysis
- [ ] Exploitation
- [ ] Privilege Escalation
- [ ] Lateral Movement
- [ ] Data and Resources exposure
- [ ] Enumeration with Low privileged users/roles
- [ ] Enumeration with High privileged users/roles
- [ ] Documentation & Reporting
|
True
|
IaaS Internal Pentesting (Gray-Box) - Assessing the security of internal resources and components hosted within a SCS testbed deployment. This will be done from a Gray-box perspective, which involves a mix of knowledge about the environment, similar to internal personnel, combined with external information that an attacker might gather.
(Related to #391)
## Tasks
- [ ] Planning and Preparation
- [ ] Reconnaissance & Information Gathering
- [ ] Enumeration & Service Discovery
- [ ] Vulnerability scanning and Analysis
- [ ] Exploitation
- [ ] Privilege Escalation
- [ ] Lateral Movement
- [ ] Data and Resources exposure
- [ ] Enumeration with Low privileged users/roles
- [ ] Enumeration with High privileged users/roles
- [ ] Documentation & Reporting
|
non_process
|
iaas internal pentesting gray box assessing the security of internal resources and components hosted within a scs testbed deployment this will be done from a gray box perspective which involves a mix of knowledge about the environment similar to internal personnel combined with external information that an attacker might gather related to tasks planning and preparation reconnaissance information gathering enumeration service discovery vulnerability scanning and analysis exploitation privilege escalation lateral movement data and resources exposure enumeration with low privileged users roles enumeration with high privileged users roles documentation reporting
| 0
|
387,719
| 26,734,747,862
|
IssuesEvent
|
2023-01-30 08:35:53
|
mobxjs/mobx
|
https://api.github.com/repos/mobxjs/mobx
|
closed
|
observable.set triggers a refresh whenever any value in the set changes, even if that value was not accessed in the observer
|
❔ question 📖 documentation
|
**Intended outcome:**
I am using observable.set with many often-changing values. I also have many small observers observing the presence of single values in that set.
I would expect each observer to only refresh if the value it is observing is changing, e.g. if `observable.has("mainValue")` switches from false to true or vice versa.
**Actual outcome:**
The observer that uses `observable.has("mainValue")` is called any time *any* value in the set is added or deleted, e.g. if I call `observable.add("otherValue")` or `observable.delete("otherValue")`.
**How to reproduce the issue:**
Minimal example showing the problem:
https://codesandbox.io/s/mobx-observable-set-rerk61?file=/index.js
Minimal example showing that this isn't happening for observable.maps (which is my current workaround):
https://codesandbox.io/s/mobx-observable-map-bhhgxi?file=/index.js
**Versions**
- mobx 6.5.0
|
1.0
|
observable.set triggers a refresh whenever any value in the set changes, even if that value was not accessed in the observer - **Intended outcome:**
I am using observable.set with many often-changing values. I also have many small observers observing the presence of single values in that set.
I would expect each observer to only refresh if the value it is observing is changing, e.g. if `observable.has("mainValue")` switches from false to true or vice versa.
**Actual outcome:**
The observer that uses `observable.has("mainValue")` is called any time *any* value in the set is added or deleted, e.g. if I call `observable.add("otherValue")` or `observable.delete("otherValue")`.
**How to reproduce the issue:**
Minimal example showing the problem:
https://codesandbox.io/s/mobx-observable-set-rerk61?file=/index.js
Minimal example showing that this isn't happening for observable.maps (which is my current workaround):
https://codesandbox.io/s/mobx-observable-map-bhhgxi?file=/index.js
**Versions**
- mobx 6.5.0
|
non_process
|
observable set triggers a refresh whenever any value in the set changes even if that value was not accessed in the observer intended outcome i am using observable set with many often changing values i also have many small observers observing the presence of single values in that set i would expect each observer to only refresh if the value it is observing is changing e g if observable has mainvalue switches from false to true or vice versa actual outcome the observer that uses observable has mainvalue is called any time any value in the set is added or deleted e g if i call observable add othervalue or observable delete othervalue how to reproduce the issue minimal example showing the problem minimal example showing that this isn t happening for observable maps which is my current workaround versions mobx
| 0
|
34,449
| 6,334,903,979
|
IssuesEvent
|
2017-07-26 17:41:20
|
sinonjs/sinon
|
https://api.github.com/repos/sinonjs/sinon
|
closed
|
Documentation: useFakeTimers options
|
Documentation Help wanted Needs investigation
|
> the documentation for useFakeTimers says that you can list functions to fake but doesn't explain what that would mean if you do vs don't
Improvements should be made to https://github.com/sinonjs/sinon/blob/master/docs/release-source/release/fake-timers.md, and all the versions for 2.x.
|
1.0
|
Documentation: useFakeTimers options - > the documentation for useFakeTimers says that you can list functions to fake but doesn't explain what that would mean if you do vs don't
Improvements should be made to https://github.com/sinonjs/sinon/blob/master/docs/release-source/release/fake-timers.md, and all the versions for 2.x.
|
non_process
|
documentation usefaketimers options the documentation for usefaketimers says that you can list functions to fake but doesn t explain what that would mean if you do vs don t improvements should be made to and all the versions for x
| 0
|
36,979
| 5,097,282,582
|
IssuesEvent
|
2017-01-03 20:59:20
|
hashcat/hashcat
|
https://api.github.com/repos/hashcat/hashcat
|
closed
|
hashcat benchmark on windows 10 results in fopen() error
|
needs testing
|
I've just downloaded the hashcat binary and tried to run the benchmark which results in an error:
ERROR: inc_cipher_256aes.cl: fopen(): No such file or directory
The file it speaks of is in the same directory as the binary and I get the same error using both 64 and 32-bit versions. It also results in the same error if I run the example0 with a known hash file.
|
1.0
|
hashcat benchmark on windows 10 results in fopen() error - I've just downloaded the hashcat binary and tried to run the benchmark which results in an error:
ERROR: inc_cipher_256aes.cl: fopen(): No such file or directory
The file it speaks of is in the same directory as the binary and I get the same error using both 64 and 32-bit versions. It also results in the same error if I run the example0 with a known hash file.
|
non_process
|
hashcat benchmark on windows results in fopen error i ve just downloaded the hashcat binary and tried to run the benchmark which results in an error error inc cipher cl fopen no such file or directory the file it speaks of is in the same directory as the binary and i get the same error using both and bit versions it also results in the same error if i run the with a known hash file
| 0
|
64,418
| 15,882,707,571
|
IssuesEvent
|
2021-04-09 16:22:44
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[SB] Questionnaires/Active task >'Anchor date based' radio button is disabled in the following scenario
|
Bug P1 Process: Fixed Process: Tested QA Process: Tested dev Study builder
|
Steps
1. Create/edit study
2. Study settings page > Select 'Yes' under 'Use participant enrollment date as anchor date?' section
3. Click on the Mark as completed button
4. Go to the Study information section and change the study ID and complete the section
5. Go to Questionnaire/Active tasks
6. Switch the schedule tab and try to select the anchor date based radio button
AR : 'Anchor date based' radio button is disabled
ER : ' Anchor date based' radio button should be enabled

|
1.0
|
[SB] Questionnaires/Active task >'Anchor date based' radio button is disabled in the following scenario - Steps
1. Create/edit study
2. Study settings page > Select 'Yes' under 'Use participant enrollment date as anchor date?' section
3. Click on the Mark as completed button
4. Go to the Study information section and change the study ID and complete the section
5. Go to Questionnaire/Active tasks
6. Switch the schedule tab and try to select the anchor date based radio button
AR : 'Anchor date based' radio button is disabled
ER : ' Anchor date based' radio button should be enabled

|
non_process
|
questionnaires active task anchor date based radio button is disabled in the following scenario steps create edit study study settings page select yes under use participant enrollment date as anchor date section click on the mark as completed button go to the study information section and change the study id and complete the section go to questionnaire active tasks switch the schedule tab and try to select the anchor date based radio button ar anchor date based radio button is disabled er anchor date based radio button should be enabled
| 0
|
6,745
| 9,873,262,882
|
IssuesEvent
|
2019-06-22 12:54:47
|
hyj378/EmojiRecommand
|
https://api.github.com/repos/hyj378/EmojiRecommand
|
closed
|
Data preprocessing
|
Preprocessing
|
https://colab.research.google.com/drive/13eF9lBVJD0b1pzJgMSzMjVmI5uZom1iX
This is the data preprocessing code.
original_csv is the data you crawled
preprocessed is the name to save the output with special characters removed
eng_ver is the name to save the translated output
Just fill those in~~~
Note that it takes a really long time, so it's best to leave it running while you sleep... haha
|
1.0
|
Data preprocessing - https://colab.research.google.com/drive/13eF9lBVJD0b1pzJgMSzMjVmI5uZom1iX
This is the data preprocessing code.
original_csv is the data you crawled
preprocessed is the name to save the output with special characters removed
eng_ver is the name to save the translated output
Just fill those in~~~
Note that it takes a really long time, so it's best to leave it running while you sleep... haha
|
process
|
데이터 전처리 데이터 전처리 코드입니다 original csv 에는 여러분이 크롤링한 자료 preprocessed 특수문자 뺀 것 저장할 이름 eng ver 번역한 것 저장할 이름 써주시면 됩니다 참고로 완전 오래걸리니까 켜놓고 주무시는게 좋을 것 같아요 ㅎㅎ
| 1
|
10,468
| 13,245,872,735
|
IssuesEvent
|
2020-08-19 14:57:01
|
LibraryOfCongress/concordia
|
https://api.github.com/repos/LibraryOfCongress/concordia
|
closed
|
"Find another page" in review track should go to a review page
|
CM WIP review process user experience
|
At the moment when I click "find a new page" while I'm reviewing, it takes me to a page needing transcription. We want to keep people in the review stream if that's what they've chosen to work on.
|
1.0
|
"Find another page" in review track should go to a review page - At the moment when I click "find a new page" while I'm reviewing, it takes me to a page needing transcription. We want to keep people in the review stream if that's what they've chosen to work on.
|
process
|
find another page in review track should go to a review page at the moment when i click find a new page while i m reviewing it takes me to a page needing transcription we want to keep people in the review stream if that s what they ve chosen to work on
| 1
|
60,460
| 3,129,471,528
|
IssuesEvent
|
2015-09-09 01:30:34
|
girlcodeakl/girlcodeapp
|
https://api.github.com/repos/girlcodeakl/girlcodeapp
|
closed
|
Make the log-in system only log you in if you have an account
|
medium Priority 2 wontfix
|
This ticket is for after #11 and #12 are done.
- [ ] On the server (index.js), make it so you can only log in if you are logging into an account that actually exists!
- [ ] if the account doesn't exist, you don't get a cookie
- [ ] Send some kind of bad response using the 'res' object.
- [ ] in login.html, display the server's response. (todo: how?)
|
1.0
|
Make the log-in system only log you in if you have an account - This ticket is for after #11 and #12 are done.
- [ ] On the server (index.js), make it so you can only log in if you are logging into an account that actually exists!
- [ ] if the account doesn't exist, you don't get a cookie
- [ ] Send some kind of bad response using the 'res' object.
- [ ] in login.html, display the server's response. (todo: how?)
|
non_process
|
make the log in system only log you in if you have an account this ticket is for after and are done on the server index js make it so you can only log in if you are logging into an account that actually exists if the account doesn t exist you don t get a cookie send some kind of bad response using the res object in login html display the server s response todo how
| 0
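The login checklist in the record above (reject unknown accounts, no cookie on failure, send a bad response) can be sketched as a pure login check kept separate from the HTTP layer. The `users` store shape and field names here are illustrative assumptions, not taken from the ticket:

```javascript
// Pure login check: returns what the HTTP handler should do.
// `users` is a hypothetical { username: password } store.
function checkLogin(users, username, password) {
  if (!(username in users)) {
    // account does not exist: no cookie, signal a bad response
    return { status: 401, error: 'no such account' };
  }
  if (users[username] !== password) {
    return { status: 401, error: 'wrong password' };
  }
  // success: the caller would set this cookie on the response
  return { status: 200, cookie: 'session=' + username };
}
```

An Express route in index.js would translate the returned status into `res.status(...).send(...)` and set the cookie only on success, which login.html can then display.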
|
9,145
| 12,203,195,906
|
IssuesEvent
|
2020-04-30 10:11:11
|
MHRA/products
|
https://api.github.com/repos/MHRA/products
|
closed
|
UAT script
|
EPIC - Auto Batch Process :oncoming_automobile:
|
## User want
As a business stakeholder
I want to ensure that the Stellent batch processing replacement fulfills the business requirements
So that I can sign off the business case
## Acceptance Criteria
A document that captures tests that will confirm that the business requirements have been met
### Customer acceptance criteria
### Technical acceptance criteria
### Data acceptance criteria
### Testing acceptance criteria
## Data - Potential impact
**Size**
M
**Value**
**Effort**
### Exit Criteria met
- [ ] Backlog
- [ ] Discovery
- [ ] DUXD
- [ ] Development
- [ ] Quality Assurance
- [ ] Release and Validate
|
1.0
|
UAT script - ## User want
As a business stakeholder
I want to ensure that the Stellent batch processing replacement fulfills the business requirements
So that I can sign off the business case
## Acceptance Criteria
A document that captures tests that will confirm that the business requirements have been met
### Customer acceptance criteria
### Technical acceptance criteria
### Data acceptance criteria
### Testing acceptance criteria
## Data - Potential impact
**Size**
M
**Value**
**Effort**
### Exit Criteria met
- [ ] Backlog
- [ ] Discovery
- [ ] DUXD
- [ ] Development
- [ ] Quality Assurance
- [ ] Release and Validate
|
process
|
uat script user want as a business stakeholder i want to ensure that the stellent batch processing replacement fulfills the business requirements so that i can sign off the business case acceptance criteria a document that captures tests that will confirm that the business requirements have been met customer acceptance criteria technical acceptance criteria data acceptance criteria testing acceptance criteria data potential impact size m value effort exit criteria met backlog discovery duxd development quality assurance release and validate
| 1
|
656
| 3,126,521,343
|
IssuesEvent
|
2015-09-08 09:47:51
|
robotology/yarp
|
https://api.github.com/repos/robotology/yarp
|
reopened
|
look again at ebottle?
|
Component: YARP_OS Type: Process
|
Some years back, Danilo Tardioli wrote a reimplementation of the Bottle class, with efficiency in mind. It couldn't be integrated with YARP at the time due to its use of STL header files that we were avoiding. These days that is no longer an issue, so it might be worth revisiting this. Links:
* http://robots.unizar.es/new/data/documentos/eBottlehtml/index.html
* https://github.com/hauptmech/yarp-ebottle, a tweaked mirror by @hauptmech
|
1.0
|
look again at ebottle? - Some years back, Danilo Tardioli wrote a reimplementation of the Bottle class, with efficiency in mind. It couldn't be integrated with YARP at the time due to its use of STL header files that we were avoiding. These days that is no longer an issue, so it might be worth revisiting this. Links:
* http://robots.unizar.es/new/data/documentos/eBottlehtml/index.html
* https://github.com/hauptmech/yarp-ebottle, a tweaked mirror by @hauptmech
|
process
|
look again at ebottle some years back danilo tardioli wrote a reimplementation of the bottle class with efficiency in mind it couldn t be integrated with yarp at the time due to its use of stl header files that we were avoiding these days that is no longer an issue so it might be worth revisiting this links a tweaked mirror by hauptmech
| 1
|
19,004
| 6,664,062,520
|
IssuesEvent
|
2017-10-02 18:43:14
|
habitat-sh/habitat
|
https://api.github.com/repos/habitat-sh/habitat
|
closed
|
Integrate new docker publish into builder-worker
|
A-builder C-feature
|
Now that we've finished the new `hab pkg export docker` command we'll need to wire this into the Builder-Worker. Creds will need to be recorded on the origin and assigned to the project through the public API for them to be made available to the worker.
Related: #3259, #3109
|
1.0
|
Integrate new docker publish into builder-worker - Now that we've finished the new `hab pkg export docker` command we'll need to wire this into the Builder-Worker. Creds will need to be recorded on the origin and assigned to the project through the public API for them to be made available to the worker.
Related: #3259, #3109
|
non_process
|
integrate new docker publish into builder worker now that we ve finished the new hab pkg export docker command we ll need to wire this into the builder worker creds will need to be recorded on the origin and assigned to the project through the public api for them to be made available to the worker related
| 0
|
275,715
| 30,288,342,094
|
IssuesEvent
|
2023-07-09 00:43:12
|
temporalio/sdk-typescript
|
https://api.github.com/repos/temporalio/sdk-typescript
|
closed
|
nyc-test-coverage-1.8.0.tgz: 1 vulnerabilities (highest severity is: 5.3) - autoclosed
|
Mend: dependency security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nyc-test-coverage-1.8.0.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /packages/test/package.json</p>
<p>Path to vulnerable library: /node_modules/semver/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/temporalio/sdk-typescript/commit/6ccfd51b58a061c02dd9483fa4136cda6f65c25d">6ccfd51b58a061c02dd9483fa4136cda6f65c25d</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (nyc-test-coverage version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-25883](https://www.mend.io/vulnerability-database/CVE-2022-25883) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Medium | 5.3 | semver-7.5.0.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> CVE-2022-25883</summary>
### Vulnerable Library - <b>semver-7.5.0.tgz</b></p>
<p></p>
<p>Library home page: <a href="https://registry.npmjs.org/semver/-/semver-7.5.0.tgz">https://registry.npmjs.org/semver/-/semver-7.5.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/semver/package.json</p>
<p>
Dependency Hierarchy:
- nyc-test-coverage-1.8.0.tgz (Root Library)
- ts-loader-9.4.2.tgz
- :x: **semver-7.5.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/temporalio/sdk-typescript/commit/6ccfd51b58a061c02dd9483fa4136cda6f65c25d">6ccfd51b58a061c02dd9483fa4136cda6f65c25d</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Versions of the package semver before 7.5.2 are vulnerable to Regular Expression Denial of Service (ReDoS) via the function new Range, when untrusted user data is provided as a range.
<p>Publish Date: 2023-06-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25883>CVE-2022-25883</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2023-06-21</p>
<p>Fix Resolution: semver - 7.5.2</p>
</p>
<p></p>
</details>
|
True
|
nyc-test-coverage-1.8.0.tgz: 1 vulnerabilities (highest severity is: 5.3) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nyc-test-coverage-1.8.0.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /packages/test/package.json</p>
<p>Path to vulnerable library: /node_modules/semver/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/temporalio/sdk-typescript/commit/6ccfd51b58a061c02dd9483fa4136cda6f65c25d">6ccfd51b58a061c02dd9483fa4136cda6f65c25d</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (nyc-test-coverage version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-25883](https://www.mend.io/vulnerability-database/CVE-2022-25883) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Medium | 5.3 | semver-7.5.0.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> CVE-2022-25883</summary>
### Vulnerable Library - <b>semver-7.5.0.tgz</b></p>
<p></p>
<p>Library home page: <a href="https://registry.npmjs.org/semver/-/semver-7.5.0.tgz">https://registry.npmjs.org/semver/-/semver-7.5.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/semver/package.json</p>
<p>
Dependency Hierarchy:
- nyc-test-coverage-1.8.0.tgz (Root Library)
- ts-loader-9.4.2.tgz
- :x: **semver-7.5.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/temporalio/sdk-typescript/commit/6ccfd51b58a061c02dd9483fa4136cda6f65c25d">6ccfd51b58a061c02dd9483fa4136cda6f65c25d</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Versions of the package semver before 7.5.2 are vulnerable to Regular Expression Denial of Service (ReDoS) via the function new Range, when untrusted user data is provided as a range.
<p>Publish Date: 2023-06-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25883>CVE-2022-25883</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2023-06-21</p>
<p>Fix Resolution: semver - 7.5.2</p>
</p>
<p></p>
</details>
|
non_process
|
nyc test coverage tgz vulnerabilities highest severity is autoclosed vulnerable library nyc test coverage tgz path to dependency file packages test package json path to vulnerable library node modules semver package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in nyc test coverage version remediation available medium semver tgz transitive n a for some transitive vulnerabilities there is no version of direct dependency with a fix check the details section below to see if there is a version of transitive dependency where vulnerability is fixed details cve vulnerable library semver tgz library home page a href path to dependency file package json path to vulnerable library node modules semver package json dependency hierarchy nyc test coverage tgz root library ts loader tgz x semver tgz vulnerable library found in head commit a href found in base branch main vulnerability details versions of the package semver before are vulnerable to regular expression denial of service redos via the function new range when untrusted user data is provided as a range publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version release date fix resolution semver
| 0
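When, as the report above notes, no fixed version of the direct dependency exists, one common remediation is forcing the transitive version with an npm `overrides` entry (supported in npm 8+). The snippet below is a generic sketch, not taken from the repository's actual package.json:

```json
{
  "overrides": {
    "semver": "^7.5.2"
  }
}
```

Yarn users would use the analogous `resolutions` field instead.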
|
20,281
| 26,912,279,894
|
IssuesEvent
|
2023-02-07 01:27:55
|
googleapis/google-cloud-node
|
https://api.github.com/repos/googleapis/google-cloud-node
|
closed
|
GA release of Nodejs Analytics Data
|
type: process api: analyticsdata
|
Package name: @google-analytics/data
Current release: beta
Proposed release: GA
Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
Required
28 days elapsed since last beta release with new API surface
Server API is GA
Package API is stable, and we can commit to backward compatibility
All dependencies are GA
Optional
Most common / important scenarios have descriptive samples
Public manual methods have at least one usage sample each (excluding overloads)
Per-API README includes a full description of the API
Per-API README contains at least one “getting started” sample using the most common API scenario
Manual code has been reviewed by API producer
Manual code has been reviewed by a DPE responsible for samples
'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
1.0
|
GA release of Nodejs Analytics Data - Package name: @google-analytics/data
Current release: beta
Proposed release: GA
Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
Required
28 days elapsed since last beta release with new API surface
Server API is GA
Package API is stable, and we can commit to backward compatibility
All dependencies are GA
Optional
Most common / important scenarios have descriptive samples
Public manual methods have at least one usage sample each (excluding overloads)
Per-API README includes a full description of the API
Per-API README contains at least one “getting started” sample using the most common API scenario
Manual code has been reviewed by API producer
Manual code has been reviewed by a DPE responsible for samples
'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
process
|
ga release of nodejs analytics data package name google analytics data current release beta proposed release ga instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility all dependencies are ga optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one “getting started” sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
| 1
|
51,214
| 7,686,870,179
|
IssuesEvent
|
2018-05-17 01:55:38
|
pallets/click
|
https://api.github.com/repos/pallets/click
|
closed
|
Option name alias / custom option name mapping
|
documentation good first issue
|
Here's my code:
```
@click.command()
@click.option("--max")
def main(max):
# ...
```
This works, but the problem is that the `max` parameter shadows the built-in function. I would like to change the function parameter name, but I don't want to change the option name. Is there a way to change the mapping? (I read through the docs rather thoroughly and haven't found the answer.)
|
1.0
|
Option name alias / custom option name mapping - Here's my code:
```
@click.command()
@click.option("--max")
def main(max):
# ...
```
This works, but the problem is that the `max` parameter shadows the built-in function. I would like to change the function parameter name, but I don't want to change the option name. Is there a way to change the mapping? (I read through the docs rather thoroughly and haven't found the answer.)
|
non_process
|
option name alias custom option name mapping here s my code click command click option max def main max this works but the problem is that the max parameter shadows the built in function i would like to change the function parameter name but i don t want to change the option name is there a way to change the mapping i read through the docs rather thoroughly and haven t found the answer
| 0
|
19,633
| 25,995,132,378
|
IssuesEvent
|
2022-12-20 10:55:13
|
Graylog2/graylog2-server
|
https://api.github.com/repos/Graylog2/graylog2-server
|
closed
|
Changes to pipeline not applied but simulator works
|
processing bug to-verify
|
<!--- Provide a general summary of the issue in the Title above -->
## Expected Behavior
Changing the referenced grok expression should alter the result of my pipeline.
Even replacing `set_field("original_message", message_field);` with `set_field("event.original", message_field);` does not change the result.
## Current Behavior
Old version of pipeline and grok expression are used even after reboot.
But adding the debug statement works as expected.
## Steps to Reproduce (for bugs)
<!--- Provide a link to a live example, or an unambiguous set of steps to -->
<!--- reproduce this bug. Include code to reproduce, if relevant -->
I know reproducing a bug is essential but I currently lack the time to build a separate instance from scratch and start testing there. So this only "works" on my instance:
1. Setup environment like seen below
2. Create pipeline
3. Change pipeline (referenced grok)
## Context
I have logs from my firewall (Cisco ASA) coming to Graylog with an UDP Raw input. Messages based on source ip are rerouted to a dedicated stream (and index set). The stream has a pipeline for message processing attached. I already used a version of this pipeline and started to adjust fields to adhere to the Elastic Common Schema.
Message Processor Configuration is:
1. AWS Instance Name Lookup (disabled)
2. GeoIP Resolver (disabled)
3. Message Filter Chain
4. Pipeline Processor
## Your Environment
<!--- Include as many relevant details about the environment you experienced the bug in -->
* Graylog Version: Graylog 3.1.0+aa5175e
* Elasticsearch Version: 6.8.2
* MongoDB Version: 1:3.6.3-0ubuntu1.1
* Operating System: Ubuntu 18.04 Linux 4.15.0-58-generic
* Browser version: Firefox 68.0.2
|
1.0
|
Changes to pipeline not applied but simulator works - <!--- Provide a general summary of the issue in the Title above -->
## Expected Behavior
Changing the referenced grok expression should alter the result of my pipeline.
Even replacing `set_field("original_message", message_field);` with `set_field("event.original", message_field);` does not change the result.
## Current Behavior
Old version of pipeline and grok expression are used even after reboot.
But adding the debug statement works as expected.
## Steps to Reproduce (for bugs)
<!--- Provide a link to a live example, or an unambiguous set of steps to -->
<!--- reproduce this bug. Include code to reproduce, if relevant -->
I know reproducing a bug is essential but I currently lack the time to build a separate instance from scratch and start testing there. So this only "works" on my instance:
1. Setup environment like seen below
2. Create pipeline
3. Change pipeline (referenced grok)
## Context
I have logs from my firewall (Cisco ASA) coming to Graylog with an UDP Raw input. Messages based on source ip are rerouted to a dedicated stream (and index set). The stream has a pipeline for message processing attached. I already used a version of this pipeline and started to adjust fields to adhere to the Elastic Common Schema.
Message Processor Configuration is:
1. AWS Instance Name Lookup (disabled)
2. GeoIP Resolver (disabled)
3. Message Filter Chain
4. Pipeline Processor
## Your Environment
<!--- Include as many relevant details about the environment you experienced the bug in -->
* Graylog Version: Graylog 3.1.0+aa5175e
* Elasticsearch Version: 6.8.2
* MongoDB Version: 1:3.6.3-0ubuntu1.1
* Operating System: Ubuntu 18.04 Linux 4.15.0-58-generic
* Browser version: Firefox 68.0.2
|
process
|
changes to pipeline not applied but simulator works expected behavior changing the referenced grok expression should alter the result of my pipeline even replacing set field original message message field with set field event original message field does not change the result current behavior old version of pipeline and grok expression are used even after reboot but adding the debug statement works as expected steps to reproduce for bugs i know reproducing a bug is essential but i currently lack the time to build a separate instance from scratch and start testing there so this only works on my instance setup environment like seen below create pipeline change pipeline referenced grok context i have logs from my firewall cisco asa coming to graylog with an udp raw input messages based on source ip are rerouted to a dedicated stream and index set the stream has a pipeline for message processing attached i already used a version of this pipeline and started to adjust fields to adhere to the elastic common schema message processor configuration is aws instance name lookup disabled geoip resolver disabled message filter chain pipeline processor your environment graylog version graylog elasticsearch version mongodb version operating system ubuntu linux generic browser version firefox
| 1
|
5,912
| 8,735,532,909
|
IssuesEvent
|
2018-12-11 17:00:46
|
googleapis/google-cloud-node
|
https://api.github.com/repos/googleapis/google-cloud-node
|
closed
|
Update all samples to use async/await
|
help wanted type: process
|
Most of our samples are using promises for async programming, which is great. Now that Node 8 is the latest LTS, and 10 is about to go out - we should start converting all the samples to async/await. Lets use this as a high level tracking bug.
- [x] [googleapis/google-api-nodejs-client](https://github.com/googleapis/google-api-nodejs-client)
- [x] [googleapis/google-auth-library-nodejs](https://github.com/googleapis/google-auth-library-nodejs)
- [x] [googleapis/nodejs-spanner](https://github.com/googleapis/nodejs-spanner)
- [x] [googleapis/nodejs-firestore](https://github.com/googleapis/nodejs-firestore)
- [x] [googleapis/nodejs-compute](https://github.com/googleapis/nodejs-compute)
- [ ] [googleapis/nodejs-bigtable](https://github.com/googleapis/nodejs-bigtable)
- [x] [googleapis/nodejs-common-grpc](https://github.com/googleapis/nodejs-common-grpc)
- [x] [googleapis/nodejs-common](https://github.com/googleapis/nodejs-common)
- [x] [googleapis/nodejs-kms](https://github.com/googleapis/nodejs-kms)
- [x] [googleapis/nodejs-iot](https://github.com/googleapis/nodejs-iot)
- [x] [googleapis/nodejs-error-reporting](https://github.com/googleapis/nodejs-error-reporting)
- [x] [googleapis/nodejs-redis](https://github.com/googleapis/nodejs-redis)
- [x] [googleapis/nodejs-vision](https://github.com/googleapis/nodejs-vision)
- [x] [googleapis/gax-nodejs](https://github.com/googleapis/gax-nodejs)
- [x] [googleapis/nodejs-translate](https://github.com/googleapis/nodejs-translate)
- [x] [googleapis/nodejs-text-to-speech](https://github.com/googleapis/nodejs-text-to-speech)
- [x] [googleapis/nodejs-os-login](https://github.com/googleapis/nodejs-os-login)
- [x] [googleapis/nodejs-dataproc](https://github.com/googleapis/nodejs-dataproc)
- [x] [googleapis/nodejs-bigquery-data-transfer](https://github.com/googleapis/nodejs-bigquery-data-transfer)
- [x] [googleapis/nodejs-video-intelligence](https://github.com/googleapis/nodejs-video-intelligence)
- [x] [googleapis/nodejs-speech](https://github.com/googleapis/nodejs-speech)
- [x] [googleapis/nodejs-resource](https://github.com/googleapis/nodejs-resource)
- [x] [googleapis/nodejs-pubsub](https://github.com/googleapis/nodejs-pubsub)
- [x] [googleapis/nodejs-monitoring](https://github.com/googleapis/nodejs-monitoring)
- [x] [googleapis/nodejs-logging-winston](https://github.com/googleapis/nodejs-logging-winston)
- [x] [googleapis/nodejs-logging](https://github.com/googleapis/nodejs-logging)
- [x] [googleapis/nodejs-logging-bunyan](https://github.com/googleapis/nodejs-logging-bunyan)
- [x] [googleapis/nodejs-language](https://github.com/googleapis/nodejs-language)
- [x] [googleapis/nodejs-dns](https://github.com/googleapis/nodejs-dns)
- [x] [googleapis/nodejs-dlp](https://github.com/googleapis/nodejs-dlp)
- [ ] [googleapis/nodejs-datastore](https://github.com/googleapis/nodejs-datastore)
- [x] [googleapis/nodejs-bigquery](https://github.com/googleapis/nodejs-bigquery)
- [x] [googleapis/nodejs-cloud-container](https://github.com/googleapis/nodejs-cloud-container)
- [x] [googleapis/nodejs-proto-files](https://github.com/googleapis/nodejs-proto-files)
- [x] [googleapis/nodejs-storage](https://github.com/googleapis/nodejs-storage)
- [x] [googleapis/nodejs-automl](https://github.com/googleapis/nodejs-automl)
- [x] [googleapis/nodejs-promisify](https://github.com/googleapis/nodejs-promisify)
- [x] [googleapis/nodejs-projectify](https://github.com/googleapis/nodejs-projectify)
- [x] [googleapis/nodejs-paginator](https://github.com/googleapis/nodejs-paginator)
- [x] [googleapis/google-p12-pem](https://github.com/googleapis/google-p12-pem)
- [x] [googleapis/node-gtoken](https://github.com/googleapis/node-gtoken)
- [x] [googleapis/nodejs-tasks](https://github.com/googleapis/nodejs-tasks)
- [x] [googleapis/google-cloud-node](https://github.com/googleapis/google-cloud-node)
- [x] [googleapis/cloud-trace-nodejs](https://github.com/googleapis/cloud-trace-nodejs)
- [x] [googleapis/cloud-debug-nodejs](https://github.com/googleapis/cloud-debug-nodejs)
- [x] [googleapis/github-repo-automation](https://github.com/googleapis/github-repo-automation)
- [x] [GoogleCloudPlatform/cloud-profiler-nodejs](https://github.com/GoogleCloudPlatform/cloud-profiler-nodejs)
- [x] [googleapis/gcs-resumable-upload](https://github.com/googleapis/gcs-resumable-upload)
- [x] [googleapis/gcp-metadata](https://github.com/googleapis/gcp-metadata)
- [x] [googleapis/google-cloud-kvstore](https://github.com/googleapis/google-cloud-kvstore)
- [x] [googleapis/gce-images](https://github.com/googleapis/gce-images)
- [x] [googleapis/nodejs-googleapis-common](https://github.com/googleapis/nodejs-googleapis-common)
- [x] [googleapis/nodejs-asset](https://github.com/googleapis/nodejs-asset)
- [x] [googleapis/sloth](https://github.com/googleapis/sloth)
- [ ] [googleapis/nodejs-dialogflow](https://github.com/googleapis/nodejs-dialogflow)
@sduskis this is something that we should have our new friends help with :)
To call out the specific work:
- All samples should use `async` or `await` instead of promises or callbacks
- We can remove all of the catch clauses that just print the output to console.error (a throw will do that anyway)
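The conversion the two bullets describe can be sketched side by side; `client.getBuckets()` stands in for any promise-returning client call and is an assumption, not a specific googleapis API:

```javascript
// Before: a promise chain whose catch only logs — the pattern the samples drop.
function listBucketsOld(client) {
  return client.getBuckets()
    .then(([buckets]) => {
      buckets.forEach(b => console.log(b.name));
      return buckets;
    })
    .catch(err => console.error('ERROR:', err)); // redundant: a throw prints anyway
}

// After: async/await, no catch — a rejection simply propagates as a thrown error.
async function listBuckets(client) {
  const [buckets] = await client.getBuckets();
  buckets.forEach(b => console.log(b.name));
  return buckets;
}
```

Note the old version also swallows the rejection (resolving to `undefined`); the async version keeps the failure visible to the caller.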
|
1.0
|
Update all samples to use async/await - Most of our samples are using promises for async programming, which is great. Now that Node 8 is the latest LTS, and 10 is about to go out - we should start converting all the samples to async/await. Let's use this as a high level tracking bug.
- [x] [googleapis/google-api-nodejs-client](https://github.com/googleapis/google-api-nodejs-client)
- [x] [googleapis/google-auth-library-nodejs](https://github.com/googleapis/google-auth-library-nodejs)
- [x] [googleapis/nodejs-spanner](https://github.com/googleapis/nodejs-spanner)
- [x] [googleapis/nodejs-firestore](https://github.com/googleapis/nodejs-firestore)
- [x] [googleapis/nodejs-compute](https://github.com/googleapis/nodejs-compute)
- [ ] [googleapis/nodejs-bigtable](https://github.com/googleapis/nodejs-bigtable)
- [x] [googleapis/nodejs-common-grpc](https://github.com/googleapis/nodejs-common-grpc)
- [x] [googleapis/nodejs-common](https://github.com/googleapis/nodejs-common)
- [x] [googleapis/nodejs-kms](https://github.com/googleapis/nodejs-kms)
- [x] [googleapis/nodejs-iot](https://github.com/googleapis/nodejs-iot)
- [x] [googleapis/nodejs-error-reporting](https://github.com/googleapis/nodejs-error-reporting)
- [x] [googleapis/nodejs-redis](https://github.com/googleapis/nodejs-redis)
- [x] [googleapis/nodejs-vision](https://github.com/googleapis/nodejs-vision)
- [x] [googleapis/gax-nodejs](https://github.com/googleapis/gax-nodejs)
- [x] [googleapis/nodejs-translate](https://github.com/googleapis/nodejs-translate)
- [x] [googleapis/nodejs-text-to-speech](https://github.com/googleapis/nodejs-text-to-speech)
- [x] [googleapis/nodejs-os-login](https://github.com/googleapis/nodejs-os-login)
- [x] [googleapis/nodejs-dataproc](https://github.com/googleapis/nodejs-dataproc)
- [x] [googleapis/nodejs-bigquery-data-transfer](https://github.com/googleapis/nodejs-bigquery-data-transfer)
- [x] [googleapis/nodejs-video-intelligence](https://github.com/googleapis/nodejs-video-intelligence)
- [x] [googleapis/nodejs-speech](https://github.com/googleapis/nodejs-speech)
- [x] [googleapis/nodejs-resource](https://github.com/googleapis/nodejs-resource)
- [x] [googleapis/nodejs-pubsub](https://github.com/googleapis/nodejs-pubsub)
- [x] [googleapis/nodejs-monitoring](https://github.com/googleapis/nodejs-monitoring)
- [x] [googleapis/nodejs-logging-winston](https://github.com/googleapis/nodejs-logging-winston)
- [x] [googleapis/nodejs-logging](https://github.com/googleapis/nodejs-logging)
- [x] [googleapis/nodejs-logging-bunyan](https://github.com/googleapis/nodejs-logging-bunyan)
- [x] [googleapis/nodejs-language](https://github.com/googleapis/nodejs-language)
- [x] [googleapis/nodejs-dns](https://github.com/googleapis/nodejs-dns)
- [x] [googleapis/nodejs-dlp](https://github.com/googleapis/nodejs-dlp)
- [ ] [googleapis/nodejs-datastore](https://github.com/googleapis/nodejs-datastore)
- [x] [googleapis/nodejs-bigquery](https://github.com/googleapis/nodejs-bigquery)
- [x] [googleapis/nodejs-cloud-container](https://github.com/googleapis/nodejs-cloud-container)
- [x] [googleapis/nodejs-proto-files](https://github.com/googleapis/nodejs-proto-files)
- [x] [googleapis/nodejs-storage](https://github.com/googleapis/nodejs-storage)
- [x] [googleapis/nodejs-automl](https://github.com/googleapis/nodejs-automl)
- [x] [googleapis/nodejs-promisify](https://github.com/googleapis/nodejs-promisify)
- [x] [googleapis/nodejs-projectify](https://github.com/googleapis/nodejs-projectify)
- [x] [googleapis/nodejs-paginator](https://github.com/googleapis/nodejs-paginator)
- [x] [googleapis/google-p12-pem](https://github.com/googleapis/google-p12-pem)
- [x] [googleapis/node-gtoken](https://github.com/googleapis/node-gtoken)
- [x] [googleapis/nodejs-tasks](https://github.com/googleapis/nodejs-tasks)
- [x] [googleapis/google-cloud-node](https://github.com/googleapis/google-cloud-node)
- [x] [googleapis/cloud-trace-nodejs](https://github.com/googleapis/cloud-trace-nodejs)
- [x] [googleapis/cloud-debug-nodejs](https://github.com/googleapis/cloud-debug-nodejs)
- [x] [googleapis/github-repo-automation](https://github.com/googleapis/github-repo-automation)
- [x] [GoogleCloudPlatform/cloud-profiler-nodejs](https://github.com/GoogleCloudPlatform/cloud-profiler-nodejs)
- [x] [googleapis/gcs-resumable-upload](https://github.com/googleapis/gcs-resumable-upload)
- [x] [googleapis/gcp-metadata](https://github.com/googleapis/gcp-metadata)
- [x] [googleapis/google-cloud-kvstore](https://github.com/googleapis/google-cloud-kvstore)
- [x] [googleapis/gce-images](https://github.com/googleapis/gce-images)
- [x] [googleapis/nodejs-googleapis-common](https://github.com/googleapis/nodejs-googleapis-common)
- [x] [googleapis/nodejs-asset](https://github.com/googleapis/nodejs-asset)
- [x] [googleapis/sloth](https://github.com/googleapis/sloth)
- [ ] [googleapis/nodejs-dialogflow](https://github.com/googleapis/nodejs-dialogflow)
@sduskis this is something that we should have our new friends help with :)
To call out the specific work:
- All samples should use `async` or `await` instead of promises or callbacks
- We can remove all of the catch clauses that just print the output to console.error (a throw will do that anyway)
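As a rough sketch of the second bullet (the `main().catch(console.error)` wrapper shown in the comment is an assumed sample pattern, not a quote from any one repo):

```javascript
// Old sample boilerplate:
//   main().catch(err => console.error(err));
// That catch only mirrors what Node already prints for an unhandled rejection,
// and it hides the non-zero exit code. Dropping it is safe:
async function main() {
  const result = await Promise.resolve(42); // stand-in for a real API call
  console.log(result);
  return result;
}

main(); // a rejection here surfaces as an unhandled rejection and fails the process
```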
|
process
|
update all samples to use async await most of our samples are using promises for async programming which is great now that node is the latest lts and is about to go out we should start converting all the samples to async await lets use this as a high level tracking bug sduskis this is something that we should have our new friends help with to call out the specific work all samples should use async or await instead of promises or callbacks we can remove all of the catch clauses that just print the output to console error a throw will do that anyway
| 1
|
1,939
| 4,769,001,760
|
IssuesEvent
|
2016-10-26 11:01:53
|
nolanjian/Cawler
|
https://api.github.com/repos/nolanjian/Cawler
|
opened
|
TP0021 Interface Improvement
|
In Processing urgent
|
replace stl obj in interface with C mode or other, just for save and set treat warning as error
|
1.0
|
TP0021 Interface Improvement - replace stl obj in interface with C mode or other, just for save and set treat warning as error
|
process
|
interface improvement replace stl obj in interface with c mode or other just for save and set treat warning as error
| 1
|
13,695
| 16,451,149,848
|
IssuesEvent
|
2021-05-21 06:02:16
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[PM] Password help email template issue
|
Bug P2 Participant manager Process: Fixed
|
AR : Password help email template > Some other character is displaying instead of ' icon
ER : It should display Org Name's
[Note: Issue observed only in master branch]

|
1.0
|
[PM] Password help email template issue - AR : Password help email template > Some other character is displaying instead of ' icon
ER : It should display Org Name's
[Note: Issue observed only in master branch]

|
process
|
password help email template issue ar password help email template some other character is displaying instead of icon er it should display org name s
| 1
|
17,407
| 23,223,355,295
|
IssuesEvent
|
2022-08-02 20:34:39
|
pycaret/pycaret
|
https://api.github.com/repos/pycaret/pycaret
|
closed
|
[BUG]: clustering.setup does not accept 'iterative' in imputation_type
|
bug clustering preprocessing
|
### pycaret version checks
- [X] I have checked that this issue has not already been reported [here](https://github.com/pycaret/pycaret/issues).
- [X] I have confirmed this bug exists on the [latest version](https://github.com/pycaret/pycaret/releases) of pycaret.
- [ ] I have confirmed this bug exists on the master branch of pycaret (pip install -U git+https://github.com/pycaret/pycaret.git@master).
### Issue Description
clustering.setup() throws error when specifying imputation_type='iterative'
ValueError: Invalid value for the imputation_type parameter, got iterative. Possible values are: simple, iterative.
### Reproducible Example
```python
from pycaret.datasets import get_data
df = get_data('diabetes')
test = clustering.setup(
data=df,
imputation_type='iterative',
)
```
### Expected Behavior
should accept 'iterative' as value
### Actual Results
```python-traceback
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-6-6a1a7ff6b5f0> in <module>
2 df = get_data('diabetes')
3
----> 4 test = clustering.setup(
5 data=df,
6 imputation_type='iterative',
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/pycaret/clustering/functional.py in setup(data, ordinal_features, numeric_features, categorical_features, date_features, text_features, ignore_features, keep_features, preprocess, imputation_type, numeric_imputation, categorical_imputation, text_features_method, max_encoding_ohe, encoding_method, polynomial_features, polynomial_degree, low_variance_threshold, remove_multicollinearity, multicollinearity_threshold, bin_numeric_features, remove_outliers, outliers_method, outliers_threshold, transformation, transformation_method, normalize, normalize_method, pca, pca_method, pca_components, custom_pipeline, custom_pipeline_position, n_jobs, use_gpu, html, session_id, system_log, log_experiment, experiment_name, experiment_custom_tags, log_plots, log_profile, log_data, verbose, memory, profile, profile_kwargs)
385 exp = _EXPERIMENT_CLASS()
386 set_current_experiment(exp)
--> 387 return exp.setup(
388 data=data,
389 ordinal_features=ordinal_features,
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/pycaret/internal/pycaret_experiment/unsupervised_experiment.py in setup(self, data, ordinal_features, numeric_features, categorical_features, date_features, text_features, ignore_features, keep_features, preprocess, imputation_type, numeric_imputation, categorical_imputation, text_features_method, max_encoding_ohe, encoding_method, polynomial_features, polynomial_degree, low_variance_threshold, group_features, group_names, remove_multicollinearity, multicollinearity_threshold, bin_numeric_features, remove_outliers, outliers_method, outliers_threshold, transformation, transformation_method, normalize, normalize_method, pca, pca_method, pca_components, custom_pipeline, custom_pipeline_position, n_jobs, use_gpu, html, session_id, system_log, log_experiment, experiment_name, experiment_custom_tags, log_plots, log_profile, log_data, verbose, memory, profile, profile_kwargs)
532 self._simple_imputation(numeric_imputation, categorical_imputation)
533 elif imputation_type is not None:
--> 534 raise ValueError(
535 "Invalid value for the imputation_type parameter, got "
536 f"{imputation_type}. Possible values are: simple, iterative."
ValueError: Invalid value for the imputation_type parameter, got iterative. Possible values are: simple, iterative.
```
### Installed Versions
<details>
System:
python: 3.9.7 (v3.9.7:1016ef3790, Aug 30 2021, 16:39:15) [Clang 6.0 (clang-600.0.57)]
executable: /Library/Frameworks/Python.framework/Versions/3.9/bin/python3.9
machine: macOS-10.16-x86_64-i386-64bit
PyCaret required dependencies:
pip: 22.2.1
setuptools: 57.4.0
pycaret: 3.0.0.rc3
IPython: 7.21.0
ipywidgets: 8.0.0rc1
tqdm: 4.62.0
numpy: 1.21.6
pandas: 1.4.3
jinja2: 3.0.2
scipy: 1.8.1
joblib: 1.1.0
sklearn: 1.1.1
pyod: Installed but version unavailable
imblearn: 0.9.1
category_encoders: 2.5.0
lightgbm: 3.3.1
numba: 0.55.2
requests: 2.27.1
matplotlib: 3.5.2
scikitplot: 0.3.7
yellowbrick: 1.4
plotly: 5.6.0
kaleido: 0.2.1
statsmodels: 0.13.2
sktime: 0.11.4
tbats: Installed but version unavailable
pmdarima: 1.8.5
psutil: 5.9.1</details>
|
1.0
|
[BUG]: clustering.setup does not accept 'iterative' in imputation_type - ### pycaret version checks
- [X] I have checked that this issue has not already been reported [here](https://github.com/pycaret/pycaret/issues).
- [X] I have confirmed this bug exists on the [latest version](https://github.com/pycaret/pycaret/releases) of pycaret.
- [ ] I have confirmed this bug exists on the master branch of pycaret (pip install -U git+https://github.com/pycaret/pycaret.git@master).
### Issue Description
clustering.setup() throws error when specifying imputation_type='iterative'
ValueError: Invalid value for the imputation_type parameter, got iterative. Possible values are: simple, iterative.
### Reproducible Example
```python
from pycaret.datasets import get_data
df = get_data('diabetes')
test = clustering.setup(
data=df,
imputation_type='iterative',
)
```
### Expected Behavior
should accept 'iterative' as value
### Actual Results
```python-traceback
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-6-6a1a7ff6b5f0> in <module>
2 df = get_data('diabetes')
3
----> 4 test = clustering.setup(
5 data=df,
6 imputation_type='iterative',
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/pycaret/clustering/functional.py in setup(data, ordinal_features, numeric_features, categorical_features, date_features, text_features, ignore_features, keep_features, preprocess, imputation_type, numeric_imputation, categorical_imputation, text_features_method, max_encoding_ohe, encoding_method, polynomial_features, polynomial_degree, low_variance_threshold, remove_multicollinearity, multicollinearity_threshold, bin_numeric_features, remove_outliers, outliers_method, outliers_threshold, transformation, transformation_method, normalize, normalize_method, pca, pca_method, pca_components, custom_pipeline, custom_pipeline_position, n_jobs, use_gpu, html, session_id, system_log, log_experiment, experiment_name, experiment_custom_tags, log_plots, log_profile, log_data, verbose, memory, profile, profile_kwargs)
385 exp = _EXPERIMENT_CLASS()
386 set_current_experiment(exp)
--> 387 return exp.setup(
388 data=data,
389 ordinal_features=ordinal_features,
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/pycaret/internal/pycaret_experiment/unsupervised_experiment.py in setup(self, data, ordinal_features, numeric_features, categorical_features, date_features, text_features, ignore_features, keep_features, preprocess, imputation_type, numeric_imputation, categorical_imputation, text_features_method, max_encoding_ohe, encoding_method, polynomial_features, polynomial_degree, low_variance_threshold, group_features, group_names, remove_multicollinearity, multicollinearity_threshold, bin_numeric_features, remove_outliers, outliers_method, outliers_threshold, transformation, transformation_method, normalize, normalize_method, pca, pca_method, pca_components, custom_pipeline, custom_pipeline_position, n_jobs, use_gpu, html, session_id, system_log, log_experiment, experiment_name, experiment_custom_tags, log_plots, log_profile, log_data, verbose, memory, profile, profile_kwargs)
532 self._simple_imputation(numeric_imputation, categorical_imputation)
533 elif imputation_type is not None:
--> 534 raise ValueError(
535 "Invalid value for the imputation_type parameter, got "
536 f"{imputation_type}. Possible values are: simple, iterative."
ValueError: Invalid value for the imputation_type parameter, got iterative. Possible values are: simple, iterative.
```
### Installed Versions
<details>
System:
python: 3.9.7 (v3.9.7:1016ef3790, Aug 30 2021, 16:39:15) [Clang 6.0 (clang-600.0.57)]
executable: /Library/Frameworks/Python.framework/Versions/3.9/bin/python3.9
machine: macOS-10.16-x86_64-i386-64bit
PyCaret required dependencies:
pip: 22.2.1
setuptools: 57.4.0
pycaret: 3.0.0.rc3
IPython: 7.21.0
ipywidgets: 8.0.0rc1
tqdm: 4.62.0
numpy: 1.21.6
pandas: 1.4.3
jinja2: 3.0.2
scipy: 1.8.1
joblib: 1.1.0
sklearn: 1.1.1
pyod: Installed but version unavailable
imblearn: 0.9.1
category_encoders: 2.5.0
lightgbm: 3.3.1
numba: 0.55.2
requests: 2.27.1
matplotlib: 3.5.2
scikitplot: 0.3.7
yellowbrick: 1.4
plotly: 5.6.0
kaleido: 0.2.1
statsmodels: 0.13.2
sktime: 0.11.4
tbats: Installed but version unavailable
pmdarima: 1.8.5
psutil: 5.9.1</details>
|
process
|
clustering setup does not accept iterative in imputation type pycaret version checks i have checked that this issue has not already been reported i have confirmed this bug exists on the of pycaret i have confirmed this bug exists on the master branch of pycaret pip install u git issue description clustering setup throws error when specifying imputation type iterative valueerror invalid value for the imputation type parameter got iterative possible values are simple iterative reproducible example python from pycaret datasets import get data df get data diabetes test clustering setup data df imputation type iterative expected behavior should accept iterative as value actual results python traceback valueerror traceback most recent call last in df get data diabetes test clustering setup data df imputation type iterative library frameworks python framework versions lib site packages pycaret clustering functional py in setup data ordinal features numeric features categorical features date features text features ignore features keep features preprocess imputation type numeric imputation categorical imputation text features method max encoding ohe encoding method polynomial features polynomial degree low variance threshold remove multicollinearity multicollinearity threshold bin numeric features remove outliers outliers method outliers threshold transformation transformation method normalize normalize method pca pca method pca components custom pipeline custom pipeline position n jobs use gpu html session id system log log experiment experiment name experiment custom tags log plots log profile log data verbose memory profile profile kwargs exp experiment class set current experiment exp return exp setup data data ordinal features ordinal features library frameworks python framework versions lib site packages pycaret internal pycaret experiment unsupervised experiment py in setup self data ordinal features numeric features categorical features date features text features 
ignore features keep features preprocess imputation type numeric imputation categorical imputation text features method max encoding ohe encoding method polynomial features polynomial degree low variance threshold group features group names remove multicollinearity multicollinearity threshold bin numeric features remove outliers outliers method outliers threshold transformation transformation method normalize normalize method pca pca method pca components custom pipeline custom pipeline position n jobs use gpu html session id system log log experiment experiment name experiment custom tags log plots log profile log data verbose memory profile profile kwargs self simple imputation numeric imputation categorical imputation elif imputation type is not none raise valueerror invalid value for the imputation type parameter got f imputation type possible values are simple iterative valueerror invalid value for the imputation type parameter got iterative possible values are simple iterative installed versions system python aug executable library frameworks python framework versions bin machine macos pycaret required dependencies pip setuptools pycaret ipython ipywidgets tqdm numpy pandas scipy joblib sklearn pyod installed but version unavailable imblearn category encoders lightgbm numba requests matplotlib scikitplot yellowbrick plotly kaleido statsmodels sktime tbats installed but version unavailable pmdarima psutil
| 1
|
2,166
| 5,013,340,502
|
IssuesEvent
|
2016-12-13 14:17:59
|
opentrials/opentrials
|
https://api.github.com/repos/opentrials/opentrials
|
closed
|
Add unique constraint to "records.source_url"
|
3. In Development API data cleaning Processors
|
On #532 we found a few records with repeated `source_url` values, and found out that was a bug. We then added a functionality to the `record_remover` processor to remove trials with the same `source_url`. Now we need to make sure that doesn't happen again by adding a unique constraint on that table. After that's done and tested, we can open a new issue to remove the then-useless functionality from `record_remover`.
# Tasks
- [x] Investigate if we still have records with repeated `source_url` in our database.
- [x] If we still have bad data in our DB, find why they were created, fix the bug and run the `record_remover` processor to clean our DB
- [x] Add a unique constraint on `records.source_url`
- [x] Remove the logic on removing records with duplicated `source_url` from the `record_remover` processor, as with the constraint we'll guarantee that this will never happen
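The one-off duplicate check plus the constraint from the tasks above can be sketched as follows; the record shape and constraint name are assumptions, not OpenTrials' actual schema code:

```javascript
// One-off check from the first task: report source_url values that repeat.
function findDuplicateSourceUrls(records) {
  const counts = new Map();
  for (const { source_url } of records) {
    counts.set(source_url, (counts.get(source_url) || 0) + 1);
  }
  return [...counts.entries()].filter(([, n]) => n > 1).map(([url]) => url);
}

// The constraint itself is a single migration, roughly:
//   ALTER TABLE records
//     ADD CONSTRAINT records_source_url_unique UNIQUE (source_url);
```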
|
1.0
|
Add unique constraint to "records.source_url" - On #532 we found a few records with repeated `source_url` values, and found out that was a bug. We then added a functionality to the `record_remover` processor to remove trials with the same `source_url`. Now we need to make sure that doesn't happen again by adding a unique constraint on that table. After that's done and tested, we can open a new issue to remove the then-useless functionality from `record_remover`.
# Tasks
- [x] Investigate if we still have records with repeated `source_url` in our database.
- [x] If we still have bad data in our DB, find why they were created, fix the bug and run the `record_remover` processor to clean our DB
- [x] Add a unique constraint on `records.source_url`
- [x] Remove the logic on removing records with duplicated `source_url` from the `record_remover` processor, as with the constraint we'll guarantee that this will never happen
|
process
|
add unique constraint to records source url on we found a few records with repeated source url values and found out that was a bug we then added a functionality to the record remover processor to remove trials with the same source url now we need to make sure that doesn t happen again by adding a unique constraint on that table after that s done and tested we can open a new issue to remove the then useless functionality from record remover tasks investigate if we still have records with repeated source url in our database if we still have bad data in our db find why they were created fix the bug and run the record remover processor to clean our db add a unique constraint on records source url remove the logic on removing records with duplicated source url from the record remover processor as with the constraint we ll guarantee that this will never happen
| 1
|
1,899
| 4,726,734,212
|
IssuesEvent
|
2016-10-18 11:14:27
|
openvstorage/blktap
|
https://api.github.com/repos/openvstorage/blktap
|
closed
|
Support for blktap3
|
process_wontfix
|
The current implementation only supports blktap2. We might have to fix some bugs to also support blktap3
|
1.0
|
Support for blktap3 - The current implementation only supports blktap2. We might have to fix some bugs to also support blktap3
|
process
|
support for the current implementation only supports we might have to fix some bugs to also support
| 1
|
62,112
| 6,776,258,221
|
IssuesEvent
|
2017-10-27 17:08:30
|
emfoundation/ce100-app
|
https://api.github.com/repos/emfoundation/ce100-app
|
closed
|
Match challenges with challenges
|
bug please-test priority-2 T2h T4h user-story
|
As a primary user, I get an overview of which of my challenges matches other member’s challenges, So that I can see how many other members share a related challenge and may be potential collaborators in solving it.
|
1.0
|
Match challenges with challenges - As a primary user, I get an overview of which of my challenges matches other member’s challenges, So that I can see how many other members share a related challenge and may be potential collaborators in solving it.
|
non_process
|
match challenges with challenges as a primary user i get an overview of which of my challenges matches other member’s challenges so that i can see how many other members share a related challenge and may be potential collaborators in solving it
| 0
|
174,594
| 27,694,768,022
|
IssuesEvent
|
2023-03-14 00:41:09
|
waldreg/waldreg-client
|
https://api.github.com/repos/waldreg/waldreg-client
|
closed
|
[Feat] Navigation
|
design feat
|
### Purpose
> Implement the global navigation
### Task details
- [x] Navigation routing
- [x] Detail-page accordion
- [x] Collapse on click
- [ ] Visibility scope
- [x] Animation
|
1.0
|
[Feat] Navigation - ### Purpose
> Implement the global navigation
### Task details
- [x] Navigation routing
- [x] Detail-page accordion
- [x] Collapse on click
- [ ] Visibility scope
- [x] Animation
|
non_process
|
navigation purpose global navigation implementation task details navigation routing detail page accordion collapse on click visibility scope animation
| 0
|
317,964
| 9,672,091,737
|
IssuesEvent
|
2019-05-22 01:54:04
|
codeforboston/communityconnect
|
https://api.github.com/repos/codeforboston/communityconnect
|
closed
|
Add & use new Category Filter icon in Mobile View
|
CfB - good first issue low priority
|
get file from slack here: https://cfb-public.slack.com/archives/CC85SAJ0Z/p1549501248024600
looks something like 👇 , but use the file in slack ☝️

|
1.0
|
Add & use new Category Filter icon in Mobile View - get file from slack here: https://cfb-public.slack.com/archives/CC85SAJ0Z/p1549501248024600
looks something like 👇 , but use the file in slack ☝️

|
non_process
|
add use new category filter icon in mobile view get file from slack here looks something like 👇 but use the file in slack ☝️
| 0
|
168,947
| 6,392,535,671
|
IssuesEvent
|
2017-08-04 03:06:14
|
minio/minio-go
|
https://api.github.com/repos/minio/minio-go
|
closed
|
optimalPartInfo() default partSize results in running out of memory
|
priority: medium
|
Like #404, I'm streaming data which could be over 1GB and trying to use PutObjectStreaming(), but this results in optimalPartInfo(-1) being called and a partSize of 603979776 being returned.
The problem I face is that this partSize is used as, essentially, the size of the read buffer, and I'm running out of memory (I end up with errors like `fork/exec ...: cannot allocate memory` after the stream completes).
I've tried setting partSize = minPartSize in optimalPartInfo() when the input size is -1, and this solves my memory issue.
Regardless of exactly how it's done, can we get (or choose) a reasonable sized read buffer during PutObjectStreaming() so that we can stream in constant small amount of memory?
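The memory/object-size trade-off behind this report can be made concrete; the 10,000-part limit used below is an assumed S3-style constant, not minio-go's exact code:

```javascript
// With an unknown stream size, the chosen part size caps the largest object:
// partSize * maxParts >= maxObjectBytes must hold. The reported 603979776-byte
// buffer (576 MiB) supports ~5 TiB objects; a smaller buffer saves memory
// but lowers that ceiling.
function minPartSizeFor(maxObjectBytes, maxParts = 10000) {
  return Math.ceil(maxObjectBytes / maxParts);
}
```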
|
1.0
|
optimalPartInfo() default partSize results in running out of memory - Like #404, I'm streaming data which could be over 1GB and trying to use PutObjectStreaming(), but this results in optimalPartInfo(-1) being called and a partSize of 603979776 being returned.
The problem I face is that this partSize is used as, essentially, the size of the read buffer, and I'm running out of memory (I end up with errors like `fork/exec ...: cannot allocate memory` after the stream completes).
I've tried setting partSize = minPartSize in optimalPartInfo() when the input size is -1, and this solves my memory issue.
Regardless of exactly how it's done, can we get (or choose) a reasonable sized read buffer during PutObjectStreaming() so that we can stream in constant small amount of memory?
|
non_process
|
optimalpartinfo default partsize results in running out of memory like i m streaming data which could be over and trying to use putobjectstreaming but this results in optimalpartinfo being called and a partsize of being returned the problem i face is that this partsize is used as essentially the size of the read buffer and i m running out of memory i end up with errors like fork exec cannot allocate memory after the stream completes i ve tried setting partsize minpartsize in optimalpartinfo when the input size is and this solves my memory issue regardless of exactly how it s done can we get or choose a reasonable sized read buffer during putobjectstreaming so that we can stream in constant small amount of memory
| 0
|
14,274
| 17,227,209,810
|
IssuesEvent
|
2021-07-20 04:45:50
|
e4exp/paper_manager_abstract
|
https://api.github.com/repos/e4exp/paper_manager_abstract
|
reopened
|
Deduplicating Training Data Makes Language Models Better
|
Analysis Dataset Natural Language Processing
|
- https://arxiv.org/abs/2107.06499
- 2021
We find that existing language-model datasets contain many duplicated examples and long repeated substrings.
As a result, over 1% of the unprompted output of language models trained on these datasets is copied verbatim from the training data.
We develop two tools for deduplicating training datasets.
For example, they can remove from C4 a 61-word English sentence that is repeated more than 60,000 times.
Deduplication cuts the rate at which memorized text is emitted to a tenth, and lets us train models that reach the same or better accuracy in fewer training steps.
It also reduces train-test overlap, which affects more than 4% of the validation sets of standard datasets, enabling more accurate evaluation.
Code to reproduce our work and deduplicate datasets is released at this https URL.
https://github.com/google-research/deduplicate-text-datasets
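The simplest form of the deduplication described above — exact match on normalized text — can be sketched as follows; the paper's suffix-array substring dedup is far more involved, and the normalization here is an assumption:

```javascript
// Keep the first occurrence of each example, comparing on normalized text.
function deduplicate(examples) {
  const seen = new Set();
  const kept = [];
  for (const text of examples) {
    const key = text.trim().toLowerCase(); // crude normalization, for illustration
    if (!seen.has(key)) {
      seen.add(key);
      kept.push(text);
    }
  }
  return kept;
}
```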
|
1.0
|
Deduplicating Training Data Makes Language Models Better - - https://arxiv.org/abs/2107.06499
- 2021
We find that existing language-model datasets contain many duplicated examples and long repeated substrings.
As a result, over 1% of the unprompted output of language models trained on these datasets is copied verbatim from the training data.
We develop two tools for deduplicating training datasets.
For example, they can remove from C4 a 61-word English sentence that is repeated more than 60,000 times.
Deduplication cuts the rate at which memorized text is emitted to a tenth, and lets us train models that reach the same or better accuracy in fewer training steps.
It also reduces train-test overlap, which affects more than 4% of the validation sets of standard datasets, enabling more accurate evaluation.
Code to reproduce our work and deduplicate datasets is released at this https URL.
https://github.com/google-research/deduplicate-text-datasets
|
process
|
deduplicating training data makes language models better 既存の言語モデルのデータセットには、重複した例文や長い繰り返しのある部分が多く含まれていることがわかった。 その結果、 以上が、学習データからそのままコピーされていることがわかった。 我々は、 。 例えば、 。 重複排除により、 、少ない学習ステップで同等以上の精度を達成するモデルを学習することができる。 また、 %以上に影響する訓練とテストの重複を減らすことができ、より正確な評価が可能になります。 私たちの研究を再現し、データセットの重複排除を行うためのコードは、こちらのhttps urlで公開しています。
| 1
|
815,681
| 30,567,440,583
|
IssuesEvent
|
2023-07-20 18:56:39
|
upsonp/dart
|
https://api.github.com/repos/upsonp/dart
|
closed
|
Time|Position mission
|
bug Priority
|
Diana asked, "What happens if for some reason the GPS scraper on the ship doesn't get the Time|Position values and Time|Position appears as 'N/A' in the elog file.
Well. It turns out, nothing good. An error is thrown in the elog.py process_attachments_actions_time_location stating there's an issue with the MID object eg `Error processing actions for $@MID@$: 1`, but it's not really very descriptive of the issue.
I've added a check to ensure the time_position (A.K.A Time|Position) value fits a regex format, if not then a KeyError is raised stating exactly what element it is that's missing.
`"Missing or improperly set attributer 'time_position' mapped to 'Time|Position' for $@MID@$: 1"`
|
1.0
|
Time|Position mission - Diana asked, "What happens if for some reason the GPS scraper on the ship doesn't get the Time|Position values and Time|Position appears as 'N/A' in the elog file.
Well. It turns out, nothing good. An error is thrown in the elog.py process_attachments_actions_time_location stating there's an issue with the MID object eg `Error processing actions for $@MID@$: 1`, but it's not really very descriptive of the issue.
I've added a check to ensure the time_position (A.K.A Time|Position) value fits a regex format, if not then a KeyError is raised stating exactly what element it is that's missing.
`"Missing or improperly set attributer 'time_position' mapped to 'Time|Position' for $@MID@$: 1"`
|
non_process
|
time position mission diana asked what happens if for some reason the gps scraper on the ship doesn t get the time position values and time position appears as n a in the elog file well it turns out nothing good an error is thrown in the elog py process attachments actions time location stating there s an issue with the mid object eg error processing actions for mid but it s not really very descriptive of the issue i ve added a check to ensure the time position a k a time position value fits a regex format if not then a keyerror is raised stating exactly what element it is that s mission missing or improperly set attributer time position mapped to time position for mid
| 0
|
49,457
| 7,517,378,905
|
IssuesEvent
|
2018-04-12 03:13:49
|
StarChart-Labs/alloy
|
https://api.github.com/repos/StarChart-Labs/alloy
|
opened
|
Document Migration From Guava Functions to Java 8 Lambdas
|
documentation
|
[Guava Functions](https://github.com/google/guava/blob/master/guava/src/com/google/common/base/Functions.java) is completely deprecated in favor of Java 8 lambdas - document this migration path in the Guava -> StarChart migration documentation
|
1.0
|
Document Migration From Guava Functions to Java 8 Lambdas - [Guava Functions](https://github.com/google/guava/blob/master/guava/src/com/google/common/base/Functions.java) is completely deprecated in favor of Java 8 lambdas - document this migration path in the Guava -> StarChart migration documentation
|
non_process
|
document migration from guava functions to java lambdas is completely deprecated in favor of java s standardcharsets document this migration path in the guava starchart migration documentation
| 0
|
121,043
| 15,833,887,717
|
IssuesEvent
|
2021-04-06 16:07:42
|
cagov/ui-claim-tracker
|
https://api.github.com/repos/cagov/ui-claim-tracker
|
reopened
|
Coordinate with EDD to understand required pre-launch testing/review processes
|
Design Design Ops Product Size: M
|
### Task
Find out what's involved in required pre-launch testing/review processes (e.g. accessibility testing, usability review, user acceptance testing, security assessment and authorization) -- especially timelines.
- @dianagriffin will take point on confirming process/expectations for security assessment (Todd Ibbotson) and UAT (Alex/Al)
### Acceptance Criteria
- [ ] Process/expectations identified for security assessment and UAT
- [ ] Required pre-launch testing/review processes have been incorporated into our MVP delivery plan.
|
2.0
|
Coordinate with EDD to understand required pre-launch testing/review processes - ### Task
Find out what's involved in required pre-launch testing/review processes (e.g. accessibility testing, usability review, user acceptance testing, security assessment and authorization) -- especially timelines.
- @dianagriffin will take point on confirming process/expectations for security assessment (Todd Ibbotson) and UAT (Alex/Al)
### Acceptance Criteria
- [ ] Process/expectations identified for security assessment and UAT
- [ ] Required pre-launch testing/review processes have been incorporated into our MVP delivery plan.
|
non_process
|
coordinate with edd to understand required pre launch testing review processes task find out what s involved in required pre launch testing review processes e g accessibility testing usability review user acceptance testing security assessment and authorization especially timelines dianagriffin will take point on confirming process expectations for security assessment todd ibbotson and uat alex al acceptance criteria process expectations identified for security assessment and uat required pre launch testing review processes have been incorporated into our mvp delivery plan
| 0
|
13,973
| 16,745,700,550
|
IssuesEvent
|
2021-06-11 15:14:50
|
dtcenter/MET
|
https://api.github.com/repos/dtcenter/MET
|
opened
|
Enhance PB2NC to derive Mixed-Layer CAPE (MLCAPE).
|
MET: Grid PreProcessing Tools alert: NEED ACCOUNT KEY alert: NEED PROJECT ASSIGNMENT priority: high requestor: NOAA/EMC type: enhancement
|
## Describe the Enhancement ##
On 6/8/21, NOAA/EMC requested that the PB2NC tool be enhanced to derive additional variations of CAPE. They are most interested in mixed-layer CAPE (representing the average characteristics of the boundary layer). Mixed-layer is the "flavor" of CAPE that was recommended to be verified in our models, via the metrics workshop.
While PB2NC does currently derive CAPE, it is surface-based CAPE and it's done by calling this Fortran sub-routine:
https://github.com/dtcenter/MET/blob/main_v10.0/met/src/tools/other/pb2nc/calpbl.f
That routine was taken from the NOAA/EMC VSDB verification code, but no such code exists for mixed-layer CAPE. However, it is defined in UPP.
This task is to enhance PB2NC to derive the mixed-layer variation of CAPE. Consider some of the implementation options:
- We could keep the scope limited and lift/adapt the code for deriving MLCAPE from the UPP source code. Should we keep that code in Fortran or re-write in C++?
- We could expand the scope considerably and investigate a more direct interface to the UPP derivation routines. But this would add a new dependency on UPP.
### Time Estimate ###
*Estimate the amount of work required here.*
*Issues should represent approximately 1 to 3 days of work.*
### Sub-Issues ###
Consider breaking the enhancement down into sub-issues.
- [ ] *Add a checkbox for each sub-issue here.*
### Relevant Deadlines ###
*List relevant project deadlines here or state NONE.*
### Funding Source ###
*Define the source of funding and account keys here or state NONE.*
## Define the Metadata ##
### Assignee ###
- [ ] Select **engineer(s)** or **no engineer** required: not sure yet
- [x] Select **scientist(s)** or **no scientist** required: Perry S
### Labels ###
- [x] Select **component(s)**
- [x] Select **priority**
- [x] Select **requestor(s)**
### Projects and Milestone ###
- [x] Select **Repository** and/or **Organization** level **Project(s)** or add **alert: NEED PROJECT ASSIGNMENT** label
- [x] Select **Milestone** as the next official version or **Future Versions**
## Define Related Issue(s) ##
Consider the impact to the other METplus components.
- [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose)
There should be no impacts to other METplus repositories.
## Enhancement Checklist ##
See the [METplus Workflow](https://dtcenter.github.io/METplus/Contributors_Guide/github_workflow.html) for details.
- [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**.
- [ ] Fork this repository or create a branch of **develop**.
Branch name: `feature_<Issue Number>_<Description>`
- [ ] Complete the development and test your changes.
- [ ] Add/update log messages for easier debugging.
- [ ] Add/update unit tests.
- [ ] Add/update documentation.
- [ ] Push local changes to GitHub.
- [ ] Submit a pull request to merge into **develop**.
Pull request: `feature <Issue Number> <Description>`
- [ ] Define the pull request metadata, as permissions allow.
Select: **Reviewer(s)** and **Linked issues**
Select: **Repository** level development cycle **Project** for the next official release
Select: **Milestone** as the next official version
- [ ] Iterate until the reviewer(s) accept and merge your changes.
- [ ] Delete your fork or branch.
- [ ] Close this issue.
|
1.0
|
Enhance PB2NC to derive Mixed-Layer CAPE (MLCAPE). - ## Describe the Enhancement ##
On 6/8/21, NOAA/EMC requested that the PB2NC tool be enhanced to derive additional variations of CAPE. They are most interested in mixed-layer CAPE (representing the average characteristics of the boundary layer). Mixed-layer is the "flavor" of CAPE that was recommended to be verified in our models, via the metrics workshop.
While PB2NC does currently derive CAPE, it is surface-based CAPE and it's done by calling this Fortran sub-routine:
https://github.com/dtcenter/MET/blob/main_v10.0/met/src/tools/other/pb2nc/calpbl.f
That routine was taken from the NOAA/EMC VSDB verification code, but no such code exists for mixed-layer CAPE. However, it is defined in UPP.
This task is to enhance PB2NC to derive the mixed-layer variation of CAPE. Consider some of the implementation options:
- We could keep the scope limited and lift/adapt the code for deriving MLCAPE from the UPP source code. Should we keep that code in Fortran or re-write in C++?
- We could expand the scope considerably and investigate a more direct interface to the UPP derivation routines. But this would add a new dependency on UPP.
### Time Estimate ###
*Estimate the amount of work required here.*
*Issues should represent approximately 1 to 3 days of work.*
### Sub-Issues ###
Consider breaking the enhancement down into sub-issues.
- [ ] *Add a checkbox for each sub-issue here.*
### Relevant Deadlines ###
*List relevant project deadlines here or state NONE.*
### Funding Source ###
*Define the source of funding and account keys here or state NONE.*
## Define the Metadata ##
### Assignee ###
- [ ] Select **engineer(s)** or **no engineer** required: not sure yet
- [x] Select **scientist(s)** or **no scientist** required: Perry S
### Labels ###
- [x] Select **component(s)**
- [x] Select **priority**
- [x] Select **requestor(s)**
### Projects and Milestone ###
- [x] Select **Repository** and/or **Organization** level **Project(s)** or add **alert: NEED PROJECT ASSIGNMENT** label
- [x] Select **Milestone** as the next official version or **Future Versions**
## Define Related Issue(s) ##
Consider the impact to the other METplus components.
- [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose)
There should be no impacts to other METplus repositories.
## Enhancement Checklist ##
See the [METplus Workflow](https://dtcenter.github.io/METplus/Contributors_Guide/github_workflow.html) for details.
- [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**.
- [ ] Fork this repository or create a branch of **develop**.
Branch name: `feature_<Issue Number>_<Description>`
- [ ] Complete the development and test your changes.
- [ ] Add/update log messages for easier debugging.
- [ ] Add/update unit tests.
- [ ] Add/update documentation.
- [ ] Push local changes to GitHub.
- [ ] Submit a pull request to merge into **develop**.
Pull request: `feature <Issue Number> <Description>`
- [ ] Define the pull request metadata, as permissions allow.
Select: **Reviewer(s)** and **Linked issues**
Select: **Repository** level development cycle **Project** for the next official release
Select: **Milestone** as the next official version
- [ ] Iterate until the reviewer(s) accept and merge your changes.
- [ ] Delete your fork or branch.
- [ ] Close this issue.
|
process
|
enhance to derive mixed layer cape mlcape describe the enhancement on noaa emc requested that the tool be enhanced to derive additional variations of cape they are most interested in mixed layer cape representing the average characteristics of the boundary layer mixed layer is the flavor of cape that was recommended to be verified in our models via the metrics workshop while does currently derive cape it is surface based cape and its done by calling this fortran sub routine that routine was taken from the noaa emc vsdb verification code but no such code exists for mixed layer cape however it is defined in upp this task is to enhance to derive the mixed layer variation of cape consider some of the implementation options we could keep the scope limited and lift adapt the code for deriving mlcape from the upp source code should we keep that code in fortran or re write in c we could expand the scope considerably and investigate a more direct interface to the upp derivation routines but this would add a new dependency on upp time estimate estimate the amount of work required here issues should represent approximately to days of work sub issues consider breaking the enhancement down into sub issues add a checkbox for each sub issue here relevant deadlines list relevant project deadlines here or state none funding source define the source of funding and account keys here or state none define the metadata assignee select engineer s or no engineer required not sure yet select scientist s or no scientist required perry s labels select component s select priority select requestor s projects and milestone select repository and or organization level project s or add alert need project assignment label select milestone as the next official version or future versions define related issue s consider the impact to the other metplus components there should be no impacts to other metplus repositories enhancement checklist see the for details complete the issue definition above 
including the time estimate and funding source fork this repository or create a branch of develop branch name feature complete the development and test your changes add update log messages for easier debugging add update unit tests add update documentation push local changes to github submit a pull request to merge into develop pull request feature define the pull request metadata as permissions allow select reviewer s and linked issues select repository level development cycle project for the next official release select milestone as the next official version iterate until the reviewer s accept and merge your changes delete your fork or branch close this issue
| 1
|
22,048
| 30,570,680,477
|
IssuesEvent
|
2023-07-20 21:52:39
|
UnitTestBot/UTBotJava
|
https://api.github.com/repos/UnitTestBot/UTBotJava
|
closed
|
Convert `UtExecutionInstrumentation` from object to class
|
ctg-enhancement comp-instrumented-process comp-spring
|
**Description**
`UtExecutionInstrumentation` is not a singleton by design.
However, it was implemented like this because of some problems with serialization.
Those problems need to be fixed now because we are introducing `SpringUtExecutionInstrumentation`, and using singletons becomes much more inconvenient.
|
1.0
|
Convert `UtExecutionInstrumentation` from object to class - **Description**
`UtExecutionInstrumentation` is not a singleton by design.
However, it was implemented like this because of some problems with serialization.
Those problems need to be fixed now because we are introducing `SpringUtExecutionInstrumentation`, and using singletons becomes much more inconvenient.
|
process
|
convert utexecutioninstrumentation from object to class description utexecutioninstrumentation is not a singleton by design however it was implemented like this because of some problems with serialization they need to be fixed now because we introduce springutexecutioninstrumentation and using singletons becomes much more inconvenient
| 1
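The motivation in the record above — a singleton blocks introducing a subclass variant — can be illustrated with a minimal Python sketch. The class names mirror the Kotlin ones, but the API shown here is invented for illustration and is not UnitTestBot's actual interface.

```python
class ExecutionInstrumentation:
    """Instantiable base class. Were this a language-level singleton
    (like a Kotlin `object`), the Spring variant below could neither
    subclass it nor carry its own per-instance state."""

    def __init__(self, settings):
        self.settings = settings

    def instrument(self, target):
        return f"instrumented {target} with {self.settings}"


class SpringExecutionInstrumentation(ExecutionInstrumentation):
    """Variant that augments the base behavior per instance."""

    def instrument(self, target):
        return "spring:" + super().instrument(target)
```

Converting the singleton to a class is what makes the second subclass possible at all.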
|
7,876
| 11,046,129,164
|
IssuesEvent
|
2019-12-09 16:18:52
|
prisma/prisma2
|
https://api.github.com/repos/prisma/prisma2
|
closed
|
Lift and dev, popup to create a new database missing
|
bug/2-confirmed kind/bug kind/regression process/candidate
|
```
divyendusingh [footybot]$ prisma2 --version
prisma2@2.0.0-preview017.2, binary version: 6159bf3a263921c3c28ee68e2c9e130b5a69c293
```
To reproduce:
1. Create a schema file with a DB url where the database does not exist
2. Run lift save or dev
```
divyendusingh [footybot]$ prisma2 lift save --name init
Error: Could not connect to database: Starting a mysql pool with 13 connections.
[2019-12-05T09:38:10Z ERROR migration_engine] Database 'footyboy-dev1' does not exist.
```
|
1.0
|
Lift and dev, popup to create a new database missing - ```
divyendusingh [footybot]$ prisma2 --version
prisma2@2.0.0-preview017.2, binary version: 6159bf3a263921c3c28ee68e2c9e130b5a69c293
```
To reproduce:
1. Create a schema file with a DB url where the database does not exist
2. Run lift save or dev
```
divyendusingh [footybot]$ prisma2 lift save --name init
Error: Could not connect to database: Starting a mysql pool with 13 connections.
[2019-12-05T09:38:10Z ERROR migration_engine] Database 'footyboy-dev1' does not exist.
```
|
process
|
lift and dev popup to create a new database missing divyendusingh version binary version to reproduce create a schema file with a db url where the database does not exist run lift save or dev divyendusingh lift save name init error could not connect to database starting a mysql pool with connections database footyboy does not exist
| 1
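The missing "create database" popup the record above asks for could hinge on logic like the following Python sketch: parse the database name out of the connection URL, then branch on whether it exists. This is not prisma2's implementation — the URL, names, and return values are hypothetical.

```python
from urllib.parse import urlparse

def database_name(db_url):
    """Extract the database name from a connection URL such as
    mysql://user:pass@host:3306/footyboy-dev1 (illustrative value)."""
    return urlparse(db_url).path.lstrip("/")

def plan_migration(db_url, existing_databases):
    """Return the action a CLI could take: migrate directly, or offer
    to create the missing database first instead of failing with a
    raw 'Database does not exist' error."""
    name = database_name(db_url)
    if name in existing_databases:
        return f"migrate {name}"
    return f"prompt: create database '{name}'?"
```

The point is only that the tool can detect the missing database up front and ask, rather than surfacing the migration engine's error.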
|
130,299
| 18,155,709,597
|
IssuesEvent
|
2021-09-27 01:05:41
|
snowdensb/Leo
|
https://api.github.com/repos/snowdensb/Leo
|
opened
|
CVE-2021-36374 (Medium) detected in ant-1.8.1.jar
|
security vulnerability
|
## CVE-2021-36374 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ant-1.8.1.jar</b></p></summary>
<p>Apache Ant</p>
<p>Library home page: <a href="http://ant.apache.org/">http://ant.apache.org/</a></p>
<p>Path to dependency file: Leo/core/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar,/home/wss-scanner/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar,/home/wss-scanner/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar</p>
<p>
Dependency Hierarchy:
- uimaj-as-activemq-2.9.0.jar (Root Library)
- activemq-partition-5.14.0.jar
- org.linkedin.zookeeper-impl-1.4.0.jar
- org.linkedin.util-groovy-1.7.1.jar
- :x: **ant-1.8.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
When reading a specially crafted ZIP archive, or a derived format, an Apache Ant build can be made to allocate large amounts of memory that leads to an out of memory error, even for small inputs. This can be used to disrupt builds using Apache Ant. Commonly used derived formats from ZIP archives are for instance JAR files and many office files. Apache Ant prior to 1.9.16 and 1.10.11 were affected.
<p>Publish Date: 2021-07-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-36374>CVE-2021-36374</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://ant.apache.org/security.html">https://ant.apache.org/security.html</a></p>
<p>Release Date: 2021-07-14</p>
<p>Fix Resolution: org.apache.ant:ant:1.9.16,1.10.11</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.ant","packageName":"ant","packageVersion":"1.8.1","packageFilePaths":["/core/pom.xml","/service/pom.xml","/client/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.uima:uimaj-as-activemq:2.9.0;org.apache.activemq:activemq-partition:5.14.0;org.linkedin:org.linkedin.zookeeper-impl:1.4.0;org.linkedin:org.linkedin.util-groovy:1.7.1;org.apache.ant:ant:1.8.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.ant:ant:1.9.16,1.10.11"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-36374","vulnerabilityDetails":"When reading a specially crafted ZIP archive, or a derived formats, an Apache Ant build can be made to allocate large amounts of memory that leads to an out of memory error, even for small inputs. This can be used to disrupt builds using Apache Ant. Commonly used derived formats from ZIP archives are for instance JAR files and many office files. Apache Ant prior to 1.9.16 and 1.10.11 were affected.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-36374","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2021-36374 (Medium) detected in ant-1.8.1.jar - ## CVE-2021-36374 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ant-1.8.1.jar</b></p></summary>
<p>Apache Ant</p>
<p>Library home page: <a href="http://ant.apache.org/">http://ant.apache.org/</a></p>
<p>Path to dependency file: Leo/core/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar,/home/wss-scanner/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar,/home/wss-scanner/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar</p>
<p>
Dependency Hierarchy:
- uimaj-as-activemq-2.9.0.jar (Root Library)
- activemq-partition-5.14.0.jar
- org.linkedin.zookeeper-impl-1.4.0.jar
- org.linkedin.util-groovy-1.7.1.jar
- :x: **ant-1.8.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
When reading a specially crafted ZIP archive, or a derived format, an Apache Ant build can be made to allocate large amounts of memory that leads to an out of memory error, even for small inputs. This can be used to disrupt builds using Apache Ant. Commonly used derived formats from ZIP archives are for instance JAR files and many office files. Apache Ant prior to 1.9.16 and 1.10.11 were affected.
<p>Publish Date: 2021-07-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-36374>CVE-2021-36374</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://ant.apache.org/security.html">https://ant.apache.org/security.html</a></p>
<p>Release Date: 2021-07-14</p>
<p>Fix Resolution: org.apache.ant:ant:1.9.16,1.10.11</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.ant","packageName":"ant","packageVersion":"1.8.1","packageFilePaths":["/core/pom.xml","/service/pom.xml","/client/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.uima:uimaj-as-activemq:2.9.0;org.apache.activemq:activemq-partition:5.14.0;org.linkedin:org.linkedin.zookeeper-impl:1.4.0;org.linkedin:org.linkedin.util-groovy:1.7.1;org.apache.ant:ant:1.8.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.ant:ant:1.9.16,1.10.11"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-36374","vulnerabilityDetails":"When reading a specially crafted ZIP archive, or a derived formats, an Apache Ant build can be made to allocate large amounts of memory that leads to an out of memory error, even for small inputs. This can be used to disrupt builds using Apache Ant. Commonly used derived formats from ZIP archives are for instance JAR files and many office files. Apache Ant prior to 1.9.16 and 1.10.11 were affected.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-36374","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in ant jar cve medium severity vulnerability vulnerable library ant jar apache ant library home page a href path to dependency file leo core pom xml path to vulnerable library home wss scanner repository org apache ant ant ant jar home wss scanner repository org apache ant ant ant jar home wss scanner repository org apache ant ant ant jar dependency hierarchy uimaj as activemq jar root library activemq partition jar org linkedin zookeeper impl jar org linkedin util groovy jar x ant jar vulnerable library found in base branch master vulnerability details when reading a specially crafted zip archive or a derived formats an apache ant build can be made to allocate large amounts of memory that leads to an out of memory error even for small inputs this can be used to disrupt builds using apache ant commonly used derived formats from zip archives are for instance jar files and many office files apache ant prior to and were affected publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache ant ant isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org apache uima uimaj as activemq org apache activemq activemq partition org linkedin org linkedin zookeeper impl org linkedin org linkedin util groovy org apache ant ant isminimumfixversionavailable true minimumfixversion org apache ant ant basebranches vulnerabilityidentifier cve vulnerabilitydetails when reading a specially crafted zip archive or a derived formats an apache ant build can be made to allocate large amounts of memory that leads to an out of memory error even for small 
inputs this can be used to disrupt builds using apache ant commonly used derived formats from zip archives are for instance jar files and many office files apache ant prior to and were affected vulnerabilityurl
| 0
|
10,263
| 13,111,155,202
|
IssuesEvent
|
2020-08-04 22:14:58
|
googleapis/nodejs-translate
|
https://api.github.com/repos/googleapis/nodejs-translate
|
closed
|
Move automl samples to googleapis/nodejs-automl
|
api: translate type: process
|
We have `automl` samples sitting in this repo. While they're samples about translation, they're unrelated to this npm module. We should move these samples over to googleapis/nodejs-automl, so that they're tested as part of the release process of that module.
|
1.0
|
Move automl samples to googleapis/nodejs-automl - We have `automl` samples sitting in this repo. While they're samples about translation, they're unrelated to this npm module. We should move these samples over to googleapis/nodejs-automl, so that they're tested as part of the release process of that module.
|
process
|
move automl samples to googleapis nodejs automl we have automl samples sitting in this repo while they re samples about translation they re unrelated to this npm module we should move these samples over to googleapis nodejs automl so that they re tested as part of the release process of that module
| 1
|
18,137
| 24,182,358,717
|
IssuesEvent
|
2022-09-23 10:04:50
|
apache/arrow-rs
|
https://api.github.com/repos/apache/arrow-rs
|
closed
|
Figure out and document the versioning and release cadence for object_store
|
development-process object-store
|
in https://github.com/apache/arrow-rs/issues/2030 we incorporated the [object_store](https://crates.io/crates/object_store) crate into the arrow-rs repository and under the governance of the Apache Arrow project.
Now we need to figure out:
- [ ] What should trigger a release of a `object_store` (on a schedule, when someone wants, etc?)
- [ ] What versioning scheme should we use (e.g. should it be independent from the arrow/parquet/parquet-flight versions?)
- [ ] File tickets / make release scripts to run the release process
|
1.0
|
Figure out and document the versioning and release cadence for object_store - in https://github.com/apache/arrow-rs/issues/2030 we incorporated the [object_store](https://crates.io/crates/object_store) crate into the arrow-rs repository and under the governance of the Apache Arrow project.
Now we need to figure out:
- [ ] What should trigger a release of a `object_store` (on a schedule, when someone wants, etc?)
- [ ] What versioning scheme should we use (e.g. should it be independent from the arrow/parquet/parquet-flight versions?)
- [ ] File tickets / make release scripts to run the release process
|
process
|
figure out and document the versioning and release cadence for object store in we incorporated the crate into the arrow rs repository and under the governance of the apache arrow project now we need to figure out what should trigger a release of a object store on a schedule when someone wants etc what versioning scheme should we use e g should it be independent from the arrow parquet parquet flight versions file tickets make release scripts to run the release process
| 1
|
64,133
| 8,712,142,824
|
IssuesEvent
|
2018-12-06 21:19:16
|
zendframework/zend-view
|
https://api.github.com/repos/zendframework/zend-view
|
closed
|
Issues with Cycle View Helper docs
|
bug documentation
|
Can someone explain to me what this is: {{{PHP2}}}?
Also later in the same doc, there is section about:
> You can also cycle in reverse, using the `prev()` method instead of `next()`:
However in the example there is neither a `prev()` nor `next()` method used.
```
<table>
<?php foreach ($this->books as $book): ?>
<tr style="background-color: {{{PHP2}}}">
<td><?php echo $this->escapeHtml($book['author']) ?></td>
</tr>
<?php endforeach ?>
</table>
```
How Cycle plugin works then?
|
1.0
|
Issues with Cycle View Helper docs - Can someone explain to me what this is: {{{PHP2}}}?
Also later in the same doc, there is section about:
> You can also cycle in reverse, using the `prev()` method instead of `next()`:
However in the example there is neither a `prev()` nor `next()` method used.
```
<table>
<?php foreach ($this->books as $book): ?>
<tr style="background-color: {{{PHP2}}}">
<td><?php echo $this->escapeHtml($book['author']) ?></td>
</tr>
<?php endforeach ?>
</table>
```
How Cycle plugin works then?
|
non_process
|
issues with cycle view helper docs can someone explain to me what is this also later in the same doc there is section about you can also cycle in reverse using the prev method instead of next however in the example there is neither a prev nor next method used books as book escapehtml book how cycle plugin works then
| 0
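The `next()`/`prev()` behavior the Zend docs describe (but fail to show, thanks to the `{{{PHP2}}}` placeholder) can be illustrated with a small Python analogue. The class below is a sketch of the idea — advancing and reversing through a fixed list of values, e.g. row background colors — not the PHP helper's actual API.

```python
class Cycle:
    """Minimal analogue of a view-helper cycle: next() advances through
    the values and wraps around; prev() walks them in reverse."""

    def __init__(self, values):
        self.values = list(values)
        self.pos = -1  # before the first element

    def next(self):
        self.pos = (self.pos + 1) % len(self.values)
        return self.values[self.pos]

    def prev(self):
        self.pos = (self.pos - 1) % len(self.values)
        return self.values[self.pos]
```

In the docs' table example, each row would call `next()` (or `prev()` for the reverse order) to pick the alternating background color.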
|
8,334
| 11,494,280,170
|
IssuesEvent
|
2020-02-12 01:06:28
|
kubeflow/kfserving
|
https://api.github.com/repos/kubeflow/kfserving
|
closed
|
Add licenses for dependencies in KFServing images for Kubeflow 1.0
|
area/inference kind/process priority/p0
|
Related to kubeflow/testing#539 https://github.com/kubeflow/kubeflow/issues/4061
Need to add all license content for third-party dependencies in KFServing controller images.
For Golang images, you may find these instructions helpful to extract all license files:
https://github.com/kubeflow/metadata/blob/master/third_party_licenses/README.md
|
1.0
|
Add licenses for dependencies in KFServing images for Kubeflow 1.0 - Related to kubeflow/testing#539 https://github.com/kubeflow/kubeflow/issues/4061
Need to add all license content for third-party dependencies in KFServing controller images.
For Golang images, you may find these instructions helpful to extract all license files:
https://github.com/kubeflow/metadata/blob/master/third_party_licenses/README.md
|
process
|
add licenses for dependencies in kfserving images for kubeflow related to kubeflow testing need to add all license content for third party dependencies in kfserving controller images for golang images you may find these instructions helpful to extract all license files
| 1
|
2,269
| 5,103,054,593
|
IssuesEvent
|
2017-01-04 20:11:17
|
P0cL4bs/WiFi-Pumpkin
|
https://api.github.com/repos/P0cL4bs/WiFi-Pumpkin
|
closed
|
no Internet Connection on Victim-device
|
in process priority solved
|
## What's the problem (or question)?
The Target-Device (Android Phone) doesn't have internet when connected to PumpAP.
#### Please tell us details about your environment.
Wifi: Atheros Communications, Inc. AR9271 802.11n
+ Ethernet-connection
.
Wifi-Pumpkin: v0.8.4
* Virtual Machine (yes or no and which): no
Linux Debian 4.8.0-kali2-amd64 #1 SMP Debian 4.8.11-1kali1 (2016-12-08) x86_64 GNU/Linux
|
1.0
|
no Internet Connection on Victim-device - ## What's the problem (or question)?
The Target-Device (Android Phone) doesn't have internet when connected to PumpAP.
#### Please tell us details about your environment.
Wifi: Atheros Communications, Inc. AR9271 802.11n
+ Ethernet-connection
.
Wifi-Pumpkin: v0.8.4
* Virtual Machine (yes or no and which): no
Linux Debian 4.8.0-kali2-amd64 #1 SMP Debian 4.8.11-1kali1 (2016-12-08) x86_64 GNU/Linux
|
process
|
no internet connection on victim device what s the problem or question the target device android phone doesn t have internet when connected to pumpap please tell us details about your environment wifi atheros communications inc ethernet connection wifi pumpkin virtual machine yes or no and which no linux debian smp debian gnu linux
| 1
|
2,737
| 5,623,361,112
|
IssuesEvent
|
2017-04-04 14:49:11
|
convox/rack
|
https://api.github.com/repos/convox/rack
|
closed
|
ECS autoscaling
|
area/processes
|
data: {u'text': u'https://github.com/convox/rack/issues/1797', u'list': {u'name': u'Rack', u'id': u'5899e3224255b666d6e7d9e9'}, u'board': {u'shortLink': u'XxvWeICx', u'id': u'587425638297353d93b2d4af', u'name': u'Roadmap'}, u'card': {u'idShort': 52, u'shortLink': u'ZYLqsaH5', u'id': u'58745985705fe4c547ec34c5', u'name': u'ECS autoscaling'}}
date: 2017-02-08T13:15:39.359Z
id: 589b19fb76fb6bae1de23bb1
idMemberCreator: 4f3c59c1d9d3e56b446861d8
memberCreator: {u'username': u'ddollar', u'fullName': u'David Dollar', u'avatarHash': u'4b941446da3c56990164d470ea22bb2f', u'id': u'4f3c59c1d9d3e56b446861d8', u'initials': u'DD'}
type: commentCard
https://trello.com/c/ZYLqsaH5/52-ecs-autoscaling
|
1.0
|
ECS autoscaling -
data: {u'text': u'https://github.com/convox/rack/issues/1797', u'list': {u'name': u'Rack', u'id': u'5899e3224255b666d6e7d9e9'}, u'board': {u'shortLink': u'XxvWeICx', u'id': u'587425638297353d93b2d4af', u'name': u'Roadmap'}, u'card': {u'idShort': 52, u'shortLink': u'ZYLqsaH5', u'id': u'58745985705fe4c547ec34c5', u'name': u'ECS autoscaling'}}
date: 2017-02-08T13:15:39.359Z
id: 589b19fb76fb6bae1de23bb1
idMemberCreator: 4f3c59c1d9d3e56b446861d8
memberCreator: {u'username': u'ddollar', u'fullName': u'David Dollar', u'avatarHash': u'4b941446da3c56990164d470ea22bb2f', u'id': u'4f3c59c1d9d3e56b446861d8', u'initials': u'DD'}
type: commentCard
https://trello.com/c/ZYLqsaH5/52-ecs-autoscaling
|
process
|
ecs autoscaling data u text u u list u name u rack u id u u board u shortlink u xxvweicx u id u u name u roadmap u card u idshort u shortlink u u id u u name u ecs autoscaling date id idmembercreator membercreator u username u ddollar u fullname u david dollar u avatarhash u u id u u initials u dd type commentcard
| 1
|
620,069
| 19,547,618,375
|
IssuesEvent
|
2022-01-02 06:13:30
|
ramazankilimci/grammar-checker
|
https://api.github.com/repos/ramazankilimci/grammar-checker
|
closed
|
Create kubernetes deployment YAML file
|
field/devops priority/important-soon
|
**Is your feature request related to a problem? Please describe.**
It is related to the deployment and performance of the application.
**Describe the solution you'd like**
Kubernetes is needed for Azure Kubernetes Service deployment. YAML file should be created.
**Describe alternatives you've considered**
Alternative can be using a Azure Container Instance Service.
**Additional context**
N/A
|
1.0
|
Create kubernetes deployment YAML file - **Is your feature request related to a problem? Please describe.**
It is related to the deployment and performance of the application.
**Describe the solution you'd like**
Kubernetes is needed for Azure Kubernetes Service deployment. YAML file should be created.
**Describe alternatives you've considered**
Alternative can be using a Azure Container Instance Service.
**Additional context**
N/A
|
non_process
|
create kubernetes deployment yaml file is your feature request related to a problem please describe it is related to deployment and performance of the application describe the solution you d like kubernetes is needed for azure kubernetes service deployment yaml file should be created describe alternatives you ve considered alternative can be using a azure container instance service additional context n a
| 0
|
326,026
| 9,941,767,771
|
IssuesEvent
|
2019-07-03 12:27:00
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
closed
|
[Coverity CID :199436]Uninitialized variables in /subsys/net/lib/sockets/sockets.c
|
Coverity area: Networking bug priority: medium
|
Static code scan issues seen in File: /subsys/net/lib/sockets/sockets.c
Category: Uninitialized variables
Function: zsock_getsockname_ctx
Component: Networking
CID: 199436
Please fix or provide comments to square it off in coverity in the link: https://scan9.coverity.com/reports.htm#v32951/p12996
|
1.0
|
[Coverity CID :199436]Uninitialized variables in /subsys/net/lib/sockets/sockets.c - Static code scan issues seen in File: /subsys/net/lib/sockets/sockets.c
Category: Uninitialized variables
Function: zsock_getsockname_ctx
Component: Networking
CID: 199436
Please fix or provide comments to square it off in coverity in the link: https://scan9.coverity.com/reports.htm#v32951/p12996
|
non_process
|
uninitialized variables in subsys net lib sockets sockets c static code scan issues seen in file subsys net lib sockets sockets c category uninitialized variables function zsock getsockname ctx component networking cid please fix or provide comments to square it off in coverity in the link
| 0
|
66,659
| 12,810,608,436
|
IssuesEvent
|
2020-07-03 19:17:16
|
joomla/joomla-cms
|
https://api.github.com/repos/joomla/joomla-cms
|
reopened
|
[4.0] RFC Empty headings [a11y]
|
No Code Attached Yet
|
https://dequeuniversity.com/rules/axe/3.5/empty-heading
> Screen readers alert users to the presence of a heading tag. If the heading is empty or the text cannot be accessed, this could either confuse users or even prevent them from accessing information on the page's structure.
An example of where joomla _fails_ on this is the new user form as this line
` <h2><?php echo $this->form->getValue('name'); ?></h2>
`
results in
` <h2></h2>`
There are many ways to _resolve_ this
- 1. Check if its a new or existing record and only display the heading if its existing
```
<?php if ($this->item->id != 0): ?>
<h2><?php echo $this->form->getValue('name'); ?></h2>
<?php endif; ?>
```
- 2. Display different text as the heading if its a new or existing record
```
<?php if ($this->item->id != 0): ?>
<h2><?php echo Text::_('COM_USERS_VIEW_NEW_USER_TITLE'); ?></h2>
<?php else: ?>
<h2><?php echo $this->form->getValue('name'); ?></h2>
<?php endif; ?>
```
3. and probably a lot lot more
|
1.0
|
[4.0] RFC Empty headings [a11y] - https://dequeuniversity.com/rules/axe/3.5/empty-heading
> Screen readers alert users to the presence of a heading tag. If the heading is empty or the text cannot be accessed, this could either confuse users or even prevent them from accessing information on the page's structure.
An example of where joomla _fails_ on this is the new user form as this line
` <h2><?php echo $this->form->getValue('name'); ?></h2>
`
results in
` <h2></h2>`
There are many ways to _resolve_ this
- 1. Check if its a new or existing record and only display the heading if its existing
```
<?php if ($this->item->id != 0): ?>
<h2><?php echo $this->form->getValue('name'); ?></h2>
<?php endif; ?>
```
- 2. Display different text as the heading if its a new or existing record
```
<?php if ($this->item->id != 0): ?>
<h2><?php echo Text::_('COM_USERS_VIEW_NEW_USER_TITLE'); ?></h2>
<?php else: ?>
<h2><?php echo $this->form->getValue('name'); ?></h2>
<?php endif; ?>
```
3. and probably a lot lot more
|
non_process
|
rfc empty headings screen readers alert users to the presence of a heading tag if the heading is empty or the text cannot be accessed this could either confuse users or even prevent them from accessing information on the page s structure an example of where joomla fails on this is the new user form as this line form getvalue name results in there are many ways to resolve this check if its a new or existing record and only display the heading if its existing item id form getvalue name display different text as the heading if its a new or existing record item id form getvalue name and probably a lot lot more
| 0
|
136,312
| 19,759,056,448
|
IssuesEvent
|
2022-01-16 04:35:45
|
hibiken/asynq
|
https://api.github.com/repos/hibiken/asynq
|
closed
|
[FEATURE REQUEST] Driver for NATS so we get geo physical Load balancing and Failover
|
enhancement designing
|
**Is your feature request related to a problem? Please describe.**
I need geo physical load balancing and NATS provides that out of the box.
NATS can run, for example, 3 servers in each continent. You typically set up 3 in a cluster in each continent.
the nats.go client will automatically connect to the nearest one and automatically failover and then reconnect when a closer server or cluster comes back up.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
the Redis driver can still be kept but I would like to add a NATS driver also.
NATS server driver that does what the current Redis driver does. This driver will use the nats.go golang client.
**Describe alternatives you've considered**
I think Redis Enterprise has clustering and failover. NATS is free and open and has this clustered LB and failover functionality.
**Additional context**
https://github.com/nats-io/nats.go
https://github.com/nats-io/nats-server
NATS also has accounts and users that are hierarchical, so you get multi-tenant for free.
https://github.com/nats-io/nsc
NATS also has cluster-wide monitoring
https://github.com/nats-io/nats-surveyor
|
1.0
|
[FEATURE REQUEST] Driver for NATS so we get geo physical Load balancing and Failover - **Is your feature request related to a problem? Please describe.**
I need geo physical load balancing and NATS provides that out of the box.
NATS can run, for example, 3 servers in each continent. You typically set up 3 in a cluster in each continent.
the nats.go client will automatically connect to the nearest one and automatically failover and then reconnect when a closer server or cluster comes back up.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
the Redis driver can still be kept but I would like to add a NATS driver also.
NATS server driver that does what the current Redis driver does. This driver will use the nats.go golang client.
**Describe alternatives you've considered**
I think Redis Enterprise has clustering and failover. NATS is free and open and has this clustered LB and failover functionality.
**Additional context**
https://github.com/nats-io/nats.go
https://github.com/nats-io/nats-server
NATS also has accounts and users that are hierarchical, so you get multi-tenant for free.
https://github.com/nats-io/nsc
NATS also has cluster-wide monitoring
https://github.com/nats-io/nats-surveyor
|
non_process
|
driver for nats so we get geo physical load balancing and failover is your feature request related to a problem please describe i need geo physical load balancing and nats provides that out of the box nats can run for example servers in each continent you typically setup in a cluster in each continent the nats go client will automatically connect to the nearest one and automatically failover and then reconnect when a closer server or cluster comes back up describe the solution you d like a clear and concise description of what you want to happen the redis driver can still be kept but i would like to add a nats driver also nats server driver that does what the current redis driver does this driver will use the nats go golang client describe alternatives you ve considered i think redis enterprise has clustering and failover nats is free and open and has this clustered lb and failover functionality additional context nats also has accounts users that are hierarchical and so you get multi tenant for free nats also has cluster wide monitoring
| 0
|
8,221
| 11,410,592,441
|
IssuesEvent
|
2020-02-01 00:03:11
|
parcel-bundler/parcel
|
https://api.github.com/repos/parcel-bundler/parcel
|
closed
|
SyntaxError: Invalid or unexpected token when importing a background image in SCSS
|
:bug: Bug CSS Preprocessing Stale
|
**Choose one:** is this a 🐛 bug report or 🙋 feature request?
Bug
### 🎛 Configuration (.babelrc, package.json, cli command)
```scss
.Background {
background: url('assets/bg.jpg');
}
```
### 🤔 Expected Behavior
For it to work
### 😯 Current Behavior
Fails with a JS error `SyntaxError: Invalid or unexpected token` on the following line at the bottom:
```
},{"./bundle-url":15}],9:[function(require,module,exports) {
var reloadCSS = require('_css_loader');
module.hot.dispose(reloadCSS);
module.hot.accept(reloadCSS);
},{"./assets/bg.jpg":[["4edb26dd740df8d2a3,17:[function(require,module,exports) {
```
The background image itself is displaying fine. It's just crashing all the JS due to this error.
### 💻 Code Sample
See above
### 🌍 Your Environment
<!--- Include as many relevant details about the environment you experienced the bug in -->
| Software | Version(s) |
| ---------------- | ---------- |
| Parcel | 1.5.0
| Node | 8.9.3
| npm/Yarn | Yarn, 1.3.2
| Operating System | OS X
<!-- Love parcel? Please consider supporting our collective:
👉 https://opencollective.com/parcel/donate -->
|
1.0
|
SyntaxError: Invalid or unexpected token when importing a background image in SCSS - **Choose one:** is this a 🐛 bug report or 🙋 feature request?
Bug
### 🎛 Configuration (.babelrc, package.json, cli command)
```scss
.Background {
background: url('assets/bg.jpg');
}
```
### 🤔 Expected Behavior
For it to work
### 😯 Current Behavior
Fails with a JS error `SyntaxError: Invalid or unexpected token` on the following line at the bottom:
```
},{"./bundle-url":15}],9:[function(require,module,exports) {
var reloadCSS = require('_css_loader');
module.hot.dispose(reloadCSS);
module.hot.accept(reloadCSS);
},{"./assets/bg.jpg":[["4edb26dd740df8d2a3,17:[function(require,module,exports) {
```
The background image itself is displaying fine. It's just crashing all the JS due to this error.
### 💻 Code Sample
See above
### 🌍 Your Environment
<!--- Include as many relevant details about the environment you experienced the bug in -->
| Software | Version(s) |
| ---------------- | ---------- |
| Parcel | 1.5.0
| Node | 8.9.3
| npm/Yarn | Yarn, 1.3.2
| Operating System | OS X
<!-- Love parcel? Please consider supporting our collective:
👉 https://opencollective.com/parcel/donate -->
|
process
|
syntaxerror invalid or unexpected token when importing a background image in scss choose one is this a 🐛 bug report or 🙋 feature request bug 🎛 configuration babelrc package json cli command scss background background url assets bg jpg 🤔 expected behavior for it to work 😯 current behavior fails with a js error syntaxerror invalid or unexpected token on the following line at the bottom bundle url function require module exports var reloadcss require css loader module hot dispose reloadcss module hot accept reloadcss assets bg jpg function require module exports the background image itself is displaying fine it s just crashing all the js due to this error 💻 code sample see above 🌍 your environment software version s parcel node npm yarn yarn operating system os x love parcel please consider supporting our collective 👉
| 1
|
21,772
| 30,288,249,299
|
IssuesEvent
|
2023-07-09 00:24:25
|
ethereum/EIPs
|
https://api.github.com/repos/ethereum/EIPs
|
closed
|
One-time NFT Sale of unused EIP numbers.
|
w-stale enhancement r-process
|
# Problem
1. We could use funding to pay for full time editors.
2. Users want to be able to choose their EIP numbers.
3. We want to maintain credible neutrality and not give number preference to "friends of editors".
# Solution:
We do a one-time auction of all unused EIP numbers between 1 and 5000 in the form of NFTs. When a user wants to create an EIP, they can either get a number assigned to their EIP (following the current or a new selection strategy), or they can burn one of the NFTs just before their draft is merged and we'll let them have that number.
The proceeds would go toward funding EIP editors (specifically, funding to hire someone to replace me). Additional things we could spend money on should the proceeds be particularly high:
* Full time staff to work on process automation.
* Fund development/enhancement/maintenance of the executable and consensus specs repositories.
* White glove service assisting people with EIP authoring.
* Authoring and maintaining of external resources related to upcoming Ethereum standards.
|
1.0
|
One-time NFT Sale of unused EIP numbers. - # Problem
1. We could use funding to pay for full time editors.
2. Users want to be able to choose their EIP numbers.
3. We want to maintain credible neutrality and not give number preference to "friends of editors".
# Solution:
We do a one-time auction of all unused EIP numbers between 1 and 5000 in the form of NFTs. When a user wants to create an EIP, they can either get a number assigned to their EIP (following the current or a new selection strategy), or they can burn one of the NFTs just before their draft is merged and we'll let them have that number.
The proceeds would go toward funding EIP editors (specifically, funding to hire someone to replace me). Additional things we could spend money on should the proceeds be particularly high:
* Full time staff to work on process automation.
* Fund development/enhancement/maintenance of the executable and consensus specs repositories.
* White glove service assisting people with EIP authoring.
* Authoring and maintaining of external resources related to upcoming Ethereum standards.
|
process
|
one time nft sale of unused eip numbers problem we could use funding to pay for full time editors users want to be able to choose their eip numbers we want to maintain credible neutrality and not give number preference to friends of editors solution we do a one time auction of all unused eip numbers between and in the form of nfts when a user wants to create an eip they can either get a number assigned to their eip following current or new selection strategy or they can burn one of the nfts just before their draft is merged and we ll let them have that number the proceeds would go toward funding eip editors specifically funding to hire someone to replace me additional things we could spend money on should the proceeds be particularly high full time staff to work on process automation fund development enhancement maintenance of the executable and consensus specs repositories white glove service assisting people with eip authoring authoring and maintaining of external resources related to upcoming ethereum standards
| 1
|
20,880
| 27,697,732,601
|
IssuesEvent
|
2023-03-14 04:35:13
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Creating a new file in "$env:TEMP" using New-Item doesn't work in PowerShell 5.1 only works in PowerShell 7.x while this page asks users to use $env:TEMP as temporary folder for the sandbox.
|
automation/svc process-automation/subsvc Pri2
|
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: bf4da869-3f32-d2bb-d804-62679c1c3cbd
* Version Independent ID: 49ec3073-a31e-a0ab-882d-17d3a7a420b0
* Content: [Runbook execution in Azure Automation](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution#temporary-storage-in-a-sandbox)
* Content Source: [articles/automation/automation-runbook-execution.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-runbook-execution.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
|
1.0
|
Creating a new file in "$env:TEMP" using New-Item doesn't work in PowerShell 5.1 only works in PowerShell 7.x while this page asks users to use $env:TEMP as temporary folder for the sandbox. -
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: bf4da869-3f32-d2bb-d804-62679c1c3cbd
* Version Independent ID: 49ec3073-a31e-a0ab-882d-17d3a7a420b0
* Content: [Runbook execution in Azure Automation](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution#temporary-storage-in-a-sandbox)
* Content Source: [articles/automation/automation-runbook-execution.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-runbook-execution.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
|
process
|
creating a new file in env temp using new item doesn t work in powershell only works in powershell x while this page asks users to use env temp as temporary folder for the sandbox document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login snehasudhirg microsoft alias sudhirsneha
| 1
|
225,790
| 17,908,965,216
|
IssuesEvent
|
2021-09-09 00:41:54
|
SWRPG-Capstone/swrpg-fe
|
https://api.github.com/repos/SWRPG-Capstone/swrpg-fe
|
closed
|
Test Character Creation Form
|
Front End Testing
|
Test user flows for the Character Creation Form
Intercept/Stub network requests and responses
|
1.0
|
Test Character Creation Form - Test user flows for the Character Creation Form
Intercept/Stub network requests and responses
|
non_process
|
test character creation form test user flows for the character creation form intercept stub network requests and responses
| 0
|
13,405
| 8,442,874,247
|
IssuesEvent
|
2018-10-18 14:18:00
|
Yakindu/statecharts
|
https://api.github.com/repos/Yakindu/statecharts
|
opened
|
state content is not completely visible
|
Comp-Graphical Editor is-Usability-Issue
|
follow https://github.com/Yakindu/statecharts/blob/master/manual-tests/org.yakindu.sct.test.manual/testcases/sct_testcase_02_editor.textile until 2.8.

double clicking the state content yields

which is annoying. The content is not completely visible.
|
True
|
state content is not completely visible -
follow https://github.com/Yakindu/statecharts/blob/master/manual-tests/org.yakindu.sct.test.manual/testcases/sct_testcase_02_editor.textile until 2.8.

double clicking the state content yields

which is annoying. The content is not completely visible.
|
non_process
|
state content is not completely visible follow until double clicking the state content yields which is annoying the content is not completely visible
| 0
|
8,056
| 11,221,633,862
|
IssuesEvent
|
2020-01-07 18:17:52
|
cncf/sig-security
|
https://api.github.com/repos/cncf/sig-security
|
closed
|
Is the assessment matrix duplicative with the issue process for assessments?
|
assessment-process
|
It feels like the assessment matrix: https://github.com/cncf/sig-security/blob/8f02ab936dd1ec05405ca995221b601c5a7382ca/assessments/assessment-matrix.md
is largely duplicative with the tagged items in the issue tracker: https://github.com/cncf/sig-security/labels/assessment
Do we need both? If so, could we generate from the other (likely issue -> matrix)?
|
1.0
|
Is the assessment matrix duplicative with the issue process for assessments? - It feels like the assessment matrix: https://github.com/cncf/sig-security/blob/8f02ab936dd1ec05405ca995221b601c5a7382ca/assessments/assessment-matrix.md
is largely duplicative with the tagged items in the issue tracker: https://github.com/cncf/sig-security/labels/assessment
Do we need both? If so, could we generate from the other (likely issue -> matrix)?
|
process
|
is the assessment matrix duplicative with the issue process for assessments it feels like the assessment matrix is largely duplicative with the tagged items in the issue tracker do we need both if so could we generate from the other likely issue matrix
| 1
|
329,451
| 28,244,014,182
|
IssuesEvent
|
2023-04-06 09:21:27
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
Failing test: Chrome X-Pack UI Functional Tests with ES SSL - Discover, Uptime, ML.x-pack/test/functional_with_es_ssl/apps/discover_ml_uptime/discover/search_source_alert·ts - Discover alerting Search source Alert should display actual state after rule params update on clicking viewInApp link
|
blocker loe:hours failed-test skipped-test impact:high Team:DataDiscovery v8.8.0
|
A test failed on a tracked branch
```
Error: expected '' to equal 'message:msg-1'
at Assertion.assert (expect.js:100:11)
at Assertion.apply (expect.js:227:8)
at Function.equal (expect.js:531:15)
at checkUpdatedRuleParamsState (search_source_alert.ts:279:31)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Context.<anonymous> (search_source_alert.ts:431:7)
at Object.apply (wrap_function.js:73:16)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/27719#01869ddc-5c14-4cae-a8db-22be150feeb6)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests with ES SSL - Discover, Uptime, ML.x-pack/test/functional_with_es_ssl/apps/discover_ml_uptime/discover/search_source_alert·ts","test.name":"Discover alerting Search source Alert should display actual state after rule params update on clicking viewInApp link","test.failCount":4}} -->
|
2.0
|
Failing test: Chrome X-Pack UI Functional Tests with ES SSL - Discover, Uptime, ML.x-pack/test/functional_with_es_ssl/apps/discover_ml_uptime/discover/search_source_alert·ts - Discover alerting Search source Alert should display actual state after rule params update on clicking viewInApp link - A test failed on a tracked branch
```
Error: expected '' to equal 'message:msg-1'
at Assertion.assert (expect.js:100:11)
at Assertion.apply (expect.js:227:8)
at Function.equal (expect.js:531:15)
at checkUpdatedRuleParamsState (search_source_alert.ts:279:31)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Context.<anonymous> (search_source_alert.ts:431:7)
at Object.apply (wrap_function.js:73:16)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/27719#01869ddc-5c14-4cae-a8db-22be150feeb6)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests with ES SSL - Discover, Uptime, ML.x-pack/test/functional_with_es_ssl/apps/discover_ml_uptime/discover/search_source_alert·ts","test.name":"Discover alerting Search source Alert should display actual state after rule params update on clicking viewInApp link","test.failCount":4}} -->
|
non_process
|
failing test chrome x pack ui functional tests with es ssl discover uptime ml x pack test functional with es ssl apps discover ml uptime discover search source alert·ts discover alerting search source alert should display actual state after rule params update on clicking viewinapp link a test failed on a tracked branch error expected to equal message msg at assertion assert expect js at assertion apply expect js at function equal expect js at checkupdatedruleparamsstate search source alert ts at runmicrotasks at processticksandrejections node internal process task queues at context search source alert ts at object apply wrap function js first failure
| 0
|
2,662
| 5,435,789,428
|
IssuesEvent
|
2017-03-05 19:55:07
|
symfony/symfony
|
https://api.github.com/repos/symfony/symfony
|
closed
|
The process has been signaled with signal "11"
|
Bug Process Status: Needs Review Status: Waiting feedback Unconfirmed
|
| Q | A
| ---------------- | -----
| Bug report? | yes
| Feature request? | no
| BC Break report? | no
| RFC? | no
| Symfony version | 2.8.14~2.8.15
| PHP Version | 7.1.0 by remi
| OS System | Fedora 25 X64
It occurred when I was updating from 2.8.14 to 2.8.15 with Composer.

Anyone have ideas?
|
1.0
|
The process has been signaled with signal "11" - | Q | A
| ---------------- | -----
| Bug report? | yes
| Feature request? | no
| BC Break report? | no
| RFC? | no
| Symfony version | 2.8.14~2.8.15
| PHP Version | 7.1.0 by remi
| OS System | Fedora 25 X64
It occurred when I was updating from 2.8.14 to 2.8.15 with Composer.

Anyone have ideas?
|
process
|
the process has been signaled with signal q a bug report yes feature request no bc break report no rfc no symfony version php version by remi os system fedora it occurred when i was updating to with composer anyone have ideas
| 1
|
108,788
| 9,332,277,089
|
IssuesEvent
|
2019-03-28 11:48:03
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
verified sites not showing correct favicon in panel - follow up to 3525
|
QA/Test-Plan-Specified QA/Yes feature/rewards priority/P1
|
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
Found while testing https://github.com/brave/brave-browser/issues/3525
Verified publisher sites (non YT or Twitch) are not displaying favicon in the panel correctly.
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean profile.
2. Enable Rewards
3. Visit a verified site (brave.com, basicattentiontoken.org, ddg, etc)
4. Wait a few seconds, open panel.
## Actual result:
Favicon not shown. If you wait some time, after opening/closing panel the favicon will probably be shown.

## Expected result:
Favicon should be shown.

## Reproduces how often:
easily
## Brave version (brave://version info)
Brave | 0.63.14 Chromium: 73.0.3683.75 (Official Build) dev(64-bit)
-- | --
Revision | 909ee014fcea6828f9a610e6716145bc0b3ebf4a-refs/branch-heads/3683@{#803}
OS | Mac OS X
### Reproducible on current release: no
- Does it reproduce on brave-browser dev/beta builds? Dev yes, beta no.
### Website problems only:
- Does the issue resolve itself when disabling Brave Shields? n/a
- Is the issue reproducible on the latest version of Chrome? n/a
### Additional Information
If you run thru STR with a non-verified site (google, nyt) or a verified youtube or twitch pub, the favicon displays as expected.
Reproduced by @GeetaSarvadnya @kjozwiak on Win and Linux respectively
|
1.0
|
verified sites not showing correct favicon in panel - follow up to 3525 - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
Found while testing https://github.com/brave/brave-browser/issues/3525
Verified publisher sites (non YT or Twitch) are not displaying favicon in the panel correctly.
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean profile.
2. Enable Rewards
3. Visit a verified site (brave.com, basicattentiontoken.org, ddg, etc)
4. Wait a few seconds, open panel.
## Actual result:
Favicon not shown. If you wait some time, after opening/closing panel the favicon will probably be shown.

## Expected result:
Favicon should be shown.

## Reproduces how often:
easily
## Brave version (brave://version info)
Brave | 0.63.14 Chromium: 73.0.3683.75 (Official Build) dev(64-bit)
-- | --
Revision | 909ee014fcea6828f9a610e6716145bc0b3ebf4a-refs/branch-heads/3683@{#803}
OS | Mac OS X
### Reproducible on current release: no
- Does it reproduce on brave-browser dev/beta builds? Dev yes, beta no.
### Website problems only:
- Does the issue resolve itself when disabling Brave Shields? n/a
- Is the issue reproducible on the latest version of Chrome? n/a
### Additional Information
If you run thru STR with a non-verified site (google, nyt) or a verified youtube or twitch pub, the favicon displays as expected.
Reproduced by @GeetaSarvadnya @kjozwiak on Win and Linux respectively
|
non_process
|
verified sites not showing correct favicon in panel follow up to have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description found while testing verified publisher sites non yt or twitch are not displaying favicon in the panel correctly steps to reproduce clean profile enable rewards visit a verified site brave com basicattentiontoken org ddg etc wait a few seconds open panel actual result favicon not shown if you wait some time after opening closing panel the favicon will probably be shown expected result favicon should be shown reproduces how often easily brave version brave version info brave chromium official build dev bit revision refs branch heads os mac os x reproducible on current release no does it reproduce on brave browser dev beta builds dev yes beta no website problems only does the issue resolve itself when disabling brave shields n a is the issue reproducible on the latest version of chrome n a additional information if you run thru str with a non verified site google nyt or a verified youtube or twitch pub the favicon displays as expected reproduced by geetasarvadnya kjozwiak on win and linux respectively
| 0
|
411,317
| 27,818,498,562
|
IssuesEvent
|
2023-03-19 00:06:39
|
ICEI-PUC-Minas-PMV-ADS/Pomodoro_List
|
https://api.github.com/repos/ICEI-PUC-Minas-PMV-ADS/Pomodoro_List
|
opened
|
Planos de teste
|
documentation
|
Elaborar os documentos que se referem ao detalhamento dos planos de teste parciais de usabilidade e funcionalidades do sistema, além dos registros de testes.
Os testes deverão incluir casos de testes, para verificar se a aplicação desenvolvida está consistente com a especificação do mesmo.
|
1.0
|
Planos de teste - Elaborar os documentos que se referem ao detalhamento dos planos de teste parciais de usabilidade e funcionalidades do sistema, além dos registros de testes.
Os testes deverão incluir casos de testes, para verificar se a aplicação desenvolvida está consistente com a especificação do mesmo.
|
non_process
|
planos de teste elaborar os documentos que se referem ao detalhamento dos planos de teste parciais de usabilidade e funcionalidades do sistema além dos registros de testes os testes deverão incluir casos de testes para verificar se a aplicação desenvolvida está consistente com a especificação do mesmo
| 0
|
9,302
| 12,311,924,691
|
IssuesEvent
|
2020-05-12 13:12:22
|
googleapis/python-storage
|
https://api.github.com/repos/googleapis/python-storage
|
opened
|
lint check fails on master
|
api: storage type: process
|
A recent change of flake8 or its config causes the code style check to fail on the master branch - it complains about a `string.format(...)` function used in _signing.py file:
`google/cloud/storage/_signing.py:156:13: F523 '...'.format(...) has unused arguments at position(s): 0`
|
1.0
|
lint check fails on master - A recent change of flake8 or its config causes the code style check to fail on the master branch - it complains about a `string.format(...)` function used in _signing.py file:
`google/cloud/storage/_signing.py:156:13: F523 '...'.format(...) has unused arguments at position(s): 0`
|
process
|
lint check fails on master a recent change of or its config causes the code style check to fail on the master branch it complains about a string format function used in signing py file google cloud storage signing py format has unused arguments at position s
| 1
|
101,780
| 31,564,780,501
|
IssuesEvent
|
2023-09-03 17:33:03
|
LibreELEC/LibreELEC.tv
|
https://api.github.com/repos/LibreELEC/LibreELEC.tv
|
closed
|
[Solution Required] python setuptools needs requires pip/pep517-compliant builder
|
BUILDSYSTEM TOOLCHAIN LE 11.0
|
Adding the other known python build issue.
https://pypi.org/project/setuptools > than 52.0.0 need to use requires pip or another pep517-compliant builder
The below package.mk will need to be transitioned to pip or another pep517-compliant builder.
https://github.com/LibreELEC/LibreELEC.tv/blob/f02196c2b9d02aa1ddfe5883767983513519aa2d/packages/python/devel/setuptools/package.mk#L15-L21
The filling PRs have the background #5086 / #4264
|
1.0
|
[Solution Required] python setuptools needs requires pip/pep517-compliant builder - Adding the other known python build issue.
https://pypi.org/project/setuptools > than 52.0.0 need to use requires pip or another pep517-compliant builder
The below package.mk will need to be transitioned to pip or another pep517-compliant builder.
https://github.com/LibreELEC/LibreELEC.tv/blob/f02196c2b9d02aa1ddfe5883767983513519aa2d/packages/python/devel/setuptools/package.mk#L15-L21
The filling PRs have the background #5086 / #4264
|
non_process
|
python setuptools needs requires pip compliant builder adding the other known python build issue than need to use requires pip or another compliant builder the below package mk will need to be transitioned to pip or another compliant builder the filling prs have the background
| 0
|
2,928
| 5,916,903,371
|
IssuesEvent
|
2017-05-22 11:51:51
|
openvstorage/tgt
|
https://api.github.com/repos/openvstorage/tgt
|
closed
|
Make IP:port in tgt.service file configurable
|
process_wontfix
|
_From @kvanhijf on May 11, 2017 15:54_
According to @cnanakos and my personal experience, i couldn't find a way to have the tgtd listen on a specific IP and/or port through a configuration file.
I did manage to find a way to achieve this by making changes to the tgt.service file, like so:
`ExecStart=/usr/sbin/tgtd -f --iscsi portal=1.2.3.4:3261`
Now since we repackage the tgt package into something of our own, i was wondering whether we could make the service file into a systemd template service file, for the framework to insert the wanted values (IP:port)
_Copied from original issue: openvstorage/volumedriver#317_
|
1.0
|
Make IP:port in tgt.service file configurable - _From @kvanhijf on May 11, 2017 15:54_
According to @cnanakos and my personal experience, i couldn't find a way to have the tgtd listen on a specific IP and/or port through a configuration file.
I did manage to find a way to achieve this by making changes to the tgt.service file, like so:
`ExecStart=/usr/sbin/tgtd -f --iscsi portal=1.2.3.4:3261`
Now since we repackage the tgt package into something of our own, i was wondering whether we could make the service file into a systemd template service file, for the framework to insert the wanted values (IP:port)
_Copied from original issue: openvstorage/volumedriver#317_
|
process
|
make ip port in tgt service file configurable from kvanhijf on may according to cnanakos and my personal experience i couldn t find a way to have the tgtd listen on a specific ip and or port through a configuration file i did manage to find a way to achieve this by making changes to the tgt service file like so execstart usr sbin tgtd f iscsi portal now since we repackage the tgt package into something of our own i was wondering whether we could make the service file into a systemd template service file for the framework to insert the wanted values ip port copied from original issue openvstorage volumedriver
| 1
|
741,974
| 25,830,517,243
|
IssuesEvent
|
2022-12-12 15:47:49
|
Unity-Technologies/com.unity.netcode.gameobjects
|
https://api.github.com/repos/Unity-Technologies/com.unity.netcode.gameobjects
|
closed
|
Stale NetworkList event sent to client when its containing object is re-shown.
|
type:bug priority:medium stat:imported
|
### Description
I have a network object observed by a client containing a network list of structs. On direction by the client a timer is started on the server to remove the entries from the list from the bottom at half a second intervals. At any point the client can choose to deselect the object and the server hides the object from the client and continues to remove entries from the list until all are removed, all is good at this point.
The problem is, the next time the object is shown to the client a pending network list event message is also sent to the client to remove an entry from the network list and because the list is empty an exception is thrown.
### Reproduce Steps
1. Spawn object on client and populate network list
2. Start timer to remove list entries one at a time
3. Server hides object from client
4. All entries in list removed, timer stops
5. Server shows object to client
6. NetworkList exception message logged on client
### Actual Outcome
"Exception: Shouldn't be here, index is higher than list length"
### Expected Outcome
No network list event message is sent to client.
### Environment
- OS: macOS Big Sur
- Unity Version: [e.g. 2020.3]
- Netcode Version: 1.1.0
### Additional Context
This happens consistently in my main project, I wasn't able to reproduce it in a test project but it was only a simple test.
Both list.RemoveAt and list.Remove are producing the same error:
`Exception: Shouldn't be here, index is higher than list length
Unity.Netcode.NetworkList1[T].ReadDelta (Unity.Netcode.FastBufferReader reader, System.Boolean keepDirtyDelta) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/NetworkVariable/Collections/NetworkList.cs:365)
Unity.Netcode.NetworkVariableDeltaMessage.Handle (Unity.Netcode.NetworkContext& context) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/Messages/NetworkVariableDeltaMessage.cs:200)
Unity.Netcode.MessagingSystem.ReceiveMessage[T] (Unity.Netcode.FastBufferReader reader, Unity.Netcode.NetworkContext& context, Unity.Netcode.MessagingSystem system) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:457)
Unity.Netcode.MessagingSystem.HandleMessage (Unity.Netcode.MessageHeader& header, Unity.Netcode.FastBufferReader reader, System.UInt64 senderId, System.Single timestamp, System.Int32 serializedHeaderSize) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:387)
UnityEngine.Debug:LogException(Exception)
Unity.Netcode.MessagingSystem:HandleMessage(MessageHeader&, FastBufferReader, UInt64, Single, Int32) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:391)
Unity.Netcode.MessagingSystem:ProcessIncomingMessageQueue() (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:407)
Unity.Netcode.NetworkManager:OnNetworkEarlyUpdate() (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkManager.cs:1581)
Unity.Netcode.NetworkManager:NetworkUpdate(NetworkUpdateStage) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkManager.cs:1513)
Unity.Netcode.NetworkUpdateLoop:RunNetworkUpdateStage(NetworkUpdateStage) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkUpdateLoop.cs:185)
Unity.Netcode.<>c:<CreateLoopSystem>b__0_0() (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkUpdateLoop.cs:208)
`
|
1.0
|
Stale NetworkList event sent to client when its containing object is re-shown. - ### Description
I have a network object observed by a client containing a network list of structs. On direction by the client a timer is started on the server to remove the entries from the list from the bottom at half a second intervals. At any point the client can choose to deselect the object and the server hides the object from the client and continues to remove entries from the list until all are removed, all is good at this point.
The problem is, the next time the object is shown to the client a pending network list event message is also sent to the client to remove an entry from the network list and because the list is empty an exception is thrown.
### Reproduce Steps
1. Spawn object on client and populate network list
2. Start timer to remove list entries one at a time
3. Server hides object from client
4. All entries in list removed, timer stops
5. Server shows object to client
6. NetworkList exception message logged on client
### Actual Outcome
"Exception: Shouldn't be here, index is higher than list length"
### Expected Outcome
No network list event message is sent to client.
### Environment
- OS: macOS Big Sur
- Unity Version: [e.g. 2020.3]
- Netcode Version: 1.1.0
### Additional Context
This happens consistently in my main project, I wasn't able to reproduce it in a test project but it was only a simple test.
Both list.RemoveAt and list.Remove are producing the same error:
`Exception: Shouldn't be here, index is higher than list length
Unity.Netcode.NetworkList1[T].ReadDelta (Unity.Netcode.FastBufferReader reader, System.Boolean keepDirtyDelta) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/NetworkVariable/Collections/NetworkList.cs:365)
Unity.Netcode.NetworkVariableDeltaMessage.Handle (Unity.Netcode.NetworkContext& context) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/Messages/NetworkVariableDeltaMessage.cs:200)
Unity.Netcode.MessagingSystem.ReceiveMessage[T] (Unity.Netcode.FastBufferReader reader, Unity.Netcode.NetworkContext& context, Unity.Netcode.MessagingSystem system) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:457)
Unity.Netcode.MessagingSystem.HandleMessage (Unity.Netcode.MessageHeader& header, Unity.Netcode.FastBufferReader reader, System.UInt64 senderId, System.Single timestamp, System.Int32 serializedHeaderSize) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:387)
UnityEngine.Debug:LogException(Exception)
Unity.Netcode.MessagingSystem:HandleMessage(MessageHeader&, FastBufferReader, UInt64, Single, Int32) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:391)
Unity.Netcode.MessagingSystem:ProcessIncomingMessageQueue() (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Messaging/MessagingSystem.cs:407)
Unity.Netcode.NetworkManager:OnNetworkEarlyUpdate() (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkManager.cs:1581)
Unity.Netcode.NetworkManager:NetworkUpdate(NetworkUpdateStage) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkManager.cs:1513)
Unity.Netcode.NetworkUpdateLoop:RunNetworkUpdateStage(NetworkUpdateStage) (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkUpdateLoop.cs:185)
Unity.Netcode.<>c:<CreateLoopSystem>b__0_0() (at Library/PackageCache/com.unity.netcode.gameobjects@1.1.0/Runtime/Core/NetworkUpdateLoop.cs:208)
`
|
non_process
|
stale networklist event sent to client when its containing object is re shown description i have a network object observed by a client containing a network list of structs on direction by the client a timer is started on the server to remove the entries from the list from the bottom at half a second intervals at any point the client can choose to deselect the object and the server hides the object from the client and continues to remove entries from the list until all are removed all is good at this point the problem is the next time the object is shown to the client a pending network list event message is also sent to the client to remove an entry from the network list and because the list is empty an exception is thrown reproduce steps spawn object on client and populate network list start timer to remove list entries one at a time server hides object from client all entries in list removed timer stops server shows object to client networklist exception message logged on client actual outcome exception shouldn t be here index is higher than list length expected outcome no network list event message is sent to client environment os macos big sur unity version netcode version additional context this happens consistently in my main project i wasn t able to reproduce it in a test project but it was only a simple test both list removeat and list remove are producing the same error exception shouldn t be here index is higher than list length unity netcode readdelta unity netcode fastbufferreader reader system boolean keepdirtydelta at library packagecache com unity netcode gameobjects runtime networkvariable collections networklist cs unity netcode networkvariabledeltamessage handle unity netcode networkcontext context at library packagecache com unity netcode gameobjects runtime messaging messages networkvariabledeltamessage cs unity netcode messagingsystem receivemessage unity netcode fastbufferreader reader unity netcode networkcontext context unity netcode messagingsystem system at library packagecache com unity netcode gameobjects runtime messaging messagingsystem cs unity netcode messagingsystem handlemessage unity netcode messageheader header unity netcode fastbufferreader reader system senderid system single timestamp system serializedheadersize at library packagecache com unity netcode gameobjects runtime messaging messagingsystem cs unityengine debug logexception exception unity netcode messagingsystem handlemessage messageheader fastbufferreader single at library packagecache com unity netcode gameobjects runtime messaging messagingsystem cs unity netcode messagingsystem processincomingmessagequeue at library packagecache com unity netcode gameobjects runtime messaging messagingsystem cs unity netcode networkmanager onnetworkearlyupdate at library packagecache com unity netcode gameobjects runtime core networkmanager cs unity netcode networkmanager networkupdate networkupdatestage at library packagecache com unity netcode gameobjects runtime core networkmanager cs unity netcode networkupdateloop runnetworkupdatestage networkupdatestage at library packagecache com unity netcode gameobjects runtime core networkupdateloop cs unity netcode c b at library packagecache com unity netcode gameobjects runtime core networkupdateloop cs
| 0
|
16,683
| 10,553,847,318
|
IssuesEvent
|
2019-10-03 18:08:06
|
microsoft/BotFramework-WebChat
|
https://api.github.com/repos/microsoft/BotFramework-WebChat
|
closed
|
WebChat Channel Stopped working suddenly for all our bots from yesterday.
|
Bot Services Pending Question customer-reported
|
We have WebChat channel in azure for our bots, starting yesterday afternoon. The webChats stopped responding. Everything was working fine but suddenly stooped working.
<!-- ATTENTION: Bot Framework internals, please remove the `customer-reported` and `Bot Services` labels before submitting this issue. -->
<!-- [GitHub issues](https://github.com/microsoft/botframework-webchat/issues) should be used for bugs and feature requests. See the Support section to get support related to Bot Framework and Web Chat. -->
## Screenshots
<!-- If applicable, add screenshots to help explain your problem. -->
<!-- Be sure to remove or obscure personally identifiable information from your code and screenshots -->
<img width="892" alt="Screen Shot 2019-10-02 at 9 43 37 AM" src="https://user-images.githubusercontent.com/3451844/66054864-3f82bb80-e4fa-11e9-923d-edeb7c00c487.png">
<img width="847" alt="Screen Shot 2019-10-02 at 9 43 53 AM" src="https://user-images.githubusercontent.com/3451844/66054888-47426000-e4fa-11e9-808f-fcb6007d1df7.png">
<img width="837" alt="Screen Shot 2019-10-02 at 9 44 16 AM" src="https://user-images.githubusercontent.com/3451844/66054897-4b6e7d80-e4fa-11e9-9fe7-d8af27c1cfb9.png">
## Version
<!-- What version of Web Chat are you using? Are you using the CDN? NPM package? Or embedding Web Chat to your site via `<iframe>`? -->
<!-- The fastest way to find your Web Chat version is by checking the meta tag on your deployed site. -->
1. Using via https://webchat.botframework.com/embed/ginabeta?s=Secret Key not working.
2. As a React Component = Not working.
All these were working prior.
## Describe the bug
<!-- Give a clear and concise description of what the bug is. -->
<!-- Please be sure to add screenshots of the console errors in your browser, if there are any -->
We have WebChat channel in azure for our bots, starting yesterday afternoon. The webChats stopped responding. Everything was working fine but suddenly stooped working.
I can see the message on Web App logs, but they are getting displayed back to web chat.
## To Reproduce
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
https://webchat.botframework.com/embed/ginabeta?s=YOUR_SECRET_HERE
or React component and try to replicate the Bot secret key on the web chat.
## Expected behavior
<!-- Give a clear and concise description of what you expect to happen when following the reproduction steps above. -->
Behaviour before it stopped working

## Additional context
<!-- Add any other context about the problem here.-->
[Bug]
|
1.0
|
WebChat Channel Stopped working suddenly for all our bots from yesterday. - We have WebChat channel in azure for our bots, starting yesterday afternoon. The webChats stopped responding. Everything was working fine but suddenly stooped working.
<!-- ATTENTION: Bot Framework internals, please remove the `customer-reported` and `Bot Services` labels before submitting this issue. -->
<!-- [GitHub issues](https://github.com/microsoft/botframework-webchat/issues) should be used for bugs and feature requests. See the Support section to get support related to Bot Framework and Web Chat. -->
## Screenshots
<!-- If applicable, add screenshots to help explain your problem. -->
<!-- Be sure to remove or obscure personally identifiable information from your code and screenshots -->
<img width="892" alt="Screen Shot 2019-10-02 at 9 43 37 AM" src="https://user-images.githubusercontent.com/3451844/66054864-3f82bb80-e4fa-11e9-923d-edeb7c00c487.png">
<img width="847" alt="Screen Shot 2019-10-02 at 9 43 53 AM" src="https://user-images.githubusercontent.com/3451844/66054888-47426000-e4fa-11e9-808f-fcb6007d1df7.png">
<img width="837" alt="Screen Shot 2019-10-02 at 9 44 16 AM" src="https://user-images.githubusercontent.com/3451844/66054897-4b6e7d80-e4fa-11e9-9fe7-d8af27c1cfb9.png">
## Version
<!-- What version of Web Chat are you using? Are you using the CDN? NPM package? Or embedding Web Chat to your site via `<iframe>`? -->
<!-- The fastest way to find your Web Chat version is by checking the meta tag on your deployed site. -->
1. Using via https://webchat.botframework.com/embed/ginabeta?s=Secret Key not working.
2. As a React Component = Not working.
All these were working prior.
## Describe the bug
<!-- Give a clear and concise description of what the bug is. -->
<!-- Please be sure to add screenshots of the console errors in your browser, if there are any -->
We have WebChat channel in azure for our bots, starting yesterday afternoon. The webChats stopped responding. Everything was working fine but suddenly stooped working.
I can see the message on Web App logs, but they are getting displayed back to web chat.
## To Reproduce
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
https://webchat.botframework.com/embed/ginabeta?s=YOUR_SECRET_HERE
or React component and try to replicate the Bot secret key on the web chat.
## Expected behavior
<!-- Give a clear and concise description of what you expect to happen when following the reproduction steps above. -->
Behaviour before it stopped working

## Additional context
<!-- Add any other context about the problem here.-->
[Bug]
|
non_process
|
webchat channel stopped working suddenly for all our bots from yesterday we have webchat channel in azure for our bots starting yesterday afternoon the webchats stopped responding everything was working fine but suddenly stooped working screenshots img width alt screen shot at am src img width alt screen shot at am src img width alt screen shot at am src version using via key not working as a react component not working all these were working prior describe the bug we have webchat channel in azure for our bots starting yesterday afternoon the webchats stopped responding everything was working fine but suddenly stooped working i can see the message on web app logs but they are getting displayed back to web chat to reproduce steps to reproduce the behavior go to click on scroll down to see error or react component and try to replicate the bot secret key on the web chat expected behavior behaviour before it stopped working additional context
| 0
|
18,541
| 24,554,897,860
|
IssuesEvent
|
2022-10-12 15:09:21
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Mobile apps] Enforce e-consent flow again for enrolled participants is not getting displayed in the mobile apps
|
Bug Blocker P0 iOS Android Process: Fixed Process: Tested dev
|
**Steps:**
1. Sign out from the mobile app
2. Go back to SB and Update Enforce e-consent flow again for enrolled participants and Publish the study
3. Now in the mobile apps, sign in and Verify
**AR:** Enforce e-consent flow again for enrolled participants is not getting displayed in the mobile apps
**ER:** Enforce e-consent flow again for enrolled participants should get displayed in the mobile apps
**Note:** Issue needs to be fixed for both standalone and gateway app
|
2.0
|
[Mobile apps] Enforce e-consent flow again for enrolled participants is not getting displayed in the mobile apps - **Steps:**
1. Sign out from the mobile app
2. Go back to SB and Update Enforce e-consent flow again for enrolled participants and Publish the study
3. Now in the mobile apps, sign in and Verify
**AR:** Enforce e-consent flow again for enrolled participants is not getting displayed in the mobile apps
**ER:** Enforce e-consent flow again for enrolled participants should get displayed in the mobile apps
**Note:** Issue needs to be fixed for both standalone and gateway app
|
process
|
enforce e consent flow again for enrolled participants is not getting displayed in the mobile apps steps sign out from the mobile app go back to sb and update enforce e consent flow again for enrolled participants and publish the study now in the mobile apps sign in and verify ar enforce e consent flow again for enrolled participants is not getting displayed in the mobile apps er enforce e consent flow again for enrolled participants should get displayed in the mobile apps note issue needs to be fixed for both standalone and gateway app
| 1
|
2,206
| 5,047,751,797
|
IssuesEvent
|
2016-12-20 10:27:40
|
NIT-dgp/UIP
|
https://api.github.com/repos/NIT-dgp/UIP
|
closed
|
Add tests for all functions and modules
|
difficulty/medium importance/high process/wip
|
Right now we don't test anything it'd be nice if we add tests to our code asap
|
1.0
|
Add tests for all functions and modules - Right now we don't test anything it'd be nice if we add tests to our code asap
|
process
|
add tests for all functions and modules right now we don t test anything it d be nice if we add tests to our code asap
| 1
|
1,898
| 4,726,527,134
|
IssuesEvent
|
2016-10-18 10:30:24
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
`process.stdin` never returns with using Git Bash for Windows
|
child_process process windows
|
* **Version**: v4.4.0 and v5.7.1
* **Platform**: Windows 64bit
From https://github.com/mysticatea/npm-run-all/issues/24
`process.stdin` getter stops a program in the following case:
- On Git Bash of Git for Windows (MINGW64)
- Executing cmd.exe (e.g. `child_process.exec`)
- Inheriting stdin.
This is a repro steps:
```
$ cmd /C="node test.js"
```
```js
"use strict";
const name = process.argv[2] || "parent";
console.log("before stdin:", name, process.binding('tty_wrap').guessHandleType(0));
process.stdin;
console.log("after stdin:", name);
if (process.argv[2] !== "child") {
require("child_process").spawn(
process.argv[0],
[process.argv[1], "child"],
{stdio: "inherit"}
);
}
```
The results of above:
**cmd.exe** OK
```
C:\Users\t-nagashima.AD\Documents\GitHub\WebDevolopment\Pizzeria-ES2015>cmd /C=node test.js
before stdin: parent TTY
after stdin: parent
before stdin: child TTY
after stdin: child
```
**PowerShell** OK
```
PS C:\Users\t-nagashima.AD\Documents\GitHub\WebDevolopment\Pizzeria-ES2015> cmd /C="node test.js"
before stdin: parent TTY
after stdin: parent
before stdin: child TTY
after stdin: child
```
**Git Shell of GitHub for Windows** OK
```
C:\Users\t-nagashima.AD\Documents\GitHub\WebDevolopment\Pizzeria-ES2015 [master +1 ~1 -0 !]> cmd /C="node test.js"
before stdin: parent TTY
after stdin: parent
before stdin: child TTY
after stdin: child
```
**Git Bash of Git for Windows** Wrong
```
t-nagashima@T-NAGASHIMA4 MINGW64 /c/Users/t-nagashima.AD/Documents/GitHub/WebDevolopment/Pizzeria-ES2015 (master)
$ cmd /C="node test.js"
before stdin: parent PIPE
after stdin: parent
before stdin: child PIPE
```
The program stops at `process.stdin`, then `"after stdin: child"` has never printed.
If I types Enter Key, the program resumes.
|
2.0
|
`process.stdin` never returns with using Git Bash for Windows - * **Version**: v4.4.0 and v5.7.1
* **Platform**: Windows 64bit
From https://github.com/mysticatea/npm-run-all/issues/24
`process.stdin` getter stops a program in the following case:
- On Git Bash of Git for Windows (MINGW64)
- Executing cmd.exe (e.g. `child_process.exec`)
- Inheriting stdin.
This is a repro steps:
```
$ cmd /C="node test.js"
```
```js
"use strict";
const name = process.argv[2] || "parent";
console.log("before stdin:", name, process.binding('tty_wrap').guessHandleType(0));
process.stdin;
console.log("after stdin:", name);
if (process.argv[2] !== "child") {
require("child_process").spawn(
process.argv[0],
[process.argv[1], "child"],
{stdio: "inherit"}
);
}
```
The results of above:
**cmd.exe** OK
```
C:\Users\t-nagashima.AD\Documents\GitHub\WebDevolopment\Pizzeria-ES2015>cmd /C=node test.js
before stdin: parent TTY
after stdin: parent
before stdin: child TTY
after stdin: child
```
**PowerShell** OK
```
PS C:\Users\t-nagashima.AD\Documents\GitHub\WebDevolopment\Pizzeria-ES2015> cmd /C="node test.js"
before stdin: parent TTY
after stdin: parent
before stdin: child TTY
after stdin: child
```
**Git Shell of GitHub for Windows** OK
```
C:\Users\t-nagashima.AD\Documents\GitHub\WebDevolopment\Pizzeria-ES2015 [master +1 ~1 -0 !]> cmd /C="node test.js"
before stdin: parent TTY
after stdin: parent
before stdin: child TTY
after stdin: child
```
**Git Bash of Git for Windows** Wrong
```
t-nagashima@T-NAGASHIMA4 MINGW64 /c/Users/t-nagashima.AD/Documents/GitHub/WebDevolopment/Pizzeria-ES2015 (master)
$ cmd /C="node test.js"
before stdin: parent PIPE
after stdin: parent
before stdin: child PIPE
```
The program stops at `process.stdin`, and `"after stdin: child"` is never printed.
If I press the Enter key, the program resumes.
|
process
|
process stdin never returns with using git bash for windows version and platform windows from process stdin getter stops a program in the following case on git bash of git for windows executing cmd exe e g child process exec inheriting stdin this is a repro steps cmd c node test js js use strict const name process argv parent console log before stdin name process binding tty wrap guesshandletype process stdin console log after stdin name if process argv child require child process spawn process argv child stdio inherit the results of above cmd exe ok c users t nagashima ad documents github webdevolopment pizzeria cmd c node test js before stdin parent tty after stdin parent before stdin child tty after stdin child powershell ok ps c users t nagashima ad documents github webdevolopment pizzeria cmd c node test js before stdin parent tty after stdin parent before stdin child tty after stdin child git shell of github for windows ok c users t nagashima ad documents github webdevolopment pizzeria cmd c node test js before stdin parent tty after stdin parent before stdin child tty after stdin child git bash of git for windows wrong t nagashima t c users t nagashima ad documents github webdevolopment pizzeria master cmd c node test js before stdin parent pipe after stdin parent before stdin child pipe the program stops at process stdin then after stdin child has never printed if i types enter key the program resumes
| 1
|
761,118
| 26,667,726,616
|
IssuesEvent
|
2023-01-26 06:59:14
|
EncryptSL/LiteEco
|
https://api.github.com/repos/EncryptSL/LiteEco
|
closed
|
[Bug]: Placeholders not working properly
|
Bug Fixed Approved Priority High
|
### What happened?
I recently updated to the newest snapshot 1.2.3 from 1.2.0, and the placeholders no longer seem to be working with my holographic leaderboard. I deleted the old config.yml and translations as instructed before restarting the server, but it doesn't seem to have helped. The leaderboard was working prior to the update.
### Plugin Version
1.2.3-SNAPSHOT
### You detected problem on this server platform ?
PaperMC
### Relevant log output
```shell
26.01 00:20:42 [Server] INFO Loading internal permission managers...
26.01 00:20:42 [Server] INFO Performing initial data load...
26.01 00:20:43 [Server] INFO Successfully enabled. (took 2449ms)
26.01 00:20:43 [Server] INFO Enabling Vault v1.7.3-b131
26.01 00:20:43 [Server] WARN Loaded class com.earth2me.essentials.api.Economy from Essentials v2.20.0-dev+39-312d169 which is not a depend or softdepend of this plugin.
26.01 00:20:43 [Server] INFO [Economy] Essentials Economy found: Waiting
26.01 00:20:43 [Server] INFO [Permission] SuperPermissions loaded as backup permission system.
26.01 00:20:43 [Server] INFO Enabled Version 1.7.3-b131
26.01 00:20:43 [Server] INFO Registered Vault permission & chat hook.
26.01 00:20:43 [Server] INFO Enabling LiteEco v1.2.3-SNAPSHOT
26.01 00:20:43 [Server] INFO PlaceholderAPI hook not found
26.01 00:20:43 [Server] INFO Registered Vault like a service.
26.01 00:20:43 [Server] INFO Treasury not found, for better experience please download Treasury or Vault.
26.01 00:20:43 [Server] INFO You are using current version !
26.01 00:20:43 [Server] INFO Registering commands with Cloud Command Framework !
26.01 00:20:43 [Server] INFO Bukkit Listener AccountEconomyManageListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener PlayerEconomyPayListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyGlobalDepositListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyGlobalSetListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyGlobalWithdrawListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyMoneyDepositListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyMoneyWithdrawListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyMoneySetListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener PlayerJoinListener registered () -> ok
26.01 00:20:43 [Server] INFO Listeners registered(9) in time 21 ms -> ok
26.01 00:20:43 [Server] INFO Plugin enabled in time 512 ms
26.01 00:20:43 [Server] INFO Enabling WorldEdit v7.2.13+46576cc
26.01 00:20:43 [Server] INFO Registering commands with com.sk89q.worldedit.bukkit.BukkitServerInterface
26.01 00:20:43 [Server] INFO WEPIF: Vault detected! Using Vault for permissions
26.01 00:20:44 [Server] INFO Using com.sk89q.worldedit.bukkit.adapter.impl.v1_19_R2.PaperweightAdapter as the Bukkit adapter
26.01 00:20:46 [Server] INFO Preparing level "world"
26.01 00:20:47 [Server] INFO Preparing start region for dimension minecraft:overworld
26.01 00:20:47 [Server] INFO Time elapsed: 185 ms
26.01 00:20:47 [Server] INFO Preparing start region for dimension minecraft:the_nether
26.01 00:20:47 [Server] INFO Time elapsed: 47 ms
26.01 00:20:47 [Server] INFO Preparing start region for dimension minecraft:the_end
26.01 00:20:47 [Server] INFO Time elapsed: 38 ms
26.01 00:20:47 [Server] INFO Enabling PlaceholderAPI v2.11.2
26.01 00:20:47 [Server] WARN Loaded class com.viaversion.viaversion.api.type.Type from ViaVersion v4.5.1 which is not a depend or softdepend of this plugin.
26.01 00:20:48 [Server] INFO Fetching available expansion information...
26.01 00:20:48 [Server] INFO Enabling HolographicDisplays v3.0.1-SNAPSHOT-b244
26.01 00:20:48 [Server] INFO Enabling Votifier v2.7.3
26.01 00:20:48 [Server] INFO Loaded token for website: default
26.01 00:20:48 [Server] INFO Using epoll transport to accept votes.
26.01 00:20:48 [Server] INFO Method none selected for vote forwarding: Votes will not be received from a forwarder.
26.01 00:20:48 [Server] INFO Enabling CommandAPI v8.7.1
26.01 00:20:48 [Server] INFO Chat preview is not available
26.01 00:20:48 [Server] INFO Enabling ViaVersion v4.5.1
26.01 00:20:48 [Server] INFO Votifier enabled on socket /51.81.248.137:8448.
26.01 00:20:48 [Server] INFO Enabling floodgate v2.2.0-SNAPSHOT (b74-4f36112)
26.01 00:20:48 [Server] INFO Enabling ViaBackwards v4.5.1
26.01 00:20:48 [Server] INFO Enabling Geyser-Spigot v2.1.0-SNAPSHOT
26.01 00:20:49 [Server] INFO Enabling Essentials v2.20.0-dev+39-312d169
26.01 00:20:49 [Server] INFO Attempting to convert old kits in config.yml to new kits.yml
26.01 00:20:49 [Server] INFO No kits found to migrate.
26.01 00:20:49 [Server] INFO Loaded 38132 items from items.json.
26.01 00:20:49 [Server] INFO Using locale en_US
26.01 00:20:49 [Server] INFO ServerListPingEvent: Spigot iterator API
26.01 00:20:49 [Server] INFO Starting Metrics. Opt-out using the global bStats config.
26.01 00:20:49 [Server] INFO [Economy] Essentials Economy hooked.
26.01 00:20:49 [Server] INFO Using Vault based permissions (LuckPerms)
26.01 00:20:49 [Server] INFO Enabling MobArena v0.107
26.01 00:20:49 [Server] INFO Vault found; economy rewards enabled.
26.01 00:20:50 [Server] INFO Loaded arena 'default'
26.01 00:20:50 [Server] INFO Loaded arena 'Pyramid'
26.01 00:20:50 [Server] INFO Loaded 3 sign templates.
26.01 00:20:50 [Server] INFO Enabling WorldGuard v7.0.8-beta-01+cbb2ba7
26.01 00:20:50 [Server] INFO (world) TNT ignition is PERMITTED.
26.01 00:20:50 [Server] INFO (world) Lighters are PERMITTED.
26.01 00:20:50 [Server] INFO (world) Lava fire is PERMITTED.
26.01 00:20:50 [Server] INFO (world) Fire spread is UNRESTRICTED.
26.01 00:20:50 [Server] INFO Loaded configuration for world 'world'
26.01 00:20:50 [Server] INFO (world_nether) TNT ignition is PERMITTED.
26.01 00:20:50 [Server] INFO (world_nether) Lighters are PERMITTED.
26.01 00:20:50 [Server] INFO (world_nether) Lava fire is PERMITTED.
26.01 00:20:50 [Server] INFO (world_nether) Fire spread is UNRESTRICTED.
26.01 00:20:50 [Server] INFO Loaded configuration for world 'world_nether'
26.01 00:20:50 [Server] INFO (world_the_end) TNT ignition is PERMITTED.
26.01 00:20:50 [Server] INFO (world_the_end) Lighters are PERMITTED.
26.01 00:20:50 [Server] INFO (world_the_end) Lava fire is PERMITTED.
26.01 00:20:50 [Server] INFO (world_the_end) Fire spread is UNRESTRICTED.
26.01 00:20:50 [Server] INFO Loaded configuration for world 'world_the_end'
26.01 00:20:50 [Server] INFO Loading region data...
26.01 00:20:50 [Server] INFO Enabling AdvancedPortals v0.9.2
26.01 00:20:50 [Server] INFO BLOCK_PORTAL_TRAVEL found
26.01 00:20:50 [Server] WARN Proxy features disabled for Advanced Portals as bungee isn't enabled on the server (spigot.yml) or if you are using Paper settings.velocity-support.enabled may not be enabled (paper.yml)
26.01 00:20:50 [Server] INFO Advanced portals have been successfully enabled!
26.01 00:20:50 [Server] INFO Enabling Spartan vPhase 490.1
26.01 00:20:51 [Server] INFO Enabling Citizens v2.0.30-SNAPSHOT (build 2890)
26.01 00:20:51 [Server] INFO Loading external libraries
26.01 00:20:51 [Server] INFO Successfully registered expansion: citizensplaceholder [1.0.0]
26.01 00:20:51 [Server] INFO Loaded economy handling via Vault.
26.01 00:20:51 [Server] INFO Enabling DiscordSRV v1.26.0
26.01 00:20:52 [Server] INFO Enabling TradeSystem v2.1.3
26.01 00:20:52 [Server] INFO __________________________________________________________
26.01 00:20:52 [Server] INFO TradeSystem [2.1.3]
26.01 00:20:52 [Server] INFO Status:
26.01 00:20:52 [Server] INFO MC-Version: 1.19.3 (R0.1-SNAPSHOT, Paper)
26.01 00:20:52 [Server] INFO > Loading sounds
26.01 00:20:52 [Server] INFO > Loading blacklist
26.01 00:20:52 [Server] INFO ...got 3 blocked item(s)
26.01 00:20:52 [Server] INFO > Loading layouts
26.01 00:20:52 [Server] INFO ...got 4 layout(s)
26.01 00:20:52 [Server] INFO > Database logging is disabled
26.01 00:20:52 [Server] INFO Finished (277ms)
26.01 00:20:52 [Server] INFO __________________________________________________________
26.01 00:20:52 [Server] INFO Successfully registered expansion: tradesystem [1.1]
26.01 00:20:52 [Server] INFO Enabling CombatLogX v11.1.0.7.1025
26.01 00:20:52 [Server] INFO CombatLogX was loaded successfully.
26.01 00:20:52 [Server] INFO Enabling expansions...
26.01 00:20:52 [Server] INFO Enabling expansion 'EssentialsX Compatibility v16.3'...
26.01 00:20:52 [Server] INFO [EssentialsX Compatibility] Successfully found a dependency: Essentials v2.20.0-dev+39-312d169
26.01 00:20:52 [Server] INFO Enabling expansion 'PlaceholderAPI Compatibility v16.9'...
26.01 00:20:52 [Server] INFO [PlaceholderAPI Compatibility] Successfully found a dependency: PlaceholderAPI v2.11.2
26.01 00:20:52 [Server] INFO Successfully registered expansion: combatlogx [16.9]
26.01 00:20:52 [Server] INFO Enabling expansion 'WorldGuard Compatibility v16.4'...
26.01 00:20:52 [Server] INFO [WorldGuard Compatibility] Successfully found a dependency: WorldGuard v7.0.8-beta-01+cbb2ba7
26.01 00:20:52 [Server] INFO Enabling expansion 'Scoreboard v16.12'...
26.01 00:20:52 [Server] INFO Successfully enabled 4 expansions.
26.01 00:20:52 [Server] INFO CombatLogX was enabled successfully.
26.01 00:20:52 [Server] INFO Enabling EssentialsSpawn v2.20.0-dev+39-312d169
26.01 00:20:52 [Server] INFO Starting Metrics. Opt-out using the global bStats config.
26.01 00:20:52 [Server] INFO Enabling CoreProtect v21.3
26.01 00:20:52 [Server] INFO CoreProtect has been successfully enabled!
26.01 00:20:52 [Server] INFO Using SQLite for data storage.
26.01 00:20:52 [Server] INFO --------------------
26.01 00:20:52 [Server] INFO Enjoy CoreProtect? Join our Discord!
26.01 00:20:52 [Server] INFO Discord: www.coreprotect.net/discord/
26.01 00:20:52 [Server] INFO --------------------
26.01 00:20:52 [Server] INFO Enabling Wild v4.4.5-SNAPSHOT
26.01 00:20:52 [Server] INFO [JDA] Login Successful!
26.01 00:20:52 [Server] INFO Enabling dtlTradersPlus v6.4.17
26.01 00:20:52 [Server] INFO dtlTradersPlus> Welcome to dtlTradersPlus v6.4.17!
26.01 00:20:52 [Server] WARN Initializing Legacy Material Support. Unless you have legacy plugins and/or data this is a bug!
26.01 00:20:53 [Server] INFO [JDA] Connected to WebSocket
26.01 00:20:53 [Server] INFO DiscordSRV is up-to-date. (819b02508abe8879cfc2a513fc6cea383b70b5c5)
26.01 00:20:53 [Server] Startup Done loading 11536 rows of data to the Research Engine.
26.01 00:20:53 [Server] INFO [JDA] Finished Loading!
26.01 00:20:53 [Server] INFO Found server G:Nightfall Server(781280721003085844)
26.01 00:20:53 [Server] INFO - TC:👋gates-to-hell(783885798407274556)
26.01 00:20:53 [Server] INFO - TC:announcements(1057100501939933307)
26.01 00:20:53 [Server] INFO - TC:rules(781282562449604628)
26.01 00:20:53 [Server] INFO - TC:❗server-info(1057110323410259968)
26.01 00:20:53 [Server] INFO - TC:✔vote(876850581329821707)
26.01 00:20:53 [Server] INFO - TC:player-info(1063622659683917914)
26.01 00:20:53 [Server] INFO - TC:shops-and-economy(1063948529367777280)
26.01 00:20:53 [Server] INFO - TC:💭suggestions(851588490671423509)
26.01 00:20:53 [Server] INFO - TC:❓-questions(869017742110515210)
26.01 00:20:53 [Server] INFO - TC:❗reports(1057101125158961152)
26.01 00:20:53 [Server] INFO - TC:💬general(875161737899413544)
26.01 00:20:53 [Server] INFO - TC:🎬media(781283309853868033)
26.01 00:20:53 [Server] INFO - TC:👾twitch-streams-and-other-games(865047726307016744)
26.01 00:20:53 [Server] INFO - TC:⚡server-chat(813152744428011580)
26.01 00:20:53 [Server] INFO - TC:💳trading-bounties-services(792207484168699916)
26.01 00:20:53 [Server] INFO - TC:disboard(1065038029439373314)
26.01 00:20:53 [Server] INFO - TC:🗻server-photos(858009473549467678)
26.01 00:20:53 [Server] INFO - TC:⛰old-realm-photos(804518940679733278)
26.01 00:20:53 [Server] INFO - TC:admin-spam(781282754284355604)
26.01 00:20:53 [Server] INFO - TC:🔔aware-of-iron(859887048885403648)
26.01 00:20:53 [Server] INFO - TC:🤖bot-commands(783877273715474433)
26.01 00:20:53 [Server] INFO - TC:admin-spam(781283378057838642)
26.01 00:20:53 [Server] INFO - TC:admin-chat(1057118077029986315)
26.01 00:20:53 [Server] INFO - TC:server-console(783880236441796608)
26.01 00:20:53 [Server] INFO - TC:spartan-detections(1064554251243753632)
26.01 00:20:53 [Server] INFO - TC:invite-logger(924897798363684895)
26.01 00:20:53 [Server] INFO - TC:bots(1056356043950669834)
26.01 00:20:53 [Server] INFO - TC:to-do-list(1057093433119998042)
26.01 00:20:53 [Server] INFO Console forwarding assigned to channel TC:server-console(783880236441796608)
26.01 00:21:01 [Server] INFO dtlTradersPlus> Trying to hook into any Vault supported plugin...
26.01 00:21:01 [Server] INFO dtlTradersPlus> Vault found! Trying to hook...
26.01 00:21:01 [Server] INFO dtlTradersPlus> Found economy plugin: LiteEco
26.01 00:21:01 [Server] INFO dtlTradersPlus> Loaded 3 guis from the guis config!
26.01 00:21:01 [Server] INFO dtlTradersPlus> Hooking into Citizens!
26.01 00:21:01 [Server] INFO dtlTradersPlus> Hooking into PlaceholderAPI
26.01 00:21:01 [Server] INFO dtlTradersPlus> dtlTraders is managed by Degitise. www.degitise.com
26.01 00:21:01 [Server] INFO Enabling Essentials hook
26.01 00:21:01 [Server] INFO Enabling CrateReloaded v2.0.26
26.01 00:21:01 [Server] INFO Config Version: 2.X
26.01 00:21:01 [Server] INFO v1_19_R2
26.01 00:21:01 [Server] INFO Hologram Provider: HolographicDisplays
26.01 00:21:01 [Server] INFO Crate Configurations (1): crate.yml
26.01 00:21:02 [Server] INFO Enabling VotingPlugin v6.10.0
26.01 00:21:03 [Server] INFO Giving VotingPlugin.Player permission by default, can be disabled in the config
26.01 00:21:03 [Server] INFO Enabled VotingPlugin 6.10.0
26.01 00:21:03 [Server] INFO Enabling LuckPerms hook
26.01 00:21:03 [Server] INFO Enabling PlaceholderAPI hook
26.01 00:21:04 [Server] INFO Placeholder expansion registration initializing...
26.01 00:21:04 [Server] INFO ******************************************
26.01 00:21:04 [Server] INFO Loading Geyser version 2.1.0-SNAPSHOT (git-master-0b80c58)
26.01 00:21:04 [Server] INFO ******************************************
26.01 00:21:07 [Server] INFO Started Geyser on 51.81.248.137:25565
26.01 00:21:07 [Server] Startup Done (3.668s)! Run /geyser help for help!
26.01 00:21:07 [Server] INFO Running delayed init tasks
26.01 00:21:07 [Server] INFO Fetching version information...
26.01 00:21:07 [Server] INFO Linked 2 Bukkit permissions to commands
26.01 00:21:07 [Server] INFO Reloading datapacks...
26.01 00:21:07 [Server] INFO Checking for Updates ...
26.01 00:21:08 [Server] INFO [Update Checker] A possible update was found for plugin 'BlueSlimeCore'.
26.01 00:21:08 [Server] INFO [Update Checker] Current Version: 2.6.1.222-Beta
26.01 00:21:08 [Server] INFO [Update Checker] New Version: 2.6.0.18
26.01 00:21:08 [Server] INFO [Update Checker] Download Link: https://www.spigotmc.org/resources/83189/
26.01 00:21:08 [Server] INFO [Update Checker] There are no updates available for plugin 'CombatLogX'.
26.01 00:21:08 [Server] INFO Loaded 7 recipes
26.01 00:21:08 [Server] WARN You're 1 EssentialsX dev build(s) out of date!
26.01 00:21:08 [Server] WARN Download it here: https://essentialsx.net/downloads.html
26.01 00:21:08 [Server] INFO No new version available
26.01 00:21:09 [Server] INFO Finished reloading datapacks
26.01 00:21:09 [Server] INFO ViaVersion detected server version: 1.19.3 (761)
26.01 00:21:09 [Server] INFO Essentials found a compatible payment resolution method: Vault Compatibility Layer (v1.7.3-b131)!
26.01 00:21:09 [Server] INFO WorldEdit logging successfully initialized.
26.01 00:21:09 [Server] INFO Successfully registered expansion: discordsrv [1.26.0]
26.01 00:21:09 [Server] WARN Loaded class net.milkbowl.vault.economy.Economy from Vault v1.7.3-b131 which is not a depend or softdepend of this plugin.
26.01 00:21:09 [Server] INFO Successfully registered expansion: vault [1.7.1]
26.01 00:21:09 [Server] INFO 1 placeholder hook(s) registered!
26.01 00:21:09 [Server] INFO Loaded 6 NPCs.
26.01 00:21:09 [Server] INFO Successfully enabled 0 late-load expansions.
26.01 00:21:09 [Server] INFO Finished mapping loading, shutting down loader executor!
26.01 00:21:09 [Server] Startup Done (36.878s)! For help, type "help"
26.01 00:21:09 [Server] INFO Timings Reset
26.01 00:21:09 [Server] INFO Successfully hooked into vault economy!
26.01 00:21:09 [Server] INFO Hooked into vault permissions
```
|
1.0
|
[Bug]: Placeholders not working properly - ### What happened?
I recently updated to the newest snapshot 1.2.3 from 1.2.0, and the placeholders no longer seem to be working with my holographic leaderboard. I deleted the old config.yml and translations as instructed before restarting the server, but it doesn't seem to have helped. The leaderboard was working prior to the update.
### Plugin Version
1.2.3-SNAPSHOT
### You detected problem on this server platform ?
PaperMC
### Relevant log output
```shell
26.01 00:20:42 [Server] INFO Loading internal permission managers...
26.01 00:20:42 [Server] INFO Performing initial data load...
26.01 00:20:43 [Server] INFO Successfully enabled. (took 2449ms)
26.01 00:20:43 [Server] INFO Enabling Vault v1.7.3-b131
26.01 00:20:43 [Server] WARN Loaded class com.earth2me.essentials.api.Economy from Essentials v2.20.0-dev+39-312d169 which is not a depend or softdepend of this plugin.
26.01 00:20:43 [Server] INFO [Economy] Essentials Economy found: Waiting
26.01 00:20:43 [Server] INFO [Permission] SuperPermissions loaded as backup permission system.
26.01 00:20:43 [Server] INFO Enabled Version 1.7.3-b131
26.01 00:20:43 [Server] INFO Registered Vault permission & chat hook.
26.01 00:20:43 [Server] INFO Enabling LiteEco v1.2.3-SNAPSHOT
26.01 00:20:43 [Server] INFO PlaceholderAPI hook not found
26.01 00:20:43 [Server] INFO Registered Vault like a service.
26.01 00:20:43 [Server] INFO Treasury not found, for better experience please download Treasury or Vault.
26.01 00:20:43 [Server] INFO You are using current version !
26.01 00:20:43 [Server] INFO Registering commands with Cloud Command Framework !
26.01 00:20:43 [Server] INFO Bukkit Listener AccountEconomyManageListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener PlayerEconomyPayListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyGlobalDepositListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyGlobalSetListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyGlobalWithdrawListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyMoneyDepositListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyMoneyWithdrawListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener AdminEconomyMoneySetListener registered () -> ok
26.01 00:20:43 [Server] INFO Bukkit Listener PlayerJoinListener registered () -> ok
26.01 00:20:43 [Server] INFO Listeners registered(9) in time 21 ms -> ok
26.01 00:20:43 [Server] INFO Plugin enabled in time 512 ms
26.01 00:20:43 [Server] INFO Enabling WorldEdit v7.2.13+46576cc
26.01 00:20:43 [Server] INFO Registering commands with com.sk89q.worldedit.bukkit.BukkitServerInterface
26.01 00:20:43 [Server] INFO WEPIF: Vault detected! Using Vault for permissions
26.01 00:20:44 [Server] INFO Using com.sk89q.worldedit.bukkit.adapter.impl.v1_19_R2.PaperweightAdapter as the Bukkit adapter
26.01 00:20:46 [Server] INFO Preparing level "world"
26.01 00:20:47 [Server] INFO Preparing start region for dimension minecraft:overworld
26.01 00:20:47 [Server] INFO Time elapsed: 185 ms
26.01 00:20:47 [Server] INFO Preparing start region for dimension minecraft:the_nether
26.01 00:20:47 [Server] INFO Time elapsed: 47 ms
26.01 00:20:47 [Server] INFO Preparing start region for dimension minecraft:the_end
26.01 00:20:47 [Server] INFO Time elapsed: 38 ms
26.01 00:20:47 [Server] INFO Enabling PlaceholderAPI v2.11.2
26.01 00:20:47 [Server] WARN Loaded class com.viaversion.viaversion.api.type.Type from ViaVersion v4.5.1 which is not a depend or softdepend of this plugin.
26.01 00:20:48 [Server] INFO Fetching available expansion information...
26.01 00:20:48 [Server] INFO Enabling HolographicDisplays v3.0.1-SNAPSHOT-b244
26.01 00:20:48 [Server] INFO Enabling Votifier v2.7.3
26.01 00:20:48 [Server] INFO Loaded token for website: default
26.01 00:20:48 [Server] INFO Using epoll transport to accept votes.
26.01 00:20:48 [Server] INFO Method none selected for vote forwarding: Votes will not be received from a forwarder.
26.01 00:20:48 [Server] INFO Enabling CommandAPI v8.7.1
26.01 00:20:48 [Server] INFO Chat preview is not available
26.01 00:20:48 [Server] INFO Enabling ViaVersion v4.5.1
26.01 00:20:48 [Server] INFO Votifier enabled on socket /51.81.248.137:8448.
26.01 00:20:48 [Server] INFO Enabling floodgate v2.2.0-SNAPSHOT (b74-4f36112)
26.01 00:20:48 [Server] INFO Enabling ViaBackwards v4.5.1
26.01 00:20:48 [Server] INFO Enabling Geyser-Spigot v2.1.0-SNAPSHOT
26.01 00:20:49 [Server] INFO Enabling Essentials v2.20.0-dev+39-312d169
26.01 00:20:49 [Server] INFO Attempting to convert old kits in config.yml to new kits.yml
26.01 00:20:49 [Server] INFO No kits found to migrate.
26.01 00:20:49 [Server] INFO Loaded 38132 items from items.json.
26.01 00:20:49 [Server] INFO Using locale en_US
26.01 00:20:49 [Server] INFO ServerListPingEvent: Spigot iterator API
26.01 00:20:49 [Server] INFO Starting Metrics. Opt-out using the global bStats config.
26.01 00:20:49 [Server] INFO [Economy] Essentials Economy hooked.
26.01 00:20:49 [Server] INFO Using Vault based permissions (LuckPerms)
26.01 00:20:49 [Server] INFO Enabling MobArena v0.107
26.01 00:20:49 [Server] INFO Vault found; economy rewards enabled.
26.01 00:20:50 [Server] INFO Loaded arena 'default'
26.01 00:20:50 [Server] INFO Loaded arena 'Pyramid'
26.01 00:20:50 [Server] INFO Loaded 3 sign templates.
26.01 00:20:50 [Server] INFO Enabling WorldGuard v7.0.8-beta-01+cbb2ba7
26.01 00:20:50 [Server] INFO (world) TNT ignition is PERMITTED.
26.01 00:20:50 [Server] INFO (world) Lighters are PERMITTED.
26.01 00:20:50 [Server] INFO (world) Lava fire is PERMITTED.
26.01 00:20:50 [Server] INFO (world) Fire spread is UNRESTRICTED.
26.01 00:20:50 [Server] INFO Loaded configuration for world 'world'
26.01 00:20:50 [Server] INFO (world_nether) TNT ignition is PERMITTED.
26.01 00:20:50 [Server] INFO (world_nether) Lighters are PERMITTED.
26.01 00:20:50 [Server] INFO (world_nether) Lava fire is PERMITTED.
26.01 00:20:50 [Server] INFO (world_nether) Fire spread is UNRESTRICTED.
26.01 00:20:50 [Server] INFO Loaded configuration for world 'world_nether'
26.01 00:20:50 [Server] INFO (world_the_end) TNT ignition is PERMITTED.
26.01 00:20:50 [Server] INFO (world_the_end) Lighters are PERMITTED.
26.01 00:20:50 [Server] INFO (world_the_end) Lava fire is PERMITTED.
26.01 00:20:50 [Server] INFO (world_the_end) Fire spread is UNRESTRICTED.
26.01 00:20:50 [Server] INFO Loaded configuration for world 'world_the_end'
26.01 00:20:50 [Server] INFO Loading region data...
26.01 00:20:50 [Server] INFO Enabling AdvancedPortals v0.9.2
26.01 00:20:50 [Server] INFO BLOCK_PORTAL_TRAVEL found
26.01 00:20:50 [Server] WARN Proxy features disabled for Advanced Portals as bungee isn't enabled on the server (spigot.yml) or if you are using Paper settings.velocity-support.enabled may not be enabled (paper.yml)
26.01 00:20:50 [Server] INFO Advanced portals have been successfully enabled!
26.01 00:20:50 [Server] INFO Enabling Spartan vPhase 490.1
26.01 00:20:51 [Server] INFO Enabling Citizens v2.0.30-SNAPSHOT (build 2890)
26.01 00:20:51 [Server] INFO Loading external libraries
26.01 00:20:51 [Server] INFO Successfully registered expansion: citizensplaceholder [1.0.0]
26.01 00:20:51 [Server] INFO Loaded economy handling via Vault.
26.01 00:20:51 [Server] INFO Enabling DiscordSRV v1.26.0
26.01 00:20:52 [Server] INFO Enabling TradeSystem v2.1.3
26.01 00:20:52 [Server] INFO __________________________________________________________
26.01 00:20:52 [Server] INFO TradeSystem [2.1.3]
26.01 00:20:52 [Server] INFO Status:
26.01 00:20:52 [Server] INFO MC-Version: 1.19.3 (R0.1-SNAPSHOT, Paper)
26.01 00:20:52 [Server] INFO > Loading sounds
26.01 00:20:52 [Server] INFO > Loading blacklist
26.01 00:20:52 [Server] INFO ...got 3 blocked item(s)
26.01 00:20:52 [Server] INFO > Loading layouts
26.01 00:20:52 [Server] INFO ...got 4 layout(s)
26.01 00:20:52 [Server] INFO > Database logging is disabled
26.01 00:20:52 [Server] INFO Finished (277ms)
26.01 00:20:52 [Server] INFO __________________________________________________________
26.01 00:20:52 [Server] INFO Successfully registered expansion: tradesystem [1.1]
26.01 00:20:52 [Server] INFO Enabling CombatLogX v11.1.0.7.1025
26.01 00:20:52 [Server] INFO CombatLogX was loaded successfully.
26.01 00:20:52 [Server] INFO Enabling expansions...
26.01 00:20:52 [Server] INFO Enabling expansion 'EssentialsX Compatibility v16.3'...
26.01 00:20:52 [Server] INFO [EssentialsX Compatibility] Successfully found a dependency: Essentials v2.20.0-dev+39-312d169
26.01 00:20:52 [Server] INFO Enabling expansion 'PlaceholderAPI Compatibility v16.9'...
26.01 00:20:52 [Server] INFO [PlaceholderAPI Compatibility] Successfully found a dependency: PlaceholderAPI v2.11.2
26.01 00:20:52 [Server] INFO Successfully registered expansion: combatlogx [16.9]
26.01 00:20:52 [Server] INFO Enabling expansion 'WorldGuard Compatibility v16.4'...
26.01 00:20:52 [Server] INFO [WorldGuard Compatibility] Successfully found a dependency: WorldGuard v7.0.8-beta-01+cbb2ba7
26.01 00:20:52 [Server] INFO Enabling expansion 'Scoreboard v16.12'...
26.01 00:20:52 [Server] INFO Successfully enabled 4 expansions.
26.01 00:20:52 [Server] INFO CombatLogX was enabled successfully.
26.01 00:20:52 [Server] INFO Enabling EssentialsSpawn v2.20.0-dev+39-312d169
26.01 00:20:52 [Server] INFO Starting Metrics. Opt-out using the global bStats config.
26.01 00:20:52 [Server] INFO Enabling CoreProtect v21.3
26.01 00:20:52 [Server] INFO CoreProtect has been successfully enabled!
26.01 00:20:52 [Server] INFO Using SQLite for data storage.
26.01 00:20:52 [Server] INFO --------------------
26.01 00:20:52 [Server] INFO Enjoy CoreProtect? Join our Discord!
26.01 00:20:52 [Server] INFO Discord: www.coreprotect.net/discord/
26.01 00:20:52 [Server] INFO --------------------
26.01 00:20:52 [Server] INFO Enabling Wild v4.4.5-SNAPSHOT
26.01 00:20:52 [Server] INFO [JDA] Login Successful!
26.01 00:20:52 [Server] INFO Enabling dtlTradersPlus v6.4.17
26.01 00:20:52 [Server] INFO dtlTradersPlus> Welcome to dtlTradersPlus v6.4.17!
26.01 00:20:52 [Server] WARN Initializing Legacy Material Support. Unless you have legacy plugins and/or data this is a bug!
26.01 00:20:53 [Server] INFO [JDA] Connected to WebSocket
26.01 00:20:53 [Server] INFO DiscordSRV is up-to-date. (819b02508abe8879cfc2a513fc6cea383b70b5c5)
26.01 00:20:53 [Server] Startup Done loading 11536 rows of data to the Research Engine.
26.01 00:20:53 [Server] INFO [JDA] Finished Loading!
26.01 00:20:53 [Server] INFO Found server G:Nightfall Server(781280721003085844)
26.01 00:20:53 [Server] INFO - TC:👋gates-to-hell(783885798407274556)
26.01 00:20:53 [Server] INFO - TC:announcements(1057100501939933307)
26.01 00:20:53 [Server] INFO - TC:rules(781282562449604628)
26.01 00:20:53 [Server] INFO - TC:❗server-info(1057110323410259968)
26.01 00:20:53 [Server] INFO - TC:✔vote(876850581329821707)
26.01 00:20:53 [Server] INFO - TC:player-info(1063622659683917914)
26.01 00:20:53 [Server] INFO - TC:shops-and-economy(1063948529367777280)
26.01 00:20:53 [Server] INFO - TC:💭suggestions(851588490671423509)
26.01 00:20:53 [Server] INFO - TC:❓-questions(869017742110515210)
26.01 00:20:53 [Server] INFO - TC:❗reports(1057101125158961152)
26.01 00:20:53 [Server] INFO - TC:💬general(875161737899413544)
26.01 00:20:53 [Server] INFO - TC:🎬media(781283309853868033)
26.01 00:20:53 [Server] INFO - TC:👾twitch-streams-and-other-games(865047726307016744)
26.01 00:20:53 [Server] INFO - TC:⚡server-chat(813152744428011580)
26.01 00:20:53 [Server] INFO - TC:💳trading-bounties-services(792207484168699916)
26.01 00:20:53 [Server] INFO - TC:disboard(1065038029439373314)
26.01 00:20:53 [Server] INFO - TC:🗻server-photos(858009473549467678)
26.01 00:20:53 [Server] INFO - TC:⛰old-realm-photos(804518940679733278)
26.01 00:20:53 [Server] INFO - TC:admin-spam(781282754284355604)
26.01 00:20:53 [Server] INFO - TC:🔔aware-of-iron(859887048885403648)
26.01 00:20:53 [Server] INFO - TC:🤖bot-commands(783877273715474433)
26.01 00:20:53 [Server] INFO - TC:admin-spam(781283378057838642)
26.01 00:20:53 [Server] INFO - TC:admin-chat(1057118077029986315)
26.01 00:20:53 [Server] INFO - TC:server-console(783880236441796608)
26.01 00:20:53 [Server] INFO - TC:spartan-detections(1064554251243753632)
26.01 00:20:53 [Server] INFO - TC:invite-logger(924897798363684895)
26.01 00:20:53 [Server] INFO - TC:bots(1056356043950669834)
26.01 00:20:53 [Server] INFO - TC:to-do-list(1057093433119998042)
26.01 00:20:53 [Server] INFO Console forwarding assigned to channel TC:server-console(783880236441796608)
26.01 00:21:01 [Server] INFO dtlTradersPlus> Trying to hook into any Vault supported plugin...
26.01 00:21:01 [Server] INFO dtlTradersPlus> Vault found! Trying to hook...
26.01 00:21:01 [Server] INFO dtlTradersPlus> Found economy plugin: LiteEco
26.01 00:21:01 [Server] INFO dtlTradersPlus> Loaded 3 guis from the guis config!
26.01 00:21:01 [Server] INFO dtlTradersPlus> Hooking into Citizens!
26.01 00:21:01 [Server] INFO dtlTradersPlus> Hooking into PlaceholderAPI
26.01 00:21:01 [Server] INFO dtlTradersPlus> dtlTraders is managed by Degitise. www.degitise.com
26.01 00:21:01 [Server] INFO Enabling Essentials hook
26.01 00:21:01 [Server] INFO Enabling CrateReloaded v2.0.26
26.01 00:21:01 [Server] INFO Config Version: 2.X
26.01 00:21:01 [Server] INFO v1_19_R2
26.01 00:21:01 [Server] INFO Hologram Provider: HolographicDisplays
26.01 00:21:01 [Server] INFO Crate Configurations (1): crate.yml
26.01 00:21:02 [Server] INFO Enabling VotingPlugin v6.10.0
26.01 00:21:03 [Server] INFO Giving VotingPlugin.Player permission by default, can be disabled in the config
26.01 00:21:03 [Server] INFO Enabled VotingPlugin 6.10.0
26.01 00:21:03 [Server] INFO Enabling LuckPerms hook
26.01 00:21:03 [Server] INFO Enabling PlaceholderAPI hook
26.01 00:21:04 [Server] INFO Placeholder expansion registration initializing...
26.01 00:21:04 [Server] INFO ******************************************
26.01 00:21:04 [Server] INFO Loading Geyser version 2.1.0-SNAPSHOT (git-master-0b80c58)
26.01 00:21:04 [Server] INFO ******************************************
26.01 00:21:07 [Server] INFO Started Geyser on 51.81.248.137:25565
26.01 00:21:07 [Server] Startup Done (3.668s)! Run /geyser help for help!
26.01 00:21:07 [Server] INFO Running delayed init tasks
26.01 00:21:07 [Server] INFO Fetching version information...
26.01 00:21:07 [Server] INFO Linked 2 Bukkit permissions to commands
26.01 00:21:07 [Server] INFO Reloading datapacks...
26.01 00:21:07 [Server] INFO Checking for Updates ...
26.01 00:21:08 [Server] INFO [Update Checker] A possible update was found for plugin 'BlueSlimeCore'.
26.01 00:21:08 [Server] INFO [Update Checker] Current Version: 2.6.1.222-Beta
26.01 00:21:08 [Server] INFO [Update Checker] New Version: 2.6.0.18
26.01 00:21:08 [Server] INFO [Update Checker] Download Link: https://www.spigotmc.org/resources/83189/
26.01 00:21:08 [Server] INFO [Update Checker] There are no updates available for plugin 'CombatLogX'.
26.01 00:21:08 [Server] INFO Loaded 7 recipes
26.01 00:21:08 [Server] WARN You're 1 EssentialsX dev build(s) out of date!
26.01 00:21:08 [Server] WARN Download it here: https://essentialsx.net/downloads.html
26.01 00:21:08 [Server] INFO No new version available
26.01 00:21:09 [Server] INFO Finished reloading datapacks
26.01 00:21:09 [Server] INFO ViaVersion detected server version: 1.19.3 (761)
26.01 00:21:09 [Server] INFO Essentials found a compatible payment resolution method: Vault Compatibility Layer (v1.7.3-b131)!
26.01 00:21:09 [Server] INFO WorldEdit logging successfully initialized.
26.01 00:21:09 [Server] INFO Successfully registered expansion: discordsrv [1.26.0]
26.01 00:21:09 [Server] WARN Loaded class net.milkbowl.vault.economy.Economy from Vault v1.7.3-b131 which is not a depend or softdepend of this plugin.
26.01 00:21:09 [Server] INFO Successfully registered expansion: vault [1.7.1]
26.01 00:21:09 [Server] INFO 1 placeholder hook(s) registered!
26.01 00:21:09 [Server] INFO Loaded 6 NPCs.
26.01 00:21:09 [Server] INFO Successfully enabled 0 late-load expansions.
26.01 00:21:09 [Server] INFO Finished mapping loading, shutting down loader executor!
26.01 00:21:09 [Server] Startup Done (36.878s)! For help, type "help"
26.01 00:21:09 [Server] INFO Timings Reset
26.01 00:21:09 [Server] INFO Successfully hooked into vault economy!
26.01 00:21:09 [Server] INFO Hooked into vault permissions
```
|
non_process
|
placeholders not working properly what happened i recently updated to the newest snapshot from and the placeholders do not seem to be working anymore with my holographic leaderboard i deleted the old config yml and translations as stated before restarting the server but it doesnt seem to have helped the leaderboard was working prior to the update plugin version snapshot you detected problem on this server platform papermc relevant log output shell info loading internal permission managers info performing initial data load info successfully enabled took info enabling vault warn loaded class com essentials api economy from essentials dev which is not a depend or softdepend of this plugin info essentials economy found waiting info superpermissions loaded as backup permission system info enabled version info registered vault permission chat hook info enabling liteeco snapshot info placeholderapi hook not found info registered vault like a service info treasury not found for better experience please download treasury or vault info you are using current version info registering commands with cloud command framework info bukkit listener accounteconomymanagelistener registered ok info bukkit listener playereconomypaylistener registered ok info bukkit listener admineconomyglobaldepositlistener registered ok info bukkit listener admineconomyglobalsetlistener registered ok info bukkit listener admineconomyglobalwithdrawlistener registered ok info bukkit listener admineconomymoneydepositlistener registered ok info bukkit listener admineconomymoneywithdrawlistener registered ok info bukkit listener admineconomymoneysetlistener registered ok info bukkit listener playerjoinlistener registered ok info listeners registered in time ms ok info plugin enabled in time ms info enabling worldedit info registering commands with com worldedit bukkit bukkitserverinterface info wepif vault detected using vault for permissions info using com worldedit bukkit adapter impl paperweightadapter as 
the bukkit adapter info preparing level world info preparing start region for dimension minecraft overworld info time elapsed ms info preparing start region for dimension minecraft the nether info time elapsed ms info preparing start region for dimension minecraft the end info time elapsed ms info enabling placeholderapi warn loaded class com viaversion viaversion api type type from viaversion which is not a depend or softdepend of this plugin info fetching available expansion information info enabling holographicdisplays snapshot info enabling votifier info loaded token for website default info using epoll transport to accept votes info method none selected for vote forwarding votes will not be received from a forwarder info enabling commandapi info chat preview is not available info enabling viaversion info votifier enabled on socket info enabling floodgate snapshot info enabling viabackwards info enabling geyser spigot snapshot info enabling essentials dev info attempting to convert old kits in config yml to new kits yml info no kits found to migrate info loaded items from items json info using locale en us info serverlistpingevent spigot iterator api info starting metrics opt out using the global bstats config info essentials economy hooked info using vault based permissions luckperms info enabling mobarena info vault found economy rewards enabled info loaded arena default info loaded arena pyramid info loaded sign templates info enabling worldguard beta info world tnt ignition is permitted info world lighters are permitted info world lava fire is permitted info world fire spread is unrestricted info loaded configuration for world world info world nether tnt ignition is permitted info world nether lighters are permitted info world nether lava fire is permitted info world nether fire spread is unrestricted info loaded configuration for world world nether info world the end tnt ignition is permitted info world the end lighters are permitted info world the end 
lava fire is permitted info world the end fire spread is unrestricted info loaded configuration for world world the end info loading region data info enabling advancedportals info block portal travel found warn proxy features disabled for advanced portals as bungee isn t enabled on the server spigot yml or if you are using paper settings velocity support enabled may not be enabled paper yml info advanced portals have been successfully enabled info enabling spartan vphase info enabling citizens snapshot build info loading external libraries info successfully registered expansion citizensplaceholder info loaded economy handling via vault info enabling discordsrv info enabling tradesystem info info tradesystem info status info mc version snapshot paper info loading sounds info loading blacklist info got blocked item s info loading layouts info got layout s info database logging is disabled info finished info info successfully registered expansion tradesystem info enabling combatlogx info combatlogx was loaded successfully info enabling expansions info enabling expansion essentialsx compatibility info successfully found a dependency essentials dev info enabling expansion placeholderapi compatibility info successfully found a dependency placeholderapi info successfully registered expansion combatlogx info enabling expansion worldguard compatibility info successfully found a dependency worldguard beta info enabling expansion scoreboard info successfully enabled expansions info combatlogx was enabled successfully info enabling essentialsspawn dev info starting metrics opt out using the global bstats config info enabling coreprotect info coreprotect has been successfully enabled info using sqlite for data storage info info enjoy coreprotect join our discord info discord info info enabling wild snapshot info login successful info enabling dtltradersplus info dtltradersplus welcome to dtltradersplus warn initializing legacy material support unless you have legacy plugins and 
or data this is a bug info connected to websocket info discordsrv is up to date startup done loading rows of data to the research engine info finished loading info found server g nightfall server info tc 👋gates to hell info tc announcements info tc rules info tc ❗server info info tc ✔vote info tc player info info tc shops and economy info tc 💭suggestions info tc ❓ questions info tc ❗reports info tc 💬general info tc 🎬media info tc 👾twitch streams and other games info tc ⚡server chat info tc 💳trading bounties services info tc disboard info tc 🗻server photos info tc ⛰old realm photos info tc admin spam info tc 🔔aware of iron info tc 🤖bot commands info tc admin spam info tc admin chat info tc server console info tc spartan detections info tc invite logger info tc bots info tc to do list info console forwarding assigned to channel tc server console info dtltradersplus trying to hook into any vault supported plugin info dtltradersplus vault found trying to hook info dtltradersplus found economy plugin liteeco info dtltradersplus loaded guis from the guis config info dtltradersplus hooking into citizens info dtltradersplus hooking into placeholderapi info dtltradersplus dtltraders is managed by degitise info enabling essentials hook info enabling cratereloaded info config version x info info hologram provider holographicdisplays info crate configurations crate yml info enabling votingplugin info giving votingplugin player permission by default can be disabled in the config info enabled votingplugin info enabling luckperms hook info enabling placeholderapi hook info placeholder expansion registration initializing info info loading geyser version snapshot git master info info started geyser on startup done run geyser help for help info running delayed init tasks info fetching version information info linked bukkit permissions to commands info reloading datapacks info checking for updates info a possible update was found for plugin blueslimecore info current version beta 
info new version info download link info there are no updates available for plugin combatlogx info loaded recipes warn you re essentialsx dev build s out of date warn download it here info no new version available info finished reloading datapacks info viaversion detected server version info essentials found a compatible payment resolution method vault compatibility layer info worldedit logging successfully initialized info successfully registered expansion discordsrv warn loaded class net milkbowl vault economy economy from vault which is not a depend or softdepend of this plugin info successfully registered expansion vault info placeholder hook s registered info loaded npcs info successfully enabled late load expansions info finished mapping loading shutting down loader executor startup done for help type help info timings reset info successfully hooked into vault economy info hooked into vault permissions
| 0
|
339
| 6,760,386,191
|
IssuesEvent
|
2017-10-24 20:26:53
|
dotnet/roslyn
|
https://api.github.com/repos/dotnet/roslyn
|
closed
|
ContainedDocument can leak if exceptions are thrown
|
4 - In Review Area-IDE Pending Shiproom Approval Tenet-Performance Tenet-Reliability Urgency-Now
|
Underlying issue:
During the call to `CreateContainedLanguage`, a call is made to `AbstractProject.AddDocument`. This call adds the `ContainedDocument` to two sets. If the key already exists in the second set, an exception is thrown and the caller will lose the ability to dispose of the `IVsContainedLanguage` instance, leaving a strong reference to the `ContainedDocument` permanently in the first set.
Heap dumps where this bug manifests show large numbers of `ContainedDocument` instances leaking. One case has 343,931 instances.
The offending code block is the following:
https://github.com/dotnet/roslyn/blob/a4239cbcadcf8a1f1e8485e430a7ac190b78fc85/src/VisualStudio/Core/Def/Implementation/ProjectSystem/AbstractProject.cs#L1028-L1032
💭 It should be possible to fix this by simply reversing the two calls to `Add`. Since `ContainedDocument` always uses a new GUID for each instance, the addition using that GUID should never fail. However, a safer fix would ensure that the pair of additions is atomic, regardless of input.
|
True
|
ContainedDocument can leak if exceptions are thrown - Underlying issue:
During the call to `CreateContainedLanguage`, a call is made to `AbstractProject.AddDocument`. This call adds the `ContainedDocument` to two sets. If the key already exists in the second set, an exception is thrown and the caller will lose the ability to dispose of the `IVsContainedLanguage` instance, leaving a strong reference to the `ContainedDocument` permanently in the first set.
Heap dumps where this bug manifests show large numbers of `ContainedDocument` instances leaking. One case has 343,931 instances.
The offending code block is the following:
https://github.com/dotnet/roslyn/blob/a4239cbcadcf8a1f1e8485e430a7ac190b78fc85/src/VisualStudio/Core/Def/Implementation/ProjectSystem/AbstractProject.cs#L1028-L1032
💭 It should be possible to fix this by simply reversing the two calls to `Add`. Since `ContainedDocument` always uses a new GUID for each instance, the addition using that GUID should never fail. However, a safer fix would ensure that the pair of additions is atomic, regardless of input.
|
non_process
|
containeddocument can leak if exceptions are thrown underlying issue during the call to createcontainedlanguage a call is made to abstractproject adddocument this call adds the containeddocument to two sets if the key already exists in the second set an exception is thrown and the caller will lose the ability to dispose of the ivscontainedlanguage instance leaving a strong reference to the containeddocument permanently in the first set heap dumps where this bug manifests show large numbers of containeddocument instances leaking one case has instances the offending code block is the following 💭 it should be possible to fix this by simply reversing the two calls to add since containeddocument always uses a new guid for each instance the addition using that guid should never fail however a safer fix would ensure that the pair of additions is atomic regardless of input
| 0
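The fix suggested in the row above — make the pair of additions atomic so a duplicate key cannot leave a dangling entry in the first set — can be sketched outside C#. Below is a minimal Python illustration of the same idea; the map and parameter names are hypothetical and this is not Roslyn's actual code:

```python
def add_document(docs_by_moniker, docs_by_guid, moniker, guid, doc):
    """Add `doc` to both maps, or to neither (atomic pair of additions).

    Checking and inserting the GUID map first means a duplicate GUID
    fails before anything is added, so no entry can leak in the
    moniker map -- mirroring the "reverse the two calls to Add" fix.
    """
    if guid in docs_by_guid:
        raise KeyError(f"duplicate document id {guid!r}")
    docs_by_guid[guid] = doc
    try:
        docs_by_moniker.setdefault(moniker, []).append(doc)
    except Exception:
        # Roll back the first insertion so the pair stays atomic.
        del docs_by_guid[guid]
        raise
```

With this ordering, a caller that hits the duplicate-key exception finds both maps unchanged, so nothing is left behind to leak.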
|
52,080
| 6,218,100,002
|
IssuesEvent
|
2017-07-08 21:27:29
|
azerupi/mdBook
|
https://api.github.com/repos/azerupi/mdBook
|
closed
|
`mdbook test` with extern crates
|
A-Tests S-Wishlist T-Enhancement
|
[When invoking `rustdoc`](https://github.com/azerupi/mdBook/blob/master/src/book/mod.rs#L382) it should be possible to specify some options, namely `-L` and `--extern`, for `rustdoc` to allow testing examples with `extern crate xxx;`
|
1.0
|
`mdbook test` with extern crates - [When invoking `rustdoc`](https://github.com/azerupi/mdBook/blob/master/src/book/mod.rs#L382) it should be possible to specify some options, namely `-L` and `--extern`, for `rustdoc` to allow testing examples with `extern crate xxx;`
|
non_process
|
mdbook test with extern crates it should be possible to specific some options namely l and extern for rustdoc to allow testing examples with extern crate xxx
| 0
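For illustration, the kind of `rustdoc` invocation the mdBook issue asks for could be assembled like this. The paths and crate names are hypothetical; this is a sketch of the requested `-L`/`--extern` flags, not mdBook's implementation:

```python
def rustdoc_test_command(chapter, library_path, externs):
    """Build a `rustdoc --test` command line with -L and --extern flags.

    `externs` maps crate names to compiled .rlib paths, so doctests in
    `chapter` can use `extern crate xxx;`.
    """
    cmd = ["rustdoc", "--test", chapter, "-L", library_path]
    for name, rlib in sorted(externs.items()):
        cmd += ["--extern", f"{name}={rlib}"]
    return cmd
```

The resulting list could then be handed to a process spawner in place of the plain `rustdoc --test chapter.md` call.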
|
444,399
| 12,811,618,715
|
IssuesEvent
|
2020-07-04 00:06:42
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
closed
|
[0.9.0 staging-1586] Web Elections: strange wrapping for stats blocks
|
Category: Web Priority: Medium
|
1680x1050

1920x1080

It looks rather strange
|
1.0
|
[0.9.0 staging-1586] Web Elections: strange wrapping for stats blocks - 1680x1050

1920x1080

It looks rather strange
|
non_process
|
web elections strange wrapping for stats blocks it looks rather strange
| 0
|
17,995
| 24,011,941,970
|
IssuesEvent
|
2022-09-14 19:40:54
|
zammad/zammad
|
https://api.github.com/repos/zammad/zammad
|
closed
|
HTML comments/conditions being rendered from emails
|
bug verified mail processing
|
### Infos:
* Used Zammad version: 5.2.x
* Installation method (source, package, ..): Source
* Operating system: OpenSUSE 15.3
* Database + version: 10.5.15-MariaDB
* Elasticsearch version: 7.17.5
* Browser + version: Chrome 103
### Expected behavior:
Zammad should ignore (i.e. strip/remove) conditional comments in HTML of the format <![xxx]>, for example <![if !vml]><![endif]>. These shouldn't be displayed in the article/ticket content.
### Actual behavior:
Zammad displays such conditional comments, e.g. <![if !vml]><![endif]>. These are non-standard HTML comments introduced by Microsoft (of course) known as "Downlevel-revealed conditional comments" according to [Wikipedia](https://en.wikipedia.org/wiki/Conditional_comment). All browsers ignore these, but Zammad must be escaping the characters as they're displayed in the rendered html output.
We've only observed this since upgrading from Zammad 5.1.x to 5.2.x.

### Steps to reproduce the behavior:
Ingest an email into Zammad that contains such conditional formatting, typically generated by Outlook/Word.
|
1.0
|
HTML comments/conditions being rendered from emails - ### Infos:
* Used Zammad version: 5.2.x
* Installation method (source, package, ..): Source
* Operating system: OpenSUSE 15.3
* Database + version: 10.5.15-MariaDB
* Elasticsearch version: 7.17.5
* Browser + version: Chrome 103
### Expected behavior:
Zammad should ignore (i.e. strip/remove) conditional comments in HTML of the format <![xxx]>, for example <![if !vml]><![endif]>. These shouldn't be displayed in the article/ticket content.
### Actual behavior:
Zammad displays such conditional comments, e.g. <![if !vml]><![endif]>. These are non-standard HTML comments introduced by Microsoft (of course) known as "Downlevel-revealed conditional comments" according to [Wikipedia](https://en.wikipedia.org/wiki/Conditional_comment). All browsers ignore these, but Zammad must be escaping the characters as they're displayed in the rendered html output.
We've only observed this since upgrading from Zammad 5.1.x to 5.2.x.

### Steps to reproduce the behavior:
Ingest an email into Zammad that contains such conditional formatting, typically generated by Outlook/Word.
|
process
|
html comments conditions being rendered from emails infos used zammad version x installation method source package source operating system opensuse database version mariadb elasticsearch version browser version chrome expected beha zammad should ignore i e strip remove conditional comments in html of the format for example these shouldn t be displayed in the article ticket content actual behavior zammad displays such conditional comments e g these are non standard html comments introduced by microsoft of course known as downlevel revealed conditional comments according to all browsers ignore these but zammad must be escaping the characters as they re displayed in the rendered html output we ve only observed this since upgrading from zammad x to x steps to reproduce the behavior injest email into zammad that contains such conditional formatting typically generating by outlook word
| 1
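A minimal sketch of the expected behavior described in the Zammad row — stripping downlevel-revealed conditional comment markers such as `<![if !vml]>` and `<![endif]>` while keeping any content between them — could look like this. It is a regex illustration under that assumption, not Zammad's actual sanitizer:

```python
import re

# Matches downlevel-revealed conditional comment markers:
# <![if ...]> and <![endif]> (non-standard IE/Outlook syntax).
_CONDITIONAL = re.compile(r"<!\[(?:if\s[^\]]*|endif)\]>", re.IGNORECASE)

def strip_downlevel_conditionals(html: str) -> str:
    """Remove the markers but keep any content between them."""
    return _CONDITIONAL.sub("", html)
```

Browsers ignore these markers, so removing them (rather than escaping them into visible text) matches what the reporter expects to see in the rendered article.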
|
785,878
| 27,626,734,909
|
IssuesEvent
|
2023-03-10 07:33:32
|
gamefreedomgit/Maelstrom
|
https://api.github.com/repos/gamefreedomgit/Maelstrom
|
closed
|
[Quest][Cata][Npc] Mr. Goldmine's Wild Ride + A Little on the Side + While We're Here + Rune Ruination + A Fiery Reunion
|
NPC Quest - Cataclysm (80+) Quest - Event Priority: Critical Status: Confirmed Twilight Highlands
|
**How to reproduce:**
1. event for quest Mr. Goldmine's Wild Ride simply is not working at all... quest giver is missing and also all of the npcs inside of the cave are missing (both horde and alliance are unable to do it) more details about missing npcs: https://github.com/gamefreedomgit/Maelstrom/issues/1941
2. since event is missing and npc at the end of the ride is not there neither(objects missing as well) so you can't accept and complete this quest "A Little on the Side"
3. quest While We're Here is not doable since npcs that spawns after doing Mr. Goldmine's Wild Ride is not there and also npc objective that you are supposed to kill are also missing
4. quest Rune Ruination is simply not doable runes themselves are there but its not possible to interact
5. after doing all the quests above A Fiery Reunion should be available for the players to do but sadly npcs are missing and its impossible to do the quest
**How it should work:**
Mr. Goldmine's Wild Ride : https://www.youtube.com/watch?v=lPnwwhxjSBE
A Little on the Side : https://www.youtube.com/watch?v=U6PYyPhFiuw
While We're Here : https://www.youtube.com/watch?v=EwVaeo3oXAs
Rune Ruination : https://www.youtube.com/watch?v=wPyc_XPNphY
A Fiery Reunion : https://www.youtube.com/watch?v=stBdNSr9EAs
**Database links:**
Mr. Goldmine's Wild Ride : https://cata-twinhead.twinstar.cz/?search=Mr.+Goldmine%27s+Wild+Ride#quests
A Little on the Side : https://cata-twinhead.twinstar.cz/?quest=27742
While We're Here : https://cata-twinhead.twinstar.cz/?quest=27743
Rune Ruination : https://cata-twinhead.twinstar.cz/?quest=27744
A Fiery Reunion : https://cata-twinhead.twinstar.cz/?quest=27745
|
1.0
|
[Quest][Cata][Npc] Mr. Goldmine's Wild Ride + A Little on the Side + While We're Here + Rune Ruination + A Fiery Reunion - **How to reproduce:**
1. event for quest Mr. Goldmine's Wild Ride simply is not working at all... quest giver is missing and also all of the npcs inside of the cave are missing (both horde and alliance are unable to do it) more details about missing npcs: https://github.com/gamefreedomgit/Maelstrom/issues/1941
2. since event is missing and npc at the end of the ride is not there neither(objects missing as well) so you can't accept and complete this quest "A Little on the Side"
3. quest While We're Here is not doable since npcs that spawns after doing Mr. Goldmine's Wild Ride is not there and also npc objective that you are supposed to kill are also missing
4. quest Rune Ruination is simply not doable runes themselves are there but its not possible to interact
5. after doing all the quests above A Fiery Reunion should be available for the players to do but sadly npcs are missing and its impossible to do the quest
**How it should work:**
Mr. Goldmine's Wild Ride : https://www.youtube.com/watch?v=lPnwwhxjSBE
A Little on the Side : https://www.youtube.com/watch?v=U6PYyPhFiuw
While We're Here : https://www.youtube.com/watch?v=EwVaeo3oXAs
Rune Ruination : https://www.youtube.com/watch?v=wPyc_XPNphY
A Fiery Reunion : https://www.youtube.com/watch?v=stBdNSr9EAs
**Database links:**
Mr. Goldmine's Wild Ride : https://cata-twinhead.twinstar.cz/?search=Mr.+Goldmine%27s+Wild+Ride#quests
A Little on the Side : https://cata-twinhead.twinstar.cz/?quest=27742
While We're Here : https://cata-twinhead.twinstar.cz/?quest=27743
Rune Ruination : https://cata-twinhead.twinstar.cz/?quest=27744
A Fiery Reunion : https://cata-twinhead.twinstar.cz/?quest=27745
|
non_process
|
mr goldmine s wild ride a little on the side while we re here rune ruination a fiery reunion how to reproduce event for quest mr goldmine s wild ride simply is not working at all quest giver is missing and also all of the npcs inside of the cave are missing both horde and alliance are unable to do it more details about missing npcs since event is missing and npc at the end of the ride is not there neither objects missing as well so you can t accept and complete this quest a little on the side quest while we re here is not doable since npcs that spawns after doing mr goldmine s wild ride is not there and also npc objective that you are supposed to kill are also missing quest rune ruination is simply not doable runes themselves are there but its not possible to interact after doing all the quests above a fiery reunion should be available for the players to do but sadly npcs are missing and its impossible to do the quest how it should work mr goldmine s wild ride a little on the side while we re here rune ruination a fiery reunion database links mr goldmine s wild ride a little on the side while we re here rune ruination a fiery reunion
| 0
|
7,657
| 10,741,031,801
|
IssuesEvent
|
2019-10-29 19:23:15
|
openopps/openopps-platform
|
https://api.github.com/repos/openopps/openopps-platform
|
opened
|
View applicant - sort by
|
Apply Process
|
Who: Community and sitewide admins
What: ability to sort on applicant lists
Why: in order to
Acceptance criteria:
- Add a sort by to the applicant listing
- Default sort will be by last name
- Sort by options will be for all fields on the table: Name, email, Status, and Last update
Related tickets:
4049 -
|
1.0
|
View applicant - sort by - Who: Community and sitewide admins
What: ability to sort on applicant lists
Why: in order to
Acceptance criteria:
- Add a sort by to the applicant listing
- Default sort will be by last name
- Sort by options will be for all fields on the table: Name, email, Status, and Last update
Related tickets:
4049 -
|
process
|
view applicant sort by who community and sitewide admins what ability to sort on applicant lists why in order to acceptance criteria add a sort by to the applicant listing default sort will be by last name sort by options will be for all fields on the table name email status and last update related tickets
| 1
|